
Bots & (Main)Frames: Exploring the Impact of Tangible
Blocks and Collaborative Play in an Educational
Programming Game
Edward F. Melcer, New York University, Brooklyn, USA (eddie.melcer@nyu.edu)
Katherine Isbister, University of California, Santa Cruz, Santa Cruz, USA (katherine.isbister@ucsc.edu)
ABSTRACT
While recent work has begun to evaluate the efficacy of educational programming games, many common design decisions in these games (e.g., single player gameplay using touchpad or mouse) have not been explored for learning outcomes. For instance, alternative design approaches such as collaborative play and embodied interaction with tangibles may also provide important benefits to learners. To better understand how these design decisions impact learning and related factors, we created an educational programming game that allows for systematically varying input method and mode of play. In this paper, we describe design rationale for mouse and tangible versions of our game, and report a 2x2 factorial experiment comparing efficacy of mouse and tangible input methods with individual and collaborative modes of play. Results indicate tangibles have a greater positive impact on learning, situational interest, enjoyment, and programming self-beliefs. We also found collaborative play helps further reduce programming anxiety over individual play.

Author Keywords
Educational Programming Game; Tangibles; Embodied Interaction; Physical Embodiment; Collaborative Play.

ACM Classification Keywords
H.5.m. Information interfaces and presentation (e.g., HCI): Miscellaneous.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions@acm.org.
CHI 2018, April 21–26, 2018, Montreal, QC, Canada. © 2018 Copyright is held by the owner/author(s). Publication rights licensed to ACM. ACM 978-1-4503-5620-6/18/04…$15.00. https://doi.org/10.1145/3173574.3173840

INTRODUCTION
Programming presents many challenges to novice learners, with issues arising due to the radical novelty of concepts and material [12], as well as from difficulty in understanding the syntax [57]. These issues result in a mismatch between what programs do and what novices think they do [11,33,64], evoking strong negative feelings [31,40,68] and creating barriers to learning [74]. In recent years, there have been a number of educational programming games created to simplify presentation of these concepts and address the growing needs of computer science education [4,22,35,44]. However, while there has been some work evaluating the efficacy of these educational programming games [4,15], relatively little has been done to explore the impact of different design decisions within them. For instance, a recent survey examining the designs of existing Computer Science (CS) games [21] found that the majority feature a number of identical design decisions such as: 1) playable characters (robot); 2) genre (puzzle); 3) mode of play (single player); and 4) input method (touchpad or mouse and keyboard).

In contrast to these common design decisions, recent research has shown substantial promise from alternative input methods (e.g., tangibles) and modes of play (e.g., collaborative). For example, tangible programming interfaces have been shown to prevent syntax errors [92] and increase active player engagement in an informal learning context [26]. In non-programming domains, tangibles have also been found to result in benefits for a wide range of factors such as engagement [18], interest [3], and collaboration [78]. On the other hand, collaborative play has been shown to reduce program errors [42] and increase performance when programming [54], as well as increase learner enjoyment, engagement, and motivation in non-programming contexts [32,76].

With these potential advantages in mind, we wanted to systematically explore how modulating a subset of common design decisions in educational games (i.e., input methods and mode of play) could impact learning outcomes and important related factors such as programming self-beliefs [75], situational interest [10], and enjoyment [1]. For this paper, we first present the design of mouse and tangible versions of our educational programming game, Bots & (Main)Frames. Then, we describe a 2x2 factorial experiment conducted with 80 novice programmers that varied game design in the form of input method (use of tangible programming blocks vs. mouse) and mode of play (individual vs. collaborative pairs). This allowed us to isolate the impact of utilizing tangibles vs. mouse for input and individual vs. collaborative forms of gameplay, keeping all other mechanics, aesthetics, etc. identical. Results show that tangibles have a substantial positive impact on players' perceptions of their own programming abilities, situational and programming interest, enjoyment, intentions to replay and recommend the game, and performance/learning outcomes. We also found collaborative play was beneficial for reducing players' anxiety towards the overall act of programming. Finally, we conclude the paper with a discussion of design implications based on study results.
RELATED WORK

Tangibles and Computational Concepts
Tangible user interfaces (TUIs) are advantageous in that they utilize both physical representation (through objects or the space) and manipulation of digital data to offer interactive couplings of physical artifacts with computationally mediated digital information [27]. Building on this notion within HCI, there is notable work by the tangible and embodied interaction (TEI) community on the creation of tangibles to teach computing concepts. For instance, roBlocks [72] and Electronic Blocks [92] allow learners to connect tangible blocks embedded with sensors, actuators, and logic to explore physical computing and programming concepts. Tools such as Note Code [41] have also used music as a metaphor for learners to program by connecting "noteboxes" to play musical sequences. Thingy Oriented Programming [20] was designed for prototyping simple electronics applications and systems that involve networks of sensors and actuators by using recorded sequences of actions with tangible objects. Additionally, TanProRobot 2.0 [88] allows children to program a toy car with physical blocks. It is interesting to note that concepts covered by these tools focus more on physical computing, electronics, and music rather than explicit use of programming or games.

Considering related research within the domain of tangibles that involves programming and games, some of the earliest work was Suzuki and Kato's AlgoBlock system [85,86]—for which they coined the term Tangible Programming to describe it [77]. In AlgoBlock, children used large computational building blocks to direct a submarine through a maze [48]. More recently, tools such as E-Block [89] provide children with tangible programming blocks to solve an isometric maze game. Tangible programming languages and interfaces such as TurTan [17] and Tern [26] have also been employed to afford creative exploration, and encourage users to more actively engage in an activity and in groups than they would when using a graphical interface. Strawbies [29] (now Osmo) utilizes wooden tiles to program a game character to solve mazes. However, this was only evaluated through qualitative descriptions of play sessions at local schools. Melcer et al. created a similar type of programming game that uses wooden blocks with hooks instead of tiles to qualitatively explore differences between tangibles and mouse on interactions underlying player collaboration [52], as well as explore collaborative impacts of tangibles on enjoyment and some dimensions of programming self-beliefs [49]. However, they did not compare the efficacy of tangible and collaborative designs against individual designs. Given the notable relationship between use of tangibles and teaching computational concepts, we feel they are an important design decision that should be explored for educational programming games.

Collaboration
Collaboration has been a long standing variable of importance for learning, where collaborative learning can be viewed as the synchronous interaction of two or more learners to negotiate shared meaning and jointly solve problems [13]. Prior work has suggested that collaborative learning environments are more likely to elicit increased intrinsic motivation [80], and working together in small groups has been shown to increase a learner's enjoyment, engagement, and motivation [32,76]. In contrast to individual study, group collaboration also appears to be well suited for problem-solving since it encourages learners to articulate and rationalize their thinking in order to engage in joint elaboration on their decision making [53]. The beneficial nature of collaboration for problem solving may also lend itself particularly well to programming, where recent work has shown a significant correlation between computational thinking and problem-solving ability [69]. For collaborative learning of programming, Lin and Liu [42] found that children spent more time on analysis/design of their programs, solutions contained fewer errors, and they tended to reflect on their solutions more often than if they worked alone. Additionally, pair programming (two programmers working collaboratively at a shared computer on the same task [7]) has been shown to improve self-sufficiency, performance, and overall grades for students in an introductory CS course [54]. Based on these previous findings of collaborative benefits, we believe collaborative play is an important design decision that should be explored for educational games in our study.

Programming Self-Beliefs
Learner self-beliefs play an important role in academic development and learning outcomes [9]. Programming in particular presents many challenges to learners' self-beliefs as the radical novelty [12] of the concepts and material can evoke strong negative feelings [31,40,68], creating barriers to learning [74]. To better assess and understand these complex affective issues, Scott and Ghinea [75] used the Control-Value Theory of Achievement Emotions [58,59] to develop and validate an instrument that assesses programming self-beliefs. Underlying the self-beliefs model of their scale are five core dimensions: 1) debugging self-efficacy—learners' cognitive self-assessments of whether or not they are confident in their ability to write and debug simple programs; 2) programming self-concept—a composite of self-perceptions that are formed through experience with and interpretations of one's environment [46]; 3) programming interest—the extent to which an individual enjoys engaging with a set of programming tasks [75]; 4) programming anxiety—a self-reflected state of experiencing negative emotions while writing and debugging programs; and 5) programming aptitude—based on Dweck's [14] notion of mindsets where students have either a growth mindset (i.e., belief that their capacities can be improved through practice) or a fixed mindset (i.e., belief that their capacities are inherent). Students with a growth programming aptitude tend to maintain practice when they encounter difficulty while those with a fixed programming aptitude do not [73]. Given the direct influence these self-beliefs have on learning programming, we measured them in our study to compare outcomes.
Situational Interest
Interest (i.e., liking and willful engagement in a cognitive activity [71]) is an important motivation factor that guides individuals to learn [65]. More specifically, it has a central role in determining our emotional engagement during a task [70], what we choose to learn [87], and how well we learn such information [2]. Two important forms of interest are personal or individual interest—characterized by an intrinsic desire and tendency to engage with a particular topic that persists over time [60]—and situational interest—characterized by the context-specific, affective and attentional reactions elicited by the environment (i.e., the appealing effect of an activity on the individual, rather than the individual's personal preference for the activity) [23,43,71]. Situational interest in particular is critical to education and the design of educational tools since it is essential for developing personal interest in learners, which in turn positively impacts learning as discussed above [24].

In the vein of situational interest and games, Plass et al. [60] examined the impact of individual, collaborative, and competitive modes of play on situational interest in an educational mathematics video game. They found that collaborative and competitive forms of play elicited greater situational interest than individual play. Additionally, physical activity has also been found to have a positive relationship with situational interest in exergames [56], commercial dance games [30], and active educational video games (AVGs) [83].

Figure 1. The Bots & (Main)Frames interface. (A) The puzzle/level to solve. (B) The commands available to use. (C) The display box for the current program. (D) The indicator for number of commands available to use in that level. (E) The additional box for creating a function—only visible if the function command is available that level.

DESIGN OF BOTS & (MAIN)FRAMES
We designed Bots & (Main)Frames (see Figure 1) to incorporate common design characteristics of existing educational programming games (i.e., players program a virtual robot to solve puzzles [21]), while also providing a comparison point for differing input methods and modes of interaction. We chose to utilize the robot aesthetic that is common in most programming games to ensure results mirror existing real world applications, since recent work has shown that changing avatars in educational programming games can impact player affect [36,37]. During gameplay, players are presented with a series of levels in the form of puzzles (Figure 1A), with the goal of programming a virtual robot to reach all red tiles from a given starting point.
Each puzzle presents players with a set of different programming commands (Figure 1B), and a number limiting how many commands they can use to solve the level (Figure 1D). Limiting the number of useable commands for a level was done in order to prevent players from brute forcing answers rather than learning to use intended programming concepts (e.g., using 5 individual forward commands rather than looping one forward command 5 times).

For the study, we created two sets of game levels. The first was a 10 level training set (see Figure 2) designed to teach fundamental introductory programming concepts and computational thinking skills—i.e., algorithm building, loops, and functions [6,8,19]. The second was a 5 level performance set (see Figure 3) consisting of more challenging levels to evaluate players' performance and learning after completing the training levels.

Figure 2. The 10 training levels used to teach programming concepts. Levels 1 – 3 introduce basic algorithm building; levels 4 – 6 teach looping concepts; and levels 7 – 10 cover functions.

Figure 3. The 5 additional challenge levels used to evaluate performance and learning outcomes during the performance task.

In the training set, as players progressed through levels they were presented with a variety of programming commands to control the robot that corresponded with a specific programming concept (see Figure 4). More specifically, the commands consisted of: moving/translating forward in the direction the robot is facing (Figure 4A); rotating the robot 90 degrees left or right (Figure 4B); using a loop to repeat one command a specified number of times (Figure 4C); and using a function to abstract and reuse patterns (Figure 1E and Figure 4D). Level progression was also designed to either introduce a new concept or reiterate a previous one using a more challenging puzzle. Designing the complexity of each level was based on the notions of depth (the total number of solution commands needed for the optimal path), breadth (the number of unique solution commands needed for the optimal path), and obfuscation (presenting unique additional potential paths that appear correct but are incorrect)—which [52] identified as effective principles for designing challenge into puzzle-based programming levels.
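To make the depth and breadth notions concrete, the short sketch below computes both metrics from an optimal solution expressed as a list of command names. This is our own minimal illustration, not code from the game; obfuscation depends on the board layout and is not captured by a simple count.

```python
# Illustrative sketch (not from the paper): computing the depth and breadth
# complexity metrics for a level, given its optimal solution as a list of
# hypothetical command names.

def level_complexity(optimal_solution):
    """Return (depth, breadth) for an optimal command sequence."""
    depth = len(optimal_solution)          # total commands on the optimal path
    breadth = len(set(optimal_solution))   # unique commands on the optimal path
    return depth, breadth

# Example: a level solved by a loop, two forwards, a turn, and a function call.
depth, breadth = level_complexity(["loop", "forward", "forward", "right", "function"])
print(depth, breadth)  # 5 4
```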
Block-Based Programming Interface
Block-based programming interfaces are well represented in educational programming games and environments today—with notable examples being Scratch [66], Blockly [16], Reduct [4], BOTS [22], and Tern [26]. These interfaces utilize graphical or physical, block-based manipulatives where the visual properties of the blocks indicate how they are supposed to connect [48]. One of the notable advantages to this block-based manipulative design approach is that it helps to eliminate syntax errors [92], a major barrier to entry for novice programmers [57]. Therefore, our interface for programming (Figure 1C) similarly adopts usage of block-based manipulatives both digitally (e.g., commands are represented as digital blocks, and specific commands such as loops indicate syntax using an arrow with automatic indentation of the looped command) and physically (in the design of the physical blocks which is discussed in more detail below and shown in Figure 5).
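To make the semantics of the command set concrete, the sketch below models a block program as a list and expands loops and function calls into the primitive moves the robot would execute. The representation and names are our own illustration under the paper's description (a loop repeats exactly one command, and the function box holds a reusable sequence); this is not the game's actual code.

```python
# Illustrative interpreter sketch (our own representation, not the game's code).
# A loop repeats a single command n times, and "use_function" replays the
# commands placed in the separate function box.

def expand(program, function_body):
    """Flatten a block program into the primitive commands the robot executes."""
    moves = []
    for command in program:
        if isinstance(command, tuple) and command[0] == "loop":
            _, inner, times = command
            moves.extend(expand([inner], function_body) * times)
        elif command == "use_function":
            moves.extend(expand(function_body, []))
        else:                      # "forward", "left", or "right"
            moves.append(command)
    return moves

program = [("loop", "forward", 3), "right", "use_function"]
print(expand(program, function_body=["forward", "left"]))
# ['forward', 'forward', 'forward', 'right', 'forward', 'left']
```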

Figure 4. The programming concepts/commands introduced in Bots & (Main)Frames. (A) Forward: moves the robot one tile forward. (B) Left/right: rotates the robot 90 degrees left or right. (C) Loop: repeats a single command a specified number of times. (D) Function: use commands from the function.

Figure 5. The tangible programming blocks used in Bots & (Main)Frames. Players must physically chain blocks together in order to program.
Figure 6. (A) A typical command block with a metal hook on
the right side and eye on the left. (B) The Loop programming
block where players insert a command that will be repeated a
specified number of times. (C) The Function and Use Function
programming blocks connected by a chain.
Design of the Tangibles Version
One of the primary challenges and reasons we decided to design a new game, rather than use an existing one, was to create a system that supports both analog and digital interfaces. In contrast to the mouse version of Bots & (Main)Frames where players click buttons with iconic representations of programming commands (Figure 1B), the tangibles version has players physically connect wooden blocks to program (see Figure 5). Fiducial markers on top of the blocks are utilized in conjunction with tracking from the ReacTIVision framework [34] to detect block placement on the table and determine connections/ordering. Notably, other than these differences in input method (tangibles vs. mouse), the graphical UI and gameplay are identical for both versions.
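The tracking pipeline is easiest to picture as a stream of TUIO object events. The sketch below is our own illustration rather than the study's software: it uses the python-osc package to listen for ReacTIVision's /tuio/2Dobj messages and orders the visible fiducials left-to-right to recover a command sequence. The default port and the "set"/"alive" message layout follow the TUIO 1.1 convention, and the fiducial-to-command mapping is hypothetical.

```python
# Illustrative sketch (not the study's actual code): reading ReacTIVision's
# TUIO object messages with python-osc. Assumes the default TUIO port (3333)
# and the TUIO 1.1 /tuio/2Dobj "set" layout: session id, fiducial id, x, y, ...
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

COMMANDS = {0: "forward", 1: "left", 2: "right", 3: "loop", 4: "function"}  # hypothetical mapping
blocks = {}  # session id -> (fiducial id, x, y)

def on_tuio_object(address, *args):
    if args and args[0] == "set":
        session_id, fiducial_id, x, y = args[1], args[2], args[3], args[4]
        blocks[session_id] = (fiducial_id, x, y)
    elif args and args[0] == "alive":
        alive = set(args[1:])
        for sid in list(blocks):                 # drop blocks removed from the table
            if sid not in alive:
                del blocks[sid]
        program = [COMMANDS.get(fid, "?")        # order remaining blocks left-to-right
                   for fid, x, y in sorted(blocks.values(), key=lambda b: b[1])]
        print(program)

dispatcher = Dispatcher()
dispatcher.map("/tuio/2Dobj", on_tuio_object)
BlockingOSCUDPServer(("0.0.0.0", 3333), dispatcher).serve_forever()
```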
The blocks themselves utilize a similar design approach to [26,29,41,88,92], taking advantage of many physical affordances that tangibles provide [50,51,61,63]. For instance, there are several aspects of the block's physical properties intended to eliminate syntax errors and more clearly illustrate programming concepts. All programming command blocks have a hook on the right side and eye on the left side (see Figure 6A). This helps illustrate to players that commands are executed in a sequential order along the "chain of execution", as well as prevents players from connecting blocks in incorrect ways (e.g., facing the wrong direction). Similarly, we created special blocks for the usage of loops and functions. Loop blocks have an additional slot connected that allows players to insert a single command (see Figure 6B). This helps physically illustrate how in-game loops take a single command to repeat a specified number of times. Contrary to most other educational block-based programming interfaces, the loop command in the digital mouse-based version also only enables looping of a single command rather than a variable number of commands since it would be difficult to create an equivalent variable width slot for the physical blocks. For functions, we connected two blocks (Function and Use Function) together with a longer chain to visually illustrate how code execution transitions from one function to another on a function call (see Figure 6C).

METHODOLOGY
For this study, we wanted to explore the effects of incorporating tangibles and collaborative play into educational programming games, comparing them to the traditional single player, touchpad or mouse experience. Considering the physical nature of manipulating and interacting with tangibles and prior work showing the positive effect of physical activity on situational interest [30,56,83], we hypothesized the use of tangibles might have a similar positive outcome in our educational programming game. Furthermore, since collaboration has been shown to have a positive impact on situational interest [60], we also believed that collaborative play might have a similar beneficial effect here. Earlier research findings also led us to hypothesize that both the tangible input method and collaborative mode of play were likely to have positive impacts on programming self-beliefs [49] and enjoyment [32,62,76,93]. On the other hand, while previous studies have found positive learning outcomes from the use of tangibles [79], collaboration has shown very mixed effectiveness [45,60,90]; sometimes even showing negative effects. As a result, we hypothesized that tangibles may have a positive effect on learning and performance outcomes while collaboration would have little to no effect.

Figure 7. Overview of the 2x2 factorial design for the study. Consists of 4 conditions: (A) Individual Mouse Condition. (B) Collaborative Mouse Condition. (C) Individual Tangibles Condition. (D) Collaborative Tangibles Condition.

Experimental Design and Procedure
In order to evaluate the influence of input methods and mode of play on various outcomes in an educational programming game, we used a 2x2 between subjects factorial design (see Figure 7 for the four study conditions). The key differences between the four conditions were the use of a mouse or tangible programming blocks to program (input methods) and playing individually or collaboratively in pairs (mode of play). To facilitate a valid comparison, all other aspects of design—levels, mechanics, aesthetics, feedback, and so forth—remained constant. When playing the game, participants had to either share the set of blocks for the collaborative tangibles condition or share a single mouse for the collaborative mouse condition. This design choice was made in order to mirror real world collaborative play contexts of educational programming games as closely as possible [67,81], rather than create a custom collaborative environment which did not accurately represent how learners would cooperatively use an educational programming game elsewhere. Additionally, comparisons between collaborative use of a single mouse and tangibles have frequently been utilized in existing HCI literature (e.g., [26,55,93]). However, the single access point of a mouse and shared access of tangibles during collaborative play could have differing impacts on the equity of interactions between participants [28,47,82]. Learners who do not have active control of input during collaborative play may exhibit off-task behavior—which can impact overall activity, engagement, and performance outcomes [32]. Our process to identify this potentially confounding factor and control for it during analysis is discussed further in the results section.
For the study, participants were first randomly assigned to one of the four conditions: individual play with mouse (MI), collaborative play with mouse (MC), individual play with tangibles (TI), and collaborative play with tangibles (TC). We then asked them to fill out a demographics questionnaire collecting information such as prior programming experience, prior video game experience, and initial programming self-beliefs. Participants were then asked to complete 10 training levels (see Figure 2) which introduced them to the game mechanics and introductory programming concepts. They were given as much time as necessary to complete this training task, and most participants finished the levels in 10 – 20 minutes. This was followed by a post-test survey assessing programming self-beliefs, situational interest, and enjoyment/replayability of the game. In order to assess performance and learning outcomes after the survey, we also had participants complete a post-test performance task where they were presented with 5 new levels and instructed to complete as many of the levels as possible in 10 minutes. These levels were designed to be more complex than the prior training levels, requiring longer programs and more complicated paths to solve (see Figure 3). In order to mitigate social loafing and other biases from collaborative play, participants in the two collaborative conditions took turns completing the performance task individually while the other participant waited in a separate room. Finally, the study concluded with a short interview asking participants about their play experience and for any suggestions to improve the game. The entire study lasted approximately 1 hour, and was video recorded to capture interview data and player performance during the training/performance tasks.

Participants
A convenience sample of 80 undergraduates (18–24 years old, 52 female) was recruited for the study from several introductory classes at a large university. All participants were novice programmers (having less than 6 months of prior programming experience). From our participant pool, 76 came from non-mathematical backgrounds such as Psychology, Biology, and Film while the remaining 4 were from Engineering, Economics, and Math. Participants were randomly assigned to one of the four study conditions (18 participants in MI, 22 in MC, 18 in TI, and 22 in TC). Only 2 pairs of participants in the collaborative conditions knew each other beforehand.

Measures

Programming Self-Beliefs
We used a programming self-beliefs scale [75] to measure participant self-beliefs about their programming ability both before the training task (pre-test) and after (post-test). The scale consists of 19 5-point Likert type items (1 = strongly disagree to 5 = strongly agree) that measure the five dimensions of programming self-beliefs mentioned earlier in related work: debugging self-efficacy, programming self-concept, programming interest, programming anxiety, and programming aptitude. Scott and Ghinea [75] established the construct validity with confirmatory factor analysis, in addition to showing reasonable internal consistency for all 5 dimensions of programming self-beliefs (Cronbach's α ranging from .703 to .888). We had to slightly modify a few questions in the scale for novice programmers by removing wording that was course specific. E.g., "In my programming labs, I can solve even the most challenging problems" became "I can solve even the most challenging programming problems". We also had to remove one debugging self-efficacy question that was programming language specific (i.e., "I am confident that I can understand Java Exceptions").
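As an illustration of how a multidimensional Likert instrument like this is typically scored, the sketch below sums each participant's responses per subscale, reverse-scoring any reverse-keyed items. The item-to-subscale groupings and the reverse-keyed indices here are hypothetical placeholders, not the actual key of Scott and Ghinea's instrument.

```python
# Illustrative scoring sketch. The item groupings and reverse-keyed items are
# hypothetical placeholders; the real key comes from the published instrument.

SUBSCALES = {                      # hypothetical 0-based item indices
    "debugging_self_efficacy": [0, 1, 2],
    "programming_self_concept": [3, 4, 5],
    "programming_interest": [6, 7, 8, 9],
    "programming_anxiety": [10, 11, 12, 13],
    "programming_aptitude": [14, 15, 16, 17, 18],
}
REVERSE_KEYED = {14, 15}           # hypothetical reverse-scored items

def score(responses, max_point=5):
    """Sum 5-point Likert responses per subscale, reverse-scoring keyed items."""
    keyed = [max_point + 1 - r if i in REVERSE_KEYED else r
             for i, r in enumerate(responses)]
    return {name: sum(keyed[i] for i in items)
            for name, items in SUBSCALES.items()}

print(score([3] * 19))  # all-neutral responses give mid-range subscale totals
```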
Situational Interest
To assess situational interest of the participants after completion of the training task, we used the Situational Interest Scale [10]. The scale consists of 24 5-point Likert type items (1 = strongly disagree to 5 = strongly agree) that measure five dimensions of situational interest: exploration intention (i.e., I want to analyze it to have a grasp on it), instant enjoyment (i.e., it is an enjoyable activity to me), attention demand (i.e., my attention was high), challenge (i.e., it is a complex activity), and novelty (i.e., this activity is new to me). Chen et al. [10] established the construct validity of the Situational Interest Scale on multiple student samples using both exploratory and confirmatory factor analyses, in addition to reporting internal consistency for all situational interest dimensions (Cronbach's α of .78 to .91).

Enjoyment and Replayability
After completing the training task, we also asked participants three qualitative 7-point Likert scale questions (1 = strongly disagree to 7 = strongly agree) relating to their enjoyment and intentions to replay or recommend the version of Bots & (Main)Frames that they played:
1. Playing the game was enjoyable.
2. I would like to play this or a similar game again in the future.
3. I would recommend this game to a friend.
Programming Self-Beliefs Subscales   Mouse Individual   Mouse Collaborative   Tangibles Individual   Tangibles Collaborative
                                     M       SD         M       SD            M       SD             M       SD
Pre-test
  Debugging Self-Efficacy            5.625   2.091      5.055   2.42          6.563   2.658          5.091   2.369
  Programming Self-Concept           5.56    2.159      6.182   2.26          7.56    2.28           6.364   2.735
  Programming Interest               11.188  3.28       10.41   3.157         12.438  2.683          12.546  3.514
  Programming Anxiety                13.75   2.517      13.136  2.295         12.313  3.32           14.182  2.363
  Programming Aptitude               5.75    1.95       6.409   2.34          7.062   2.205          6.818   2.26
Post-test
  Debugging Self-Efficacy            6.88    2.729      7.64    2.92          9.13    2.68           8.55    3.334
  Programming Self-Concept           7.75    2.49       8.68    2.418         9.94    1.879          9.32    2.191
  Programming Interest               13.88   3.879      13.91   3.663         16.31   2.676          16.18   2.63
  Programming Anxiety                14.44   2.732      11.55   3.051         12.375  3.757          12.27   2.898
  Programming Aptitude               6.375   2.68       7.14    2.8           8.56    2.421          8.59    2.384

Table 1. Descriptive statistics for pre- and post-test scores on the five dimensions of programming self-beliefs [75] across the four study conditions. Note: programming aptitude is reverse scored for clarity, so higher scores indicate beneficial growth mindset.
RESULTS
Of the 80 participants, two subjects in the individual mouse condition and two subjects in the individual tangibles condition were unable to complete the 10 training levels during the 60-minute study period. As a result, they were removed from subsequent analysis, leaving us with 16 participants in each of the individual conditions and 22 participants in each of the collaborative conditions.

As mentioned earlier, participants who did not actively use the interface in the collaborative conditions could also be confounding results. In order to control for this potential confounding factor, we identified participants who did not actively engage with the interface during training levels (e.g., rarely used the mouse or blocks). We found a total of 1 participant in the TC condition and 4 participants in the MC condition that did not actively use the interface—all other pairs had a fairly even split in usage. As a result, the following statistical tests we use to assess condition differences control for usage and non-usage of the interface (e.g., as a covariate in an ANCOVA). To further ensure the validity of our results, we also removed the participants altogether from a reanalysis, and found the exact same results as presented here with almost identical p values.

Participant Experience and Prior Self-Beliefs
According to a series of ANCOVA tests, participants in the four experimental groups did not differ with regard to prior programming experience [F(3, 71) = .111, p = .953], prior video game experience [F(3, 71) = 1.217, p = .31], and on the five pre-test self-beliefs subscales:
• Debugging self-efficacy: F(3, 71) = 1.187, p = .321
• Programming self-concept: F(3, 71) = 2.195, p = .10
• Programming interest: F(3, 71) = 2.104, p = .11
• Programming anxiety: F(3, 71) = 1.735, p = .168
• Programming aptitude: F(3, 71) = 1.232, p = .305
We can therefore assume that participants in all four groups had similar prior experience and programming self-beliefs.

Programming Self-Beliefs
We first examine participants' post-test programming self-beliefs across the four conditions. Table 1 shows descriptive statistics for pre-test and post-test scores on the five dimensions of programming self-beliefs. To analyze the differences between post-test scores for the five dimensions, we used a series of 2x2 ANCOVA tests with input method and mode of play as the factors and the corresponding pre-test score as a covariate to control for individual differences in programming self-beliefs.
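For readers who want to reproduce this style of analysis, a minimal sketch of one such test is shown below using pandas and statsmodels. The column names and data are synthetic placeholders, and the study's own analysis additionally controlled for interface usage; this only illustrates the general pre-test-as-covariate setup.

```python
# Illustrative 2x2 ANCOVA sketch with placeholder data, in the spirit of the
# analysis described above: input method and mode of play as between-subjects
# factors, with the pre-test score entered as a covariate.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 76  # analyzed sample size in the study; the data below are synthetic
df = pd.DataFrame({
    "input_method": rng.choice(["mouse", "tangibles"], n),
    "mode_of_play": rng.choice(["individual", "collaborative"], n),
    "pre_score": rng.normal(12, 3, n),
})
# Synthetic post-test scores: related to the pre-test plus a small tangibles bump.
df["post_score"] = (df["pre_score"]
                    + np.where(df["input_method"] == "tangibles", 2.0, 0.0)
                    + rng.normal(0, 1, n))

model = smf.ols(
    "post_score ~ pre_score + C(input_method) * C(mode_of_play)", data=df
).fit()
print(sm.stats.anova_lm(model, typ=2))  # F and p for each factor and the interaction
```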

Situational Interest Scores   Mouse Individual   Mouse Collaborative   Tangibles Individual   Tangibles Collaborative
                              M       SD         M       SD            M       SD             M       SD
Exploration Intention         4.844   0.531      4.523   0.631         5.328   0.472          5.216   0.553
Instant Enjoyment             4.663   0.809      4.345   0.693         5.013   0.7284         5.209   0.507
Attention Demand              5.063   0.887      4.739   1.042         5.5     0.5            5.148   0.454
Challenge                     3.594   1.31       3.216   0.943         3.938   1.135          3.466   1.081
Novelty                       4.667   0.741      4.167   1.012         5.021   0.803          4.818   1.012

Table 2. Descriptive statistics for post-test scores on the Situational Interest Scale [10] across the four study conditions.

Figure 8. Means of 7-point Likert scale ratings (+/- SEM) for enjoyment and intentions to replay/recommend across conditions.
Results found a significant main effect for tangibles with respect to improving debugging self-efficacy [F(1, 70) = 4.49, p = .038, ηp² = .06], programming self-concept [F(1, 70) = 4.17, p = .045, ηp² = .056], programming interest [F(1, 70) = 4.088, p = .047, ηp² = .055], and programming aptitude [F(1, 70) = 5.689, p = .02, ηp² = .075]. No other significant effects or interactions were found for these four dimensions (all other p's > .36). We also found a significant main effect for mode of play with respect to collaboration substantially decreasing programming anxiety [F(1, 70) = 6.231, p = .015, ηp² = .082], but no significant main effect for input method or interaction effect (all p's > .28). Overall, it appears that while tangibles aided players in feeling better about their own programming abilities and increasing interest, collaborative play helped players to feel less anxious about the actual act of programming.

Situational Interest
For situational interest, we analyzed post-test scores for the five source dimensions of the Situational Interest Scale [10] (see Table 2 for descriptive statistics of scores across conditions). A series of 2x2 ANCOVA tests (input method and mode of play as factors) showed a significant main effect in favor of tangibles increasing 4 of the 5 dimensions: exploration intention [F(1, 71) = 19.42, p < .001, ηp² = .215], instant enjoyment [F(1, 71) = 13.876, p < .001, ηp² = .163], attention demand [F(1, 71) = 5.365, p = .023, ηp² = .07], and novelty [F(1, 71) = 5.556, p = .021, ηp² = .073]. There were no other significant main effects or interactions found for the four dimensions (all p's > .09). The last dimension of challenge showed no significant main effects for input method [F(1, 71) = 1.079, p = .302, ηp² = .015] or mode of play [F(1, 71) = 1.817, p = .182, ηp² = .025], and no significant interaction effect [F(1, 71) = .081, p = .776, ηp² = .001]. This makes sense considering that participants were playing the same educational programming game with identical levels, so perceived challenge of the game activity should be roughly the same.

Enjoyment and Replayability
We also collected and analyzed subjective ratings on player enjoyment, desire to play a similar or the same game again, and intention to recommend the game to a friend (see Figure 8). A series of 2x2 ANCOVA tests found a significant main effect on all three questions for input method in favor of higher ratings after using the tangibles: enjoyment [F(1, 71) = 14.02, p < .001, ηp² = .165], play again [F(1, 71) = 15.166, p < .001, ηp² = .176], and recommend to friend [F(1, 71) = 7.019, p = .01, ηp² = .09]. Notably, while there were no significant interaction effects (all p's > .08), Figure 8 shows a similar pattern for all 3 questions with MC scoring lowest and TC scoring highest. This suggests that use of tangibles when playing collaboratively might increase enjoyment and replayability for learners, while use of a single mouse when playing collaboratively might be detrimental. There were also no significant main effects for mode of play (all p's > .4).

Performance and Learning Outcomes
To better understand players' learning and performance after training, we also analyzed their ability to apply the concepts learned to a more complex set of problems on a post-test performance task. For the performance task, participants were instructed to complete as many of five levels as possible within a ten-minute time limit (see Figure 3 for the levels). We first consider the average number of completed levels for the conditions with input and mode of play as factors (see Figure 9). A two-way ANCOVA shows no main effect for input method [F(1, 71) = .002, p = .968, ηp² < .001] or mode of play [F(1, 71) = 2.404, p = .125, ηp² = .033]. Similarly, there was no interaction between the factors [F(1, 71) = .122, p = .728, ηp² = .002]. This suggests that neither form of input (mouse and tangibles) nor mode of play (individual and collaborative) impacted the number of levels completed by players on the performance task.

Figure 9. Means for number of levels completed in the performance task out of 5 possible levels (+/- SEM).
Figure 10. Means for number of mistakes made per completed level in the training and performance tasks across conditions.

Figure 11. Means for average time taken to complete a level in the training and performance tasks across conditions.

However, when examining the average number of mistakes per completed level made by players in the four conditions, there is a notable increase in mistakes for mouse conditions during the post-test performance task (see Figure 10). A two-way ANCOVA to control for individual differences in mistakes made during training confirms this, revealing a significant main effect of input method in favor of tangibles making fewer mistakes during performance [F(1, 70) = 26.711, p < .001, ηp² = .276]. Mode of play and interaction between the factors was not significant (all p's > .24).

Additionally, we also examined the average playtime taken to successfully complete a level in the performance task. When computing the playtime of players, we excluded uncompleted levels (i.e., where the participant ran out of time) from the playtime calculation since it is unclear how long it would actually take that participant to complete the level. Using a two-way ANCOVA to control for individual differences in time taken to complete the training levels, we found a significant main effect in favor of the tangibles taking less time to complete a level when compared to the mouse [F(1, 70) = 4.419, p = .039, ηp² = .059]. Again, mode of play and interaction between the factors were not significant (all p's > .434). Looking at the average training and performance times per completed level in Figure 11, we see that tangibles remained relatively consistent for time taken, regardless of training or performance level complexity. On the other hand, the mouse conditions were substantially faster during training—which is reasonable considering the physical differences in speed with which players can program using a mouse in comparison to tangible blocks—but players appear to have struggled with retaining and applying knowledge more during the challenging performance levels; taking substantially longer per level than on the training levels.

DISCUSSION

Tangible Programming Blocks Rock
Our results highlight that incorporating tangibles into the design of an educational programming game can have significant impacts on performance and learning outcomes, as well as on important related factors such as programming self-beliefs, situational interest, enjoyment, and intentions to replay/recommend the game. For programming self-beliefs, tangibles are beneficial for increasing participants' interest in programming and aiding participants in feeling better about their own programming abilities.

Similarly, for situational interest, use of tangibles showed greater exploration intention, instant enjoyment, attention demand, and novelty. No condition showed any difference in challenge—which would be expected for different versions of the exact same game. One aspect to note is that prior work has shown the impact of exergames [56], commercial dance games [30], and active educational video games (AVGs) [83] on situational interest. However, to our knowledge, the effects of tangibles on situational interest have not previously been explored. While these games are more focused on physical activity's influence on situational interest, it would appear that the physicality of manipulating tangibles might have a similar effect.

We also saw that enjoyment and intentions to replay and recommend the game were positively influenced by use of tangibles. This aligns with previous research that has shown 1) tangibles to be more enjoyable than a mouse [32,49,76]; and 2) tangible programming interfaces to be more actively engaging for learners [26,94].

Finally, players performed best when using tangibles. For participants in the tangible conditions, levels were completed in roughly the same amount of time and with roughly the same amount of mistakes regardless of difficulty (i.e., similar performance for both training and challenging performance levels). On the other hand, participants in the mouse conditions appear to have struggled with retaining and applying knowledge from the training levels; taking substantially longer per level and making significantly more mistakes during the harder performance levels.

Advantages and Drawbacks of Collaborative Play
One notable advantage of collaborative play was in its ability to reduce participants' anxiety towards the overall act of programming (programming anxiety), while tangible programming blocks did not demonstrate this effect. Prior work has shown cooperative learning to be an effective tool for reducing participant anxiety towards learning foreign languages [84] and basic computer skills [38]. Our results suggest that this similarly applies towards learning programming in an educational programming game.
On the other hand, collaborative play with a single mouse showed notably poor results when compared to the other conditions for performance, situational interest, and enjoyment/replayability. In our study, use of a single mouse (i.e., single access point) in the collaborative play conditions resulted in one player dominating control of input more frequently than the use of tangibles (i.e., shared access). This mirrors findings from existing single-display groupware, multiple-mice, and multiple-device research which has shown that group usage of a single access point such as a mouse can negatively impact interaction equity when compared to shared access [28,39,47,82].

Ultimately, our results bolster existing research which has suggested that while tangibles are flexible and useful tools for facilitating and enhancing collaboration [25], simply adding additional players to individual single-mice games might not achieve the desired benefits of collaborative play [32].

Future Design Improvements
From our interviews with participants and observations of their behavior during gameplay, we found a few notable design suggestions that would help to both further refine the design of Bots & (Main)Frames, as well as guide future designs for educational programming games. One common suggestion made by almost all participants in the tangible conditions was to make connecting the blocks easier. For participants, connecting blocks was an extraneous and sometimes difficult task that took too long for rapid experimentation. By using some other physical form of connection with similar syntax enforcing properties (e.g., magnets), participants may be able to manipulate the blocks more quickly and effectively for improved performance.

Another frequently made suggestion by more than half of participants in the tangibles conditions was to color code the blocks based on functionality. We created over 40 tangible programming blocks, and participants noted that the space frequently got messy during more challenging problems. This ultimately made identifying desired blocks difficult. Color coding is used in popular educational programming environments (e.g., Scratch [66] and Blockly [16]) as well as some tangible programming interfaces (e.g., Tern [26]), and would likely further benefit players' performance in Bots & (Main)Frames as well.

Finally, while the input design of the tangibles encouraged parallel play and problem solving, collaborative play using a single mouse encouraged a far greater number of instances with sequential turn taking. As a result, future work would also benefit from examining additional collaborative multi-device input designs for educational programming games (e.g., supporting multiple inputs with two mice or using a multi-touch tabletop device).

LIMITATIONS
One of the primary limitations in this paper was the differing forms of access between the mouse (single access point) and tangibles (shared access) for the collaborative conditions. Although comparisons between collaborative use of a single mouse and tangibles have been commonly utilized in existing HCI literature [26,55,93], single access point interfaces have been shown to be detrimental in existing literature due to negative impacts on the equity of participation [28,47]. Specifically, limited access to and control of input has been shown to result in off-task behavior and lower engagement for learners [32]. However, in our analysis we explicitly control for participants that do not actively engage with the interface, so it is unlikely that single access and shared access differences between the mouse and tangibles influenced our results.

Another limitation of this work is the novelty of using tangible interfaces and the potential effects it could have on our results. Addressing the issue of novelty is complicated since doing new things is fun for many people, and would require a longitudinal study design to explore impacts. However, novelty is also a notable component of situational interest [10], and as a result desirable for beneficial long-term learning outcomes.

CONCLUSION AND FUTURE WORK
In this paper, we presented a study that systematically explored how modulating input methods and mode of play in an educational programming game can impact learning outcomes and important related factors. Our results provide concrete evidence that incorporating tangibles into the design of an educational programming game can improve programming self-beliefs (i.e., debugging self-efficacy, programming self-concept, programming interest, programming aptitude), situational interest (i.e., exploration intention, instant enjoyment, attention demand, novelty), subjective feelings towards enjoyment, and intentions to replay/recommend the game in the future. They also provide evidence that, while not as effective in other aspects, collaborative play does reduce players' anxiety towards the act of programming.

Ultimately, we have only just begun to touch upon the impact of tangibles and collaborative play in educational programming games. One notable aspect of our study was the university age group of participants. However, many of the current CS education initiatives—such as Hour of Code [91]—are geared towards the K-12 demographic [5]. In future work, we plan to test our tangible and collaborative designs on younger audiences who may benefit more strongly from such approaches.

REFERENCES
1. Mary Ainley and John Ainley. 2011. Student engagement with science in early adolescence: The contribution of enjoyment to students' continuing interest in learning about science. Contemporary Educational Psychology 36, 1: 4–12.
2. Patricia A Alexander and Tamara L Jetton. 1996. The role of importance and interest in the processing of text. Educational Psychology Review 8, 1: 89–121.
3. Alissa N. Antle, Milena Droumeva, and Greg Corness. 16. N. Fraser. Blockly: A visual programming editor.
2008. Playing with The Sound Maker: Do Embodied Retrieved October 9, 2016 from
Metaphors Help Children Learn? In Proceedings of the https://developers.google.com/blockly/
7th international conference on Interaction design and 17. Daniel Gallardo, Carles F Julia, and Sergi Jorda. 2008.
children - IDC ’08, 178. TurTan: A tangible programming language for creative
4. Ian Arawjo, Cheng-Yao Wang, Andrew C Myers, Erik exploration. In Horizontal Interactive Human
Andersen, and François Guimbretière. 2017. Teaching Computer Systems, 2008. TABLETOP 2008. 3rd IEEE
Programming with Gamified Semantics. In International Workshop on, 89–92.
Proceedings of the 2017 CHI Conference on Human 18. Alessandro Gnoli, Anthony Perritano, Paulo Guerra,
Factors in Computing Systems, 4911–4923. Brenda Lopez, Joel Brown, and Tom Moher. 2014.
5. David Barr, John Harrison, and Leslie Conery. 2011. Back to the future: Embodied Classroom Simulations
Computational Thinking: A Digital Age Skill for of Animal Foraging. In Proceedings of the 8th
Everyone. Learning & Leading with Technology 38, 6: International Conference on Tangible, Embedded and
20–23. Embodied Interaction - TEI ’14, 275–282.
6. Valerie Barr and Chris Stephenson. 2011. Bringing 19. Shuchi Grover and Roy Pea. 2013. Computational
computational thinking to K-12: what is Involved and thinking in K--12: A review of the state of the field.
what is the role of the computer science education Educational Researcher 42, 1: 38–43.
community? ACM Inroads 2, 48–54. 20. Florian Güldenpfennig, Daniel Dudo, and Peter
7. Andrew Begel and Nachiappan Nagappan. 2008. Pair Purgathofer. 2016. Toward Thingy Oriented
programming: what’s in it for me? In Proceedings of Programming: Recording Marcos With Tangibles. In
the Second ACM-IEEE international symposium on Proceedings of the TEI’16: Tenth International
Empirical software engineering and measurement, Conference on Tangible, Embedded, and Embodied
120–128. Interaction, 455–461.
8. Matthew Berland and Victor R. Lee. 2011. 21. Casper Harteveld, Gillian Smith, Gail Carmichael,
Collaborative Strategic Board Games as a Site for Elisabeth Gee, and Carolee Stewart-Gardiner. 2014. A
Distributed Computational Thinking. International Design-Focused Analysis of Games Teaching
Journal of Game-Based Learning 1, 2: 65–81. Computer Science. In Proceedings of Games+
9. Lisa S Blackwell, Kali H Trzesniewski, and Carol Learning+ Society 10.
Sorich Dweck. 2007. Implicit theories of intelligence 22. Andrew Hicks, Barry Peddycord, and Tiffany Barnes.
predict achievement across an adolescent transition: A 2014. Building games to learn from their players:
longitudinal study and an intervention. Child Generating hints in a serious game. Lecture Notes in
development 78, 1: 246–263. Computer Science 8474 LNCS: 312–317.
10. Ang Chen, Paul W Darst, and Robert P Pangrazi. 1999. 23. Suzanne Hidi and Valerie Anderson. 1992. Situational
What constitutes situational interest? Validating a interest and its impact on reading and expository
construct in physical education. Measurement in writing. The role of interest in learning and
Physical Education and Exercise Science 3, 3. development 11: 213–214.
11. Michael Clancy. 2004. Misconceptions and attitudes 24. Suzanne Hidi and K Ann Renninger. 2006. The four-
that interfere with learning to program. Computer phase model of interest development. Educational
science education research: 85–100. psychologist 41, 2: 111–127.
12. Edsger W Dijkstra and others. 1989. On the cruelty of 25. Michael S Horn, R Jordan Crouser, and Marina U Bers.
really teaching computing science. Communications of 2012. Tangible interaction and learning: the case for a
the ACM 32, 12: 1398–1404. hybrid approach. Personal and Ubiquitous Computing
13. Pierre Dillenbourg. 1999. What do you mean by 16, 4: 379–389.
collaborative learning. In Collaborative-learning: 26. Michael S Horn, Erin Treacy Solovey, R Jordan
Cognitive and computational approaches. 1–15. Crouser, and Robert J K Jacob. 2009. Comparing the
14. Carol S Dweck. 2000. Self-theories: Their role in use of tangible and graphical programming languages
motivation, personality, and development. Psychology for informal science education. In Proceedings of the
Press. SIGCHI Conference on Human Factors in Computing
Systems, 975–984.
15. Michael Eagle and Tiffany Barnes. 2009. Experimental
evaluation of an educational game for improved 27. Eva Hornecker and Jacob Buur. 2006. Getting a grip on
learning in introductory computing. In ACM SIGCSE tangible interaction. In Proceedings of the SIGCHI
Bulletin, 321–325. conference on Human Factors in computing systems -
CHI ’06.
28. Eva Hornecker, Paul Marshall, Nick Sheep Dalton, and Yvonne Rogers. 2008. Collaboration and interference: awareness with mice or touch input. In Proceedings of the 2008 ACM conference on Computer supported cooperative work, 167–176.
29. Felix Hu, Ariel Zekelman, Michael Horn, and Frances Judd. 2015. Strawbies: Explorations in Tangible Programming. In Proceedings of the 14th International Conference on Interaction Design and Children - IDC '15, 410–413.
30. Chaoqun Huang and Zan Gao. 2013. Associations between students' situational interest, mastery experiences, and physical activity levels in an interactive dance game. Psychology, health & medicine 18, 2: 233–241.
31. Meriel Huggard. 2004. Programming trauma: Can it be avoided. Proceedings of the BCS Grand Challenges in Computing: Education: 50–51.
32. Kori M Inkpen, Wai-ling Ho-Ching, Oliver Kuederle, Stacey D Scott, and Garth B D Shoemaker. 1999. This is fun! we're all best friends and we're all playing: supporting children's synchronous collaboration. In Proceedings of the 1999 conference on Computer support for collaborative learning, 31.
33. Lisa C Kaczmarczyk, Elizabeth R Petrick, J Philip East, and Geoffrey L Herman. 2010. Identifying student misconceptions of programming. In Proceedings of the 41st ACM technical symposium on Computer science education, 107–111.
34. Martin Kaltenbrunner and Ross Bencina. 2007. reacTIVision: a computer-vision framework for table-based tangible interaction. Proceedings of the 1st international conference on Tangible and embedded interaction: 69–74.
35. Dominic Kao and D. Fox Harrell. 2015. Mazzy: A STEM Learning Game. In Foundations of Digital Games.
36. Dominic Kao and D Fox Harrell. 2015. Exploring the impact of role model avatars on game experience in educational games. In Proceedings of the 2015 Annual Symposium on Computer-Human Interaction in Play, 571–576.
37. Dominic Kao and D Fox Harrell. 2016. Exploring the Impact of Avatar Color on Game Experience in Educational Games. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, 1896–1905.
38. Carolyn M Keeler and Robert Anson. 1995. An assessment of cooperative learning used for basic computer skills instruction in the college classroom. Journal of Educational Computing Research 12, 4: 379–393.
39. Lucinda Kerawalla, Darren Pearce, Nicola Yuill, Rosemary Luckin, and Amanda Harris. 2008. "I'm keeping those there, are you?" The role of a new user interface paradigm--Separate Control of Shared Space (SCOSS)--in the collaborative decision-making process. Computers & Education 50, 1: 193–206.
40. Paivi Kinnunen and Beth Simon. 2010. Experiencing programming assignments in CS1: the emotional toll. In Proceedings of the Sixth international workshop on Computing education research, 77–86.
41. Vishesh Kumar, Tuhina Dargan, Utkarsh Dwivedi, and Poorvi Vijay. 2015. Note Code – A Tangible Music Programming Puzzle Tool. In Proceedings of the 10th International Conference on Tangible, Embedded, and Embodied Interaction - TEI '15, 625–629.
42. Janet Mei-Chuen Lin and Shu-Fen Liu. 2012. An investigation into parent-child collaboration in learning computer programming. Journal of Educational Technology & Society 15, 1: 162.
43. Lisa Linnenbrink-Garcia, Amanda M Durik, AnneMarie M Conley, Kenneth E Barron, John M Tauer, Stuart A Karabenick, and Judith M Harackiewicz. 2010. Measuring situational interest in academic domains. Educational and psychological measurement 70, 4: 647–671.
44. H Lode, G Franchi, and N Frederiksen. 2013. Machineers: playfully introducing programming to children. In CHI '13 Human Factors in Computing Systems, 2639–2642.
45. Yiping Lou, Philip C Abrami, John C Spence, Catherine Poulsen, Bette Chambers, and Sylvia d'Apollonia. 1996. Within-class grouping: A meta-analysis. Review of educational research 66, 4: 423–458.
46. Herbert W Marsh and Andrew J Martin. 2011. Academic self-concept and academic achievement: Relations and causal ordering. British Journal of Educational Psychology 81, 1: 59–77.
47. Paul Marshall, Eva Hornecker, Richard Morris, Nick Sheep Dalton, and Yvonne Rogers. 2008. When the fingers do the talking: A study of group participation with varying constraints to a tabletop interface. In Horizontal Interactive Human Computer Systems, 2008. TABLETOP 2008.
48. Timothy S McNerney. 2004. From turtles to Tangible Programming Bricks: explorations in physical language design. Personal and Ubiquitous Computing 8, 5: 326–337.
49. Edward F Melcer, Victoria Hollis, and Katherine Isbister. 2017. Tangibles vs. Mouse in Educational Programming Games: Influences on Enjoyment and Self-Beliefs. In CHI'17 Extended Abstracts.
50. Edward Melcer and Katherine Isbister. 2016. Bridging the Physical Learning Divides: A Design Framework
for Embodied Learning Games and Simulations. In Proceedings of the 1st International Joint Conference of DiGRA and FDG.
51. Edward Melcer and Katherine Isbister. 2016. Bridging the Physical Divide: A Design Framework for Embodied Learning Games and Simulations. In CHI'16 Extended Abstracts, 2225–2233.
52. Edward Melcer and Katherine Isbister. 2017. Embodiment, collaboration, and challenge in educational programming games: exploring use of tangibles and mouse. In Proceedings of the 12th International Conference on the Foundations of Digital Games, 62.
53. Dejana Mullins, Nikol Rummel, and Hans Spada. 2011. Are two heads always better than one? Differential effects of collaboration on students' computer-supported learning in mathematics. International Journal of Computer-Supported Collaborative Learning 6, 3: 421–443.
54. Nachiappan Nagappan, Laurie Williams, Miriam Ferzli, Eric Wiebe, Kai Yang, Carol Miller, and Suzanne Balik. 2003. Improving the CS1 experience with pair programming. ACM SIGCSE Bulletin 35, 1: 359–362.
55. Izabel C Olson, Zeina Atrash Leong, Uri Wilensky, and Mike S Horn. 2011. It's just a toolbar!: using tangibles to help children manage conflict around a multi-touch tabletop. In Proceedings of the fifth international conference on Tangible, embedded, and embodied interaction, 29–36.
56. Denis Pasco, Cédric Roure, Gilles Kermarrec, Zachary Pope, and Zan Gao. 2017. The effects of a bike active video game on players' physical activity and motivation. Journal of Sport and Health Science 6, 1: 25–32.
57. Arnold Pears, Stephen Seidman, Lauri Malmi, Linda Mannila, Elizabeth Adams, Jens Bennedsen, Marie Devlin, and James Paterson. 2007. A survey of literature on the teaching of introductory programming. ACM SIGCSE Bulletin 39, 4: 204–223.
58. Reinhard Pekrun. 2006. The control-value theory of achievement emotions: Assumptions, corollaries, and implications for educational research and practice. Educational psychology review 18, 4: 315–341.
59. Reinhard Pekrun and Elizabeth J Stephens. 2010. Achievement emotions: A control-value approach. Social and Personality Psychology Compass 4, 4: 238–255.
60. Jan L. Plass, Paul A. O'Keefe, Bruce D. Homer, Jennifer Case, Elizabeth O. Hayward, Murphy Stein, and Ken Perlin. 2013. The impact of individual, competitive, and collaborative mathematics game play on learning, performance, and motivation. Journal of Educational Psychology 105, 4: 1050–1066.
61. Wim T. J. L. Pouw, Tamara van Gog, and Fred Paas. 2014. An Embedded and Embodied Cognition Review of Instructional Manipulatives. Educational Psychology Review 26, 1: 51–72.
62. Sara Price, Yvonne Rogers, M. Scaife, D. Stanton, and H. Neale. 2003. Using tangibles to promote novel forms of playful learning. Interacting with Computers 15, 2: 169–185.
63. S Price, J G Sheridan, T Pontual-Falcao, and G Roussos. 2008. Towards a framework for investigating tangible environments for learning. International Journal of Arts and Technology 1, 3/4: 351–368.
64. Vennila Ramalingam, Deborah LaBelle, and Susan Wiedenbeck. 2004. Self-efficacy and mental models in learning to program. In ACM SIGCSE Bulletin, 171–175.
65. Ann Renninger, Suzanne Hidi, and Andreas Krapp. 2014. The role of interest in learning and development. Psychology Press.
66. Mitchel Resnick, John Maloney, Andrés Monroy-Hernández, Natalie Rusk, Evelyn Eastmond, Karen Brennan, Amon Millner, Eric Rosenbaum, Jay Silver, Brian Silverman, and Yasmin Kafai. 2009. Scratch: Programming for All. Communications of the ACM 52: 60–67. https://doi.org/10.1145/1592761.1592779
67. Janko Roettgers. 2016. Tangible Play, Maker of Osmo Augmented Reality Toys, Raises $24 Million From Sesame Workshop, Others. Variety. Retrieved from http://variety.com/2016/digital/news/osmo-25-million-funding-1201936425/
68. Christine Rogerson and Elsje Scott. 2010. The fear factor: How it affects students learning to program in a tertiary environment. Journal of Information Technology Education 9, 1: 147–171.
69. Marcos Román-González, Juan-Carlos Pérez-González, and Carmen Jiménez-Fernández. 2017. Which cognitive abilities underlie computational thinking? Criterion validity of the Computational Thinking Test. Computers in Human Behavior 72: 678–691.
70. Ulrich Schiefele. 1999. Interest and learning from text. Scientific studies of reading 3, 3: 257–279.
71. Gregory Schraw and Stephen Lehman. 2001. Situational interest: A review of the literature and directions for future research. Educational psychology review 13, 1: 23–52.
72. Eric Schweikardt and Md Gross. 2008. The robot is the program: interacting with roBlocks. In Proceedings of the second international conference on Tangible, embedded, and embodied interaction - TEI '08, 167–168.
73. Michael J Scott and Gheorghita Ghinea. 2014. On the domain-specificity of mindsets: The relationship between aptitude beliefs and programming practice. IEEE Transactions on Education 57, 3: 169–174.
74. Michael James Scott and Gheorghita Ghinea. 2013. Educating programmers: A reflection on barriers to deliberate practice. In Proceedings of the 2nd Annual HEA STEM Conference.
75. Michael James Scott and Gheorghita Ghinea. 2014. Measuring enrichment: the assembly and validation of an instrument to assess student self-beliefs in CS1. In Proceedings of the tenth annual conference on International computing education research, 123–130.
76. Stacey D Scott, Regan L Mandryk, and Kori M Inkpen. 2003. Understanding children's collaborative interactions in shared environments. Journal of Computer Assisted Learning 19, 2: 220–228.
77. Orit Shaer and Eva Hornecker. 2010. Tangible user interfaces: past, present, and future directions. Foundations and Trends in Human-Computer Interaction 3, 1–2: 1–137.
78. Tia Shelley, Leilah Lyons, Moira Zellner, and Emily Minor. 2011. Evaluating the Embodiment Benefits of a paper-based TUI for Spatially Sensitive Simulations. In Extended Abstracts of the 2011 Conference on Human Factors in Computing Systems, 1375.
79. Alexander Skulmowski, Simon Pradel, Tom Kühnert, Guido Brunnett, and Günter Daniel Rey. 2016. Embodied learning using a tangible user interface: The effects of haptic perception and selective pointing on a spatial learning task. Computers & Education 92–93: 64–75.
80. R J W Sluis, Ivo Weevers, CHGJ Van Schijndel, Lyuba Kolos-Mazuryk, Siska Fitrianie, and JBOS Martens. 2004. Read-It: five-to-seven-year-old children learn to read in a tabletop environment. In Proceedings of the 2004 conference on Interaction design and children: building a community, 73–80.
81. Taylor Soper. 2015. Microsoft, UW participate in "Hour of Code" events to get students interested in computer science. GeekWire. Retrieved from https://www.geekwire.com/2015/microsoft-uw-participate-hour-code-events-get-students-interested-computer-science/
82. Danae Stanton and H R Neale. 2003. The effects of multiple mice on children's talk and interaction. Journal of Computer Assisted Learning 19, 2: 229–238.
83. Haichun Sun and Yong Gao. 2016. Impact of an active educational video game on children's motivation, science knowledge, and physical activity. Journal of Sport and Health Science 5, 2: 239–245.
84. Ornprapat Suwantarathip and Saovapa Wichadee. 2010. The impacts of cooperative learning on anxiety and proficiency in an EFL class. Journal of College Teaching and Learning 7, 11: 51.
85. Hideyuki Suzuki and Hiroshi Kato. 1993. AlgoBlock: a tangible programming language, a tool for collaborative learning. In Proceedings of 4th European Logo Conference, 297–303.
86. Hideyuki Suzuki and Hiroshi Kato. 1995. Interaction-level support for collaborative learning: AlgoBlock—an open programming language. In The first international conference on Computer support for collaborative learning, 349–355.
87. Suzanne E Wade, Gregory Schraw, William M Buxton, and Michael T Hayes. 1993. Seduction of the strategic reader: Effects of interest on strategies and recall. Reading Research Quarterly: 93–114.
88. Danli Wang, Lan Zhang, Chao Xu, Haichen Hu, and Yunfeng Qi. 2016. A Tangible Embedded Programming System to Convey Event-Handling Concept. In Proceedings of the TEI'16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction, 133–140.
89. Danli Wang, Yang Zhang, Tianyuan Gu, Liang He, and Hongan Wang. 2012. E-Block: a tangible programming tool for children. In Adjunct proceedings of the 25th annual ACM symposium on User interface software and technology, 71–72.
90. Chun-wang Wei, Hsin-hung Chen, and Nian-shing Chen. 2015. Effects of Embodiment-Based Learning on Perceived Cooperation Process and Social Flow. In 7th World Conference on Educational Sciences, 608–613.
91. Cameron Wilson. 2014. Hour of code: we can solve the diversity problem in computer science. ACM Inroads 5, 4: 22.
92. Peta Wyeth. 2008. How Young Children Learn to Program With Sensor, Action, and Logic Blocks. Journal of the Learning Sciences 17, 4: 517–550.
93. Lesley Xie, Alissa N Antle, and Nima Motamedi. 2008. Are tangibles more fun?: comparing children's enjoyment and engagement using physical, graphical and tangible user interfaces. In Proceedings of the 2nd international conference on Tangible and embedded interaction, 191–198.
94. Oren Zuckerman and Ayelet Gal-Oz. 2013. To TUI or not to TUI: Evaluating performance and preference in tangible vs. graphical user interfaces. International Journal of Human-Computer Studies 71, 7: 803–820.