
Effect of Assessment Delivery 1

Running Head: Effect of Assessment Delivery

The Effect of Assessment Delivery on Students' End Results


Nadia P. Williams
Kennesaw State University


Introduction
Much literature has been written on the use of games within the classroom environment
for the purpose of heightening student engagement. Published and peer-reviewed
articles have emerged in outlets ranging from the Academy of Educational Leadership Journal
to BizEd. That said, in the current environment of high-stakes testing within education (NCLB,
2002), one could wonder whether there is a correlation between test performance and the
method of delivery. How, then, could games have an effect on classroom learning?
According to Carlson (2013), differing instructional methods have an effect upon
student learning. When students are offered the opportunity to learn through lecture, videos,
discussion, brainstorming, individual projects, and presentations, only four of these options
provide students with a high level of engagement which they, in turn, interpret as heightening
their own critical thinking within the classroom (Carlson, 2013). These four methods are lecture
with discussion, brainstorming, discussion, and individual projects (Carlson, 2013). Educators
must be sure to stay in tune with what engages their specific student clientele to ensure that the
concepts being delivered through instruction are in a format to heighten their efficacy with such
students (Carlson, 2013). Thus, teachers must be conscious while designing their lessons to
create ones that do heighten student engagement as there is a correlation between student
engagement and how well they perform in class (Ozturk, 2012). Such self-reflection, as in the
student self-video, can also add another layer to designing lessons and instruction that will
heighten student engagement (Maloney, Storr, Paynter, Morgan, & Ilic, 2013). Furthermore, teachers
must be provided with the appropriate level of autonomy in order to remain connected with their
student clientele (Ozturk, 2012). In fact, teachers with sufficient knowledge, skills, and
motivation contribute at a higher level to the effective use of the newest teaching methods and
materials (Ozturk, 2012). The inclusion of an instructional method perceived as new, such as
gaming, particularly of a technological sort, will likewise have an effect on student learning.
The question posed by Dian Schaffhauser serves as one founding point of this research:
"Can Gaming Improve Teaching and Learning?" (Schaffhauser, 2013). The article refers back to
the ways in which games have actually been a part of the classroom and that, in fact, as was
stated by Katie Salen, a game designer and DePaul University professor, "play is the way that
human beings learn about the world" (Schaffhauser, 2013). Play makes student learning fun.
That said, play can disguise a challenging problem in a package that the students will want to
uncover (Schaffhauser, 2013). The best games are disguised to make problems both hard and
fun so that students will want to continue to persist on that problem (Schaffhauser, 2013). The
efficacy of a game-based delivery system hinges upon its design. Thus, a well-designed
gamification system "has the most impact on the middle 40 percent to 60 percent of the
students" ("Gamifying the Classroom," 2012). Top-performing students will always be motivated
to do their best; however, other students will indeed be motivated to do more work through
gamification ("Gamifying the Classroom," 2012). Providing students with regular, almost-tangible
rewards, such as digital badges, is one way in which a game-like environment could be created
and maintained (Bell, 2013). In the end, otherwise unmotivated students are indeed motivated
to work harder through video game-like assignment delivery (Hunt, 2013).
With assessment methods remaining relatively standardized as a result of No Child
Left Behind and the more recent Race to the Top legislation, what alternative assessments are
being utilized, and how are they working to change the landscape of educational assessments?
As a result of Race to the Top and the adoption of the Common Core State Standards, the State
of Georgia has adopted a new assessment system, the Georgia Milestones, in which students
will answer questions in multiple-choice and written formats. Such a shift from the previously
standard multiple-choice tests continues to send a strong message to students about "what
counts" in the learning environment and curriculum (Weurlander, Söderberg, Scheja, Hult, &
Wernerson, 2012). Thus, will a shift toward more computer-based testing be effective?
According to research by Stacy McDougall, Kelli Vessoyan, and Brent Duncan (2012), the use
of augmentative and alternative communication does appear to work with a variety of groups as
an assessment tool. Thus, within the realm of education, these findings may prove useful.

Statement of the Problem


In keeping with the mandates set forth by Race to the Top legislation and the
implementation of the Common Core curriculum, the State of Georgia's Department of
Education is implementing its Common Core-based statewide assessment this year. The
Georgia Milestones were brought into the fold after the PARCC assessment was found to be
too expensive. This exam is structured differently from its predecessor, the CRCT, which was
solely multiple choice. The Georgia Milestones will include multiple-choice, short-answer, and
longer-form responses. As an educator, I am concerned that this shift in assessment types may
yield different results even if the same standards-based instructional methods are implemented.
Thus, the existence of this new test brings about a number of unknowns, mainly centering
on student performance, since students have not previously been issued a test in this manner,
nor have current educators been state-mandated to administer such an exam.

Purpose of the Study


The purpose of this study is to evaluate whether the way in which an assessment is
delivered will have an impact on the students' end results. While it can be assumed that
well-prepared students will perform well on any type of exam, could the format encourage
them to perform either better or worse than they otherwise would?



Theoretical Framework
This study primarily employs constructivist theory. From a constructivist
lens, one goal of this study is to evaluate whether students' being social impacts their
assessment data. Furthermore, as constructivist theory supposes that learning is influenced
by language, the language employed in this study could relate to non-verbal language. Thus,
the non-verbal cues would be those set forth within the administration methods of the
assessments. As Lev Vygotsky identified, children learn a great deal in a play-filled
environment. As a result, an assessment presented in a game-type format should appear less
threatening than the typical multiple-choice exam.

Research Questions and Hypotheses


This study will be mostly quantitative in nature but will also have some qualitative
elements. Thus the research questions for those portions are as follows:
Quantitative:
Do different types of assessments yield different results?
Do students respond better to an assessment delivered in a game-like
environment?
By how many percentage points can a student's score be affected by exam
delivery alone?
Qualitative:
How do students feel about assessments in general?
How do students feel they respond emotionally to different assessment types?
Do students feel more of an incentive to perform better on an assessment
presented in more of a game-like fashion?



With the aforementioned research questions, I hypothesize that the assessment type will indeed
affect the final results yielded. Furthermore, I surmise that the students will experience less test
anxiety with an exam presented in a game-type format and will thus perform better than on a
standard multiple-choice exam.

Literature Review
Much literature has been written on the use of games within the classroom environment
for the purpose of heightening student engagement. Published and peer-reviewed
articles have emerged in outlets ranging from the Academy of Educational Leadership Journal
to BizEd. That said, in the current environment of high-stakes testing within education (NCLB,
2002), one could wonder whether there is a correlation between test performance and the
method of delivery. How, then, could games have an effect on classroom learning?
According to Carlson (2013), differing instructional methods have an effect upon
student learning. When students are offered the opportunity to learn through lecture, videos,
discussion, brainstorming, individual projects, and presentations, only four of these options
provide students with a high level of engagement which they, in turn, interpret as heightening
their own critical thinking within the classroom (Carlson, 2013). These four methods are lecture
with discussion, brainstorming, discussion, and individual projects (Carlson, 2013). Educators
must be sure to stay in tune with what engages their specific student clientele to ensure that the
concepts being delivered through instruction are in a format to heighten their efficacy with such
students (Carlson, 2013). Thus, teachers must be conscious while designing their lessons to
create ones that do heighten student engagement as there is a correlation between student
engagement and how well they perform in class (Ozturk, 2012). Such self-reflection, as in the
student self-video, can also add another layer to designing lessons and instruction that will
heighten student engagement (Maloney, Storr, Paynter, Morgan, & Ilic, 2013). Furthermore, teachers
must be provided with the appropriate level of autonomy in order to remain connected with their
student clientele (Ozturk, 2012). In fact, teachers with sufficient knowledge, skills, and
motivation contribute at a higher level to the effective use of the newest teaching methods and
materials (Ozturk, 2012). The inclusion of an instructional method perceived as new, such as
gaming, particularly of a technological sort, will likewise have an effect on student learning.
The question posed by Dian Schaffhauser serves as one founding point of this research:
"Can Gaming Improve Teaching and Learning?" (Schaffhauser, 2013). The article refers back to
the ways in which games have actually been a part of the classroom and that, in fact, as was
stated by Katie Salen, a game designer and DePaul University professor, "play is the way that
human beings learn about the world" (Schaffhauser, 2013). Play makes student learning fun.
That said, play can disguise a challenging problem in a package that the students will want to
uncover (Schaffhauser, 2013). The best games are disguised to make problems both hard and
fun so that students will want to continue to persist on that problem (Schaffhauser, 2013). The
efficacy of a game-based delivery system hinges upon its design. Thus, a well-designed
gamification system "has the most impact on the middle 40 percent to 60 percent of the
students" ("Gamifying the Classroom," 2012). Top-performing students will always be motivated
to do their best; however, other students will indeed be motivated to do more work through
gamification ("Gamifying the Classroom," 2012). Providing students with regular, almost-tangible
rewards, such as digital badges, is one way in which a game-like environment could be created
and maintained (Bell, 2013). In the end, otherwise unmotivated students are indeed motivated
to work harder through video game-like assignment delivery (Hunt, 2013).
With assessment methods remaining relatively standardized as a result of No Child
Left Behind and the more recent Race to the Top legislation, what alternative assessments are
being utilized, and how are they working to change the landscape of educational assessments?
As a result of Race to the Top and the adoption of the Common Core State Standards, the State
of Georgia has adopted a new assessment system, the Georgia Milestones, in which students
will answer questions in multiple-choice and written formats. Such a shift from the previously
standard multiple-choice tests continues to send a strong message to students about "what
counts" in the learning environment and curriculum (Weurlander, Söderberg, Scheja, Hult, &
Wernerson, 2012). Thus, will a shift toward more computer-based testing be effective?
According to research by Stacy McDougall, Kelli Vessoyan, and Brent Duncan (2012), the use
of augmentative and alternative communication does appear to work with a variety of groups as
an assessment tool. Thus, within the realm of education, these findings may prove useful.

Methodology Design
Setting and Participants
This study will be conducted with the students within my 8th grade Advanced
Content/Gifted English Language Arts classroom. Though these students represent a small
portion of my suburban Atlanta, Georgia middle school, they will serve as a valid basis to begin
such a study. It is understood at the outset that this group of students tends to test well, as
evidenced by their placement within an Advanced Content class on the basis of the Georgia
Department of Education's Criterion-Referenced Competency Test (CRCT); repeating this study
with students who are not already meeting or surpassing grade-level standards may therefore
yield different results in future iterations.

Design
The students will participate in a short assessment on five differing English Language
Arts topics, taking part in either a paper-based version of the assessment or the game-based
assessment. The students will be divided into two distinct, yet randomly selected, groups: one
group that takes the paper assessment and one group that participates in the game-based
assessment. Each group will be composed of an equal number of males and females as nearly
as possible, and much care will be given to ensure that the average CRCT scores of the
students in English Language Arts and Reading within each group will be roughly the same
(within 5 points). These groups will be administered either the paper or the game-based
assessment on the same day to see if the students do better or worse depending on the type of
assessment presented. By administering the assessment on the same day, the hope is that all
environmental factors within the classroom will be the same for both groups. Furthermore, each
group will be provided with the same instructions in order to maintain consistency of their
perceived expectations. The students will be instructed to:
1. Answer each question to the best of his or her ability, and to
2. Work individually.
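The balancing constraints described above (equal gender counts and mean CRCT scores within 5 points of one another) can be sketched as a simple shuffle-and-check split. The roster, names, and scores below are hypothetical, not the study's actual participants.

```python
import random

# Hypothetical roster: (name, gender, CRCT ELA score); illustrative values only.
students = [
    ("A", "F", 850), ("B", "M", 862), ("C", "F", 871), ("D", "M", 845),
    ("E", "F", 858), ("F", "M", 869), ("G", "F", 843), ("H", "M", 876),
]

def split_balanced(roster, max_gap=5, seed=0):
    """Randomly split the roster in half, retrying until the halves have
    equal gender counts and mean CRCT scores within max_gap points."""
    rng = random.Random(seed)
    pool = list(roster)
    while True:
        rng.shuffle(pool)
        paper, kahoot = pool[: len(pool) // 2], pool[len(pool) // 2 :]
        females = lambda grp: sum(1 for _, g, _ in grp if g == "F")
        mean = lambda grp: sum(s for _, _, s in grp) / len(grp)
        if females(paper) == females(kahoot) and abs(mean(paper) - mean(kahoot)) <= max_gap:
            return paper, kahoot

paper_group, kahoot_group = split_balanced(students)
```

Because the split is re-drawn until both constraints hold, the groups stay comparable without hand-picking students.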

Data Collection
As mentioned in the research design section, the students will take part in a short
assessment over five different English Language Arts concepts and be separated into two
groups. For the paper-based assessments, the students' responses will be hand-scored. For
the game-based assessments, the students will make use of their personal electronic devices
(i.e., smart phones, iPads, eReaders, tablets, etc.) in order to participate in an assessment
delivered via Kahoot. Kahoot, found at www.GetKahoot.com, is an online assessment tool that
is free to use. Once the user creates an account, he or she can choose to create a new quiz,
discussion, or poll, called a Kahoot. Should the user opt not to create an original Kahoot, he or
she can use an existing Kahoot created by another user. Once the Kahoot is launched for
student use, the teacher or facilitator (previously referred to as the user) will direct the students
as they work against a thirty-second clock to respond to the posted question. Unless otherwise
decided by the facilitator, each question is displayed with accompanying upbeat, carnival-like
music. At the end of each question, a leaderboard with the standings of the top five students is
shown along with their posted points. This online assessment tool provides instant results and
standings based upon which students respond correctly the quickest. The results are
automatically aggregated into an Excel-compatible .CSV file, which will be downloaded at the
end of the administration of the assessment. These results will then be converted into points,
with students gaining one point per correct question, so that they can be compared to those of
the students in the paper assessment group to see if the assessment method has an effect on
the students' success or failure on that assessment. Furthermore, both assessment types will
feature the exact same questions with the same answer options and wording so that question
wording does not stand as a possible variable that could affect the final results.
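The conversion from Kahoot's speed-weighted export to simple one-point-per-correct scores might look like the sketch below. The column names in the .CSV are assumptions for illustration; the actual layout of Kahoot's export may differ.

```python
import csv, io

# Hypothetical Kahoot export; column names are assumed for illustration.
# Kahoot awards speed-weighted points, so we recount plain 1-point-per-correct
# scores to make the results comparable with the hand-scored paper group.
raw = io.StringIO(
    "player,question,correct,points\n"
    "Gandalf,1,True,940\n"
    "Gandalf,2,False,0\n"
    "Lylo,1,True,720\n"
    "Lylo,2,True,655\n"
)

scores = {}
for row in csv.DictReader(raw):
    scores.setdefault(row["player"], 0)
    if row["correct"] == "True":   # ignore the speed bonus; count 1 point
        scores[row["player"]] += 1

print(scores)  # {'Gandalf': 1, 'Lylo': 2}
```

In the study itself, the same recount would be applied to the downloaded .CSV file rather than an in-memory string.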

Data Analysis
As stated above, the assessments will be graded either through the Kahoot online
assessment system or by hand for the paper-issued assessments. In order to double-check the
results, the students who are administered the paper assessment will be provided with the
correct answers. This will serve to decrease the inherent fallibility of a human scorer.
The data will be input into spreadsheets in order to evaluate how each student fared
over the course of the assessments. They will be organized in order to identify the
following:
 Whether or not the type of assessment issued affected student success between the
assessment groups,
 Whether or not the type of assessment affected student success for individual students,
 Whether the student responses overall improved or worsened as the assessment
progressed, and
 How each class or group of students responded to the assessment type administered.

The organization of the data along these four avenues will once again work to eliminate
any variables beyond the types of assessments delivered. Furthermore, by looking at the
assessment types' effects on individual students, other unforeseen correlations may start to
emerge that may also present interesting information that could drive which types of
assessments educators choose to utilize for gauging students' standards mastery.
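The first comparison listed above (group-level success by assessment type) reduces to comparing mean scores across groups; a minimal sketch follows, with made-up student identifiers and scores rather than the study's data.

```python
# Hypothetical records: (student, assessment group, score out of 5).
results = [
    ("s1", "paper", 4), ("s2", "paper", 5), ("s3", "paper", 4),
    ("s4", "kahoot", 3), ("s5", "kahoot", 2), ("s6", "kahoot", 4),
]

def group_mean(records, group):
    """Mean score (out of 5) for one assessment group."""
    scores = [s for _, g, s in records if g == group]
    return sum(scores) / len(scores)

diff = group_mean(results, "paper") - group_mean(results, "kahoot")
print(round(diff, 2))  # 1.33: the paper group outscored the Kahoot group
```

The same per-group aggregation, applied per student and per question, covers the remaining three avenues of analysis.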

Results
Descriptive Analysis of Data

The students participated in either a traditional or a game-style version of the same
assessment of five general English Language Arts content questions. In analyzing the data I
collected comparing how my students fared in an assessment delivered through traditional
means (paper and pencil) versus a game-type method (using Kahoot), I sorted the results in a
variety of ways. First, the data were collected and recorded into different spreadsheets. Each
class's traditional assessment results were kept in separate sheets from the results of the
Kahoot version. Thereafter, the averages of correct versus incorrect answers were compiled
into a chart alongside a comparison of the percentage of students that answered each question
correctly (Table A).



Table A.

Class    Avg. Correct    Avg. Correct    Avg. Incorrect    Avg. Incorrect
Period   out of 5        out of 5        out of 5          out of 5
         (Paper)         (Kahoot)        (Paper)           (Kahoot)
4th      4.18            3.00            0.72              1.83
5th      4.00            2.86            1.42              1.79
6th      3.82            3.25            1.18              1.75
All      4.00            3.036667        1.106667          1.79

I conducted a single-factor ANOVA to examine differences among the class-by-class
averages of correct and incorrect answers between the two assessment types tested.
Most importantly, by looking at the p-value, one can determine whether or not there is a
significant difference between the results. In this instance, the p-value for correct
answers across the two formats was 0.003355, less than the accepted alpha value of
0.05, indicating that the difference was indeed significant. The same was true with
regard to the p-value for the difference between the traditional (paper) assessment's
incorrect answers and those of the Kahoot version of the assessment (Table B, Table C).
That value, 0.029753, is also less than 0.05 and is therefore significant.

Table B
Anova: Single Factor (Correct Answers)

SUMMARY
Groups                   Count   Sum     Average     Variance
Correct Answers Paper    3       12      4           0.0324
Correct Answers Kahoot   3       9.11    3.036667    0.039033

ANOVA
Source of Variation    SS         df    MS          F           P-value     F crit
Between Groups         1.392017   1     1.392017    38.97387    0.003355    7.708647
Within Groups          0.142867   4     0.035717
Total                  1.534883   5

Table C
Anova: Single Factor (Incorrect Answers)

SUMMARY
Groups                     Count   Sum     Average     Variance
Incorrect Answers Paper    3       3.32    1.106667    0.126533
Incorrect Answers Kahoot   3       5.37    1.79        0.0016

ANOVA
Source of Variation    SS         df    MS          F           P-value     F crit
Between Groups         0.700417   1     0.700417    10.93262    0.029753    7.708647
Within Groups          0.256267   4     0.064067
Total                  0.956683   5
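The reported F statistic can be re-derived from the class averages in Table A. The sketch below recomputes the single-factor ANOVA F value for the correct-answer comparison by hand, using the standard between/within sum-of-squares formulas rather than any particular spreadsheet's implementation.

```python
# Class-level averages from Table A (three classes per assessment type).
paper  = [4.18, 4.00, 3.82]   # average correct answers per class, paper
kahoot = [3.00, 2.86, 3.25]   # average correct answers per class, Kahoot

def anova_f(*groups):
    """Single-factor ANOVA F statistic: between-group mean square over
    within-group mean square."""
    n = sum(len(g) for g in groups)            # total observations
    k = len(groups)                            # number of groups
    grand = sum(sum(g) for g in groups) / n    # grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

f = anova_f(paper, kahoot)
print(round(f, 2))  # 38.97, well above the F-critical value of 7.708647
```

The result matches the F value in Table B, and with df = (1, 4) it corresponds to the reported p-value of 0.003355.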

The data collected from this experiment indicate that, contrary to my initial hypothesis,
the students clearly did far better with the traditional assessment than with the game-type
Kahoot assessment. This finding comes from the actual number of items correct (out of a total
of five questions) as well as from the average number of items correct from class to class.
Furthermore, a higher percentage of students answered each question correctly on the paper
assessment than on the game-based Kahoot assessment (Table D).

Table D

Class    Q1       Q1       Q2        Q2       Q3        Q3        Q4       Q4       Q5        Q5
Period   Paper    Kahoot   Paper     Kahoot   Paper     Kahoot    Paper    Kahoot   Paper     Kahoot
4th      45.00%   25.00%   100.00%   67.67%   100.00%   100.00%   81.00%   16.67%   100.00%   91.67%
5th      64.28%   35.71%   92.86%    64.29%   100.00%   85.71%    43.86%   7.14%    100.00%   92.86%
6th      54.55%   58.33%   90.90%    66.67%   100.00%   100.00%   36.36%   16.67%   100.00%   83.33%
All      55.00%   39.68%   95.00%    66.21%   100.00%   95.24%    54.00%   13.49%   100.00%   89.29%
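Each cell in Table D is a simple proportion over a 0/1 response matrix, computed per question within one class and format. The responses below are made up for illustration, not the study's data.

```python
# Hypothetical 0/1 responses (1 = correct) for one class and one format;
# each question maps to the list of per-student results.
responses = {
    "Q1": [1, 0, 0, 1, 0],
    "Q2": [1, 1, 1, 1, 1],
}

# Percentage of students answering each question correctly, as in Table D.
pct = {q: 100 * sum(r) / len(r) for q, r in responses.items()}
print(pct)  # {'Q1': 40.0, 'Q2': 100.0}
```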

Though a total of four classes participated within this study, the results of only three of
them were analyzed. The first class's assessments presented several challenges that
made the resulting data inaccurate. First, the students were not evenly divided between
the assessment types. This inequality makes it such that the percentages and
averages assessed would be skewed more heavily in the direction of those participating
in the Kahoot, since more students participated in that format. Secondly, the second question
required the students to identify a specific element within a plot diagram. Unfortunately,
the diagram was mistakenly left off of the paper assessment but was in the Kahoot
version. This error was corrected in the subsequent administrations within the classes
that followed (Appendix A).

Inferential Analysis
The change in assessment format yielded a difference in the students' demeanor toward the
activity. While both groups received the same instructions, in each class I observed that the
students experienced challenges in maintaining general assessment protocol while participating
in the Kahoot version of the assessment. Both assessment groups were provided with the
same instructions, identified below:
1. Answer each question to the best of your ability.
2. Work individually.
Even with these clear-cut and standard instructions, an interesting phenomenon arose
with the administration of each of the Kahoot versions of the assessment: the students worked
collaboratively and consulted one another whilst also competing against one another (Appendix
D). Thus, I surmised that because this was a more game-like format, and games tend to be
social in nature, the students were naturally inclined to engage with one another. Furthermore,
this supported my observation that the students did not take the Kahoot assessment as
seriously as did the groups which responded to the standard paper assessment. As is
evidenced by their choice of screen names in both my fourth and fifth period classes, the
majority of students chose nicknames and used emoticons (small digital images in lieu of
Roman characters or letters) as part of their names as well. This lighthearted approach
supports the inference that the students viewed this version of the assessment as carrying
lesser academic weight.

Implications for Further Study


Should this or a similar study be replicated, I would recommend that the facilitator be aware of
the students' attitudes toward each assessment type and include an element where student
interviews are part of the data collection process. Furthermore, the type of game-like
assessment could be changed to one where time is less of a factor in being awarded game
points, as is the case with Kahoot, so that students are more inclined to take as much time as
necessary to answer correctly instead of being driven to answer first. I would also suggest that
other iterations of this study be reframed so that the questions asked require different thinking
processes from the students. For example, instead of responding to multiple-choice questions,
the students could be asked to provide definitions of content terms or vice versa, defend their
position on a topic using supporting details from a specific source (i.e., the defense of a side of
an argument presented within a work of literary fiction), or provide an example of a content term
shown or provided.



Appendices
A. Assessment Questions (Paper version shown, same questions used in Kahoot
version)


B. Paper Assessment Results by Class Period

4th Period (Paper)

Student   Gender   Correct   Incorrect   Q1   Q2   Q3   Q4   Q5
Har       M        4         0           1    1    1    1    1
Les       F        3         2           0    1    1    0    1
Qui       M        4         1           0    1    1    1    1
Nik       F        5         0           1    1    1    1    1
May       F        4         1           0    1    1    1    1
Hay       F        4         1           0    1    1    1    1
Chr       M        4         1           0    1    1    1    1
Kat       F        5         0           1    1    1    1    1
Ram       M        4         1           0    1    1    1    1
Mar       M        5         0           1    1    1    1    1
Fel                4         1           1    1    1    0    1
Average            4.18      0.73
Total correct                            5    11   11   9    11
Percent correct                          45.00%   100.00%   100.00%   81.00%   100.00%

5th Period (Paper)

Student   Gender   Correct   Incorrect   Q1   Q2   Q3   Q4   Q5
Eli       M        3         2           0    1    1    0    1
Ste       M        4         1           1    0    1    1    1
Sou       F        5         1           1    1    1    1    1
Ela       F        3         2           0    1    1    0    1
Cam       F        5         0           1    1    1    1    1
Mar       M        5         0           1    1    1    1    1
Jac       F        4         1           1    1    1    0    1
Nic       F        4         1           1    1    1    0    1
Ang       F        4         1           1    1    1    0    1
Wal       M        3         2           0    1    1    0    1
Kai       M        5         0           1    1    1    1    1
Gus       M        5         5           1    1    1    1    1
Ben       M        3         2           0    1    1    0    1
Pen       M        3         2           0    1    1    0    1
Average            4.00      1.43
Total correct                            9    13   14   6    14
Percent correct                          64.28%   92.86%   100.00%   43.86%   100.00%

6th Period (Paper)

Student   Gender   Correct   Incorrect   Q1   Q2   Q3   Q4   Q5
Sel       M        4         1           1    1    1    0    1
Car       M        3         2           0    1    1    0    1
Sof       F        4         1           0    1    1    1    1
CJ        M        5         0           1    1    1    1    1
Bri       F        3         2           0    1    1    0    1
Ade       F        3         2           0    1    1    0    1
Gin       F        3         2           0    1    1    0    1
Ali       M        5         0           1    1    1    1    1
Pax       M        5         0           1    1    1    1    1
Chl       F        4         1           1    1    1    0    1
Jav       F        3         2           1    0    1    0    1
Average            3.81      1.18
Total correct                            6    10   11   4    11
Percent correct                          54.55%   90.90%   100.00%   36.36%   100.00%

C. Kahoot Assessment Results by Class Period


4th Period (Kahoot)

Students (screen names): Lylo, Rahjshiba, Jeff-f-f Dunham, Celene?, Yourmom.com,
Korean Jesus, DonutJudgeAlex, ht, Yourmom.org, Gandalf, #NoahC.is life, Reeeeex Meerton
Average correct: 3.00   Average incorrect: 1.83
Percent correct: Q1 25.00%, Q2 67.67%, Q3 100.00%, Q4 16.67%, Q5 91.67%

5th Period (Kahoot)

Student          Gender   Correct   Incorrect   Q1   Q2   Q3   Q4   Q5
Mariah           F        4         1           1    1    1    0    1
Ava              F        4         1           1    1    1    0    1
Pluto            F        4         0           1    1    1    0    1
Eddie.savage.    M        4         1           0    1    1    1    1
Johnny           M        3         2           0    1    1    0    1
Kellen           M        3         2           0    1    1    0    1
Frank            M        3         2           0    1    1    0    1
Mikayla          F        3         2           0    1    1    0    1
Rikki (Tyler)    F        3         2           1    0    1    0    1
AJ               M        3         2           0    1    1    0    1
taylor           F        2         3           0    0    1    0    1
Angie            F        2         3           0    0    1    0    1
Ellie            M        1         0           1    0    0    0    0
Average                   2.86      1.79
Percent correct                                 35.71%   64.29%   85.71%   7.14%   92.86%

6th Period (Kahoot)

Students (screen names): Maribel, Kennede, Zach, eema, adoniA$, Jordan, Donavan,
Courtney, Chaz, Angel, Audrey, Dylan
Correct answers per student: 5, 5, 4, 4, 3, 3, 3, 3, 3, 2, 2
Incorrect answers per student: 0, 0, 1, 1, 2, 2, 2, 2, 2, 3, 3
Average correct: 3.25   Average incorrect: 1.75
Percent correct: Q1 58.33%, Q2 66.67%, Q3 100.00%, Q4 16.67%, Q5 83.33%



D. Class-by-Class Observations
3rd Period (27 total)
Study procedures: Students who had a phone on them completed the Kahoot version, while
the non-phone-carrying students completed a paper and pencil version. The students
completing the paper version did so in isolation from their peers as their peers were logging on
to Kahoot. Thus, 23 students did the Kahoot as opposed to 4 students doing the paper and
pencil version. Error of note: the students taking the paper and pencil version did not have the
diagram for #2 to identify the other name for denouement, so I voided that question; none of
those students got the answer correct. For next time: I will have a drawing of the diagram on
the whiteboard to assist the students completing the paper and pencil version, and I will have
the paper quiz students sit toward the front of the room.
Observations/Rationale: The students who did the paper and pencil version finished quite
quickly and asked no questions. The students working with Kahoot could not avoid
collaboration; they discussed answers and worked in tandem with one another even though it
was a competition. After the first question, I told them not to discuss the answers, which ended
the collaboration, but the students were still vocal and extremely excited to participate. Another
observation: Kahoot's game-like format encourages students to use funny and silly nicknames
unless otherwise specified.

4th Period (24 total, 1 student absent)
Study procedures: I divided the class as close to halves as possible: 12 students in the Kahoot
group and 11 in the paper and pencil group. I also worked to have each group be as balanced
as possible gender-wise. I asked for volunteers who did have cell phones or devices on them
to join the paper and pencil group; luckily, a handful of students obliged and volunteered with
little hesitation. Before the Kahoot began, I told the Kahoot students not to share answers and
to treat it like a real quiz as much as possible.
Observations/Rationale: The students were still a little vocal during Kahoot, but they were not
sharing answers. The students still wanted to use silly nicknames, whereas the paper and
pencil students put their actual given names.

5th Period (27 total)
Study procedures: The students were divided in the same manner as in 4th period, split as
close to true halves as possible: 14 students in the paper and pencil group and 13 in the
Kahoot group. As observed before, the students in the Kahoot group could not resist including
nicknames. In this class, however, some students chose to put the names of other students
and even of myself; as a result, I had to take more time in order to have the students register
correctly for the Kahoot.
Observations/Rationale: The paper and pencil group had no challenges with knowing how to
act or approach the task, even when I specified that they were to treat this as a true quiz.

6th Period (24 total, 1 absent)
Study procedures: This class was split almost in half: 11 students took the paper and pencil
test, whereas 12 took the Kahoot version. This class did not have issues with nicknames too
divergent from their first names, but they were instructed NOT to put nicknames into the login
portion. Some students taking the paper and pencil test failed to see that the diagram for #2
was on the whiteboard even though they had been notified of this verbally; in fact, one student
(a girl) waited until #2 was displayed through Kahoot in order to complete the quiz.
Observations/Rationale: The students were extremely competitive, and students who typically
do moderately well on quizzes and tests were very motivated to have their names show up on
the leaderboard.


References

Alkan, F. (2013). The effect of alternative assessment techniques on chemistry competency
perceptions and chemistry success of prospective science teachers. Journal of Baltic
Science Education, 12(6), 774-783.

Bell, M. (2013). I'll Show You My Badges if You'll Show Me Yours!. Internet@Schools,
20(3), 23-25.

Carlson, S. C. (2013). Instructional methods influence critical thinking: Do students and
instructors agree? Academy of Educational Leadership Journal, 17(1), 27-32.

Gamifying The Classroom. (2012). BizEd, 11(6), 52-53.

Hunt, M. W. (2013). Video & Sound Production: Flip Out! Game On!. Techniques:
Connecting Education & Careers, 88(1), 36-38.

Maloney, S., Storr, M., Paynter, S., Morgan, P., & Ilic, D. (2013). Investigating the
efficacy of practical skill teaching: A pilot-study comparing three educational methods.
Advances in Health Sciences Education: Theory and Practice, 18(1), 71-80.
doi:10.1007/s10459-012-9355-2



McDougall, S., Vessoyan, K., & Duncan, B. (2012). Traditional Versus Computerized
Presentation and Response Methods on a Structured AAC Assessment Tool. AAC:
Augmentative & Alternative Communication, 28(2), 127-135.
doi:10.3109/07434618.2012.677958

Öztürk, İ. (2012). Teacher's role and autonomy in instructional planning: The case
of secondary school history teachers with regard to the preparation and
implementation of annual instructional plans. Educational Sciences: Theory & Practice,
12(1), 295-299.

Schaffhauser, D. (2013). Can gaming improve teaching and learning? (cover story).
T.H.E. Journal, 40(8), 26-33.

Smimou, K., & Dahl, D. W. (2012). On the relationship between students' perceptions
of teaching quality, methods of assessment, and satisfaction. Journal of Education for
Business, 87(1), 22-35. doi:10.1080/08832323.2010.550339

Weurlander, M., Söderberg, M., Scheja, M., Hult, H., & Wernerson, A. (2012). Exploring
formative assessment as a tool for learning: Students' experiences of different methods
of formative assessment. Assessment & Evaluation in Higher Education, 37(6), 747-760.
doi:10.1080/02602938.2011.572153
