
KA2 - Data Analysis

Section 1: Students
The students in this study are from middle- to upper-middle-class families in Istanbul, Turkey. They all attend a private middle school and are in grades 6 through 8. Their average age is 12.6 years. There are 21 students in all, 13 male and 8 female, and they are attending a game design summer camp. This information was retrieved from the following article:

Akcaoglu, M. (2014). Learning problem-solving through making games at the game design and learning summer program. Educational Technology Research and Development, 62(5), 583-600.
Section 2: Course
The Game Design and Learning (GDL) program is a 10-day summer program designed to teach a specific kind of computer programming. The primary purpose of the program is to teach the students digital game design. The secondary purpose is to teach the students problem-solving skills by giving them practice in solving complex problems. To reach these goals, four types of activities were offered during the program: game design, problem-solving, troubleshooting, and free design. The assessment used to measure any change in the students' problem solving was created by the Programme for International Student Assessment (PISA). This test was given on the first and last days of the program. The assessment had a total of nineteen items that covered system analysis and design, decision making, and troubleshooting. The questions were in the form of multiple choice and short answer items. This information was retrieved from the following article:

Akcaoglu, M. (2014). Learning problem-solving through making games at the game design and learning summer program. Educational Technology Research and Development, 62(5), 583-600.

Section 3: Descriptive Analysis


The information used for analysis is the post-program assessment data. The mean of the student scores is approximately 15.42, the median score is 16, and the standard deviation is approximately 5.01. The assessment items covered three question types: decision making, troubleshooting, and system analysis and design. The questions were presented in multiple choice, open constructed response, and closed constructed response formats. Based on the data, the items that seemed to be the most difficult for the students to answer were questions X423Q01 and X423Q02. Both were troubleshooting questions, and both were presented in the multiple choice format. In fact, of the six questions with an accuracy of 20% or lower, five are multiple choice. It is possible that the students had problems with the way the questions or answers were written.

Mean: 15.42
Median: 16
Standard deviation: 5.01
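
For reference, the descriptive statistics above can be reproduced with Python's statistics module, as in the sketch below. The score list is only a placeholder standing in for the 19 actual post-test totals from the spreadsheet.

import statistics

# Placeholder scores: substitute the 19 actual post-test totals from the spreadsheet.
scores = [16, 12, 19, 15, 18, 10, 17, 14, 16, 20, 13, 18, 9, 16, 17, 11, 19, 15, 18]

mean = statistics.mean(scores)      # arithmetic average of the scores
median = statistics.median(scores)  # middle value of the sorted scores
sd = statistics.stdev(scores)       # sample standard deviation

print(f"Mean: {mean:.2f}, Median: {median}, SD: {sd:.2f}")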

First Half of the Data Spreadsheet

Second Half of the Data Spreadsheet

Question Type Difficulty (chart)
CCR = Closed Constructed Response, OCR = Open Constructed Response, MC = Multiple Choice
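
As a companion to the chart, the sketch below shows one way per-item accuracy could be computed and the items at or below 20% accuracy identified. The score lists are placeholders; only the two item IDs named in Section 3 are taken from the report.

# Placeholder data: item ID -> list of per-student scores (1 = correct, 0 = incorrect).
item_scores = {
    "X423Q01": [0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0],
    "X423Q02": [0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0],
    # ... one entry for each of the remaining items
}

def accuracy(scores):
    """Proportion of students who answered the item correctly."""
    return sum(scores) / len(scores)

item_accuracy = {item: accuracy(scores) for item, scores in item_scores.items()}
hard_items = [item for item, acc in item_accuracy.items() if acc <= 0.20]
print("Items at or below 20% accuracy:", hard_items)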

Reliability
Using the Spearman-Brown (split-half) reliability formula, shown below, I calculated the reliability of this assessment. Rounded to the nearest hundredth, the reliability was 0.82, which makes the assessment fairly reliable. I think the assessment was reliable because it assessed what the program instructors taught: the students knew what to expect and were given activities that prepared them for what they would encounter.

Spearman-Brown
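
The Spearman-Brown calculation itself is the split-half correction r_SB = 2r / (1 + r), where r is the correlation between students' scores on the two halves of the test. Below is a minimal Python sketch of that calculation, splitting the items into odd- and even-numbered halves (one common choice); the response matrix is a placeholder, not the actual data, and statistics.correlation requires Python 3.10 or newer.

import statistics

# Placeholder response matrix: one row per student, one column per item (1 = correct, 0 = incorrect).
responses = [
    [1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0],
    [1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 1],
    # ... one row for each of the 19 students
]

# Split each student's items into two halves (odd- vs. even-numbered items).
half_a = [sum(row[0::2]) for row in responses]
half_b = [sum(row[1::2]) for row in responses]

# Correlation between the two half-test scores, then the Spearman-Brown correction.
r = statistics.correlation(half_a, half_b)
reliability = 2 * r / (1 + r)
print(f"Split-half reliability (Spearman-Brown): {reliability:.2f}")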

Section 4: Analysis of student strengths and weaknesses


Below is a chart of each individual student's strengths and weaknesses on the PISA assessment. Based on the item difficulty data and how each student performed, decision making and system analysis and design were the areas where students performed the best. Students were less accurate on troubleshooting questions and on questions in the multiple choice format. This could mean several things. First, we need to look at the questions: it is possible that they were poorly written, too difficult, or presented in the wrong format. Next, we need to look at instruction: was troubleshooting explained properly? Were there enough activities for the students to become familiar enough with it to know how to work through a problem?

Student ID | Strengths | Weaknesses
1 | Decision making and system analysis and design | Troubleshooting
2 | System analysis and design questions in the open constructed response format | Troubleshooting and decision making
3 | Received full credit on one decision making question | Decision making, system analysis and design, and troubleshooting; also had problems with open constructed response
4 | Decision making | System analysis and design and troubleshooting
5 | System analysis and design | Troubleshooting
6 | Achieved at least partial credit on most open constructed response questions | Decision making, system analysis and design, and troubleshooting
7 | Received at least half credit on two system analysis questions | Decision making, system analysis and design, and troubleshooting; also system analysis and design in the open constructed response format
10 | Received full credit on one system analysis question | System analysis and design in the open constructed response format
11 | Was able to achieve at least partial credit for twelve of the 19 questions | System analysis and design in the multiple choice format, and decision making
12 | Received full credit on one decision making question | System analysis and design, and decision making
13 | Decision making and system analysis and design | System analysis and design in other formats, decision making, and troubleshooting in all questioning formats
14 | Decision making and system analysis and design | Showed some weakness in all question types and formats
15 | System analysis and design in the open constructed response format and decision making | Decision making, system analysis and design, and troubleshooting; also all questioning formats
16 | Received at least half credit on four constructed response type questions and full credit on an open constructed response system analysis question | None
17 | No real strengths, but does receive at least partial credit for most open constructed response questions | Troubleshooting, and system analysis and design in the multiple choice format
18 | System analysis and design, and decision making | Troubleshooting
19 | | Decision making, system analysis and design, and troubleshooting; also all questioning formats
21 | System analysis and design | Open constructed response type questions
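
As an illustration of how a breakdown like the one above could be tallied from the raw spreadsheet, here is a small Python sketch that computes each student's accuracy by question type. The item-to-type mapping and the scores shown are placeholders (only X423Q01 and X423Q02 are item IDs taken from the report), not the actual data.

from collections import defaultdict

# Placeholder mapping of item IDs to question types; ITEM03 and ITEM04 are hypothetical IDs.
item_type = {
    "X423Q01": "Troubleshooting",
    "X423Q02": "Troubleshooting",
    "ITEM03": "Decision making",
    "ITEM04": "System analysis and design",
}

# Placeholder scores: student ID -> {item ID: points earned (0, 0.5, or 1)}.
student_scores = {
    1: {"X423Q01": 0, "X423Q02": 0, "ITEM03": 1, "ITEM04": 1},
    2: {"X423Q01": 0, "X423Q02": 1, "ITEM03": 0.5, "ITEM04": 1},
}

for student, scores in student_scores.items():
    earned = defaultdict(float)
    possible = defaultdict(float)
    for item, points in scores.items():
        qtype = item_type[item]
        earned[qtype] += points
        possible[qtype] += 1  # each item is worth one point in this sketch
    summary = {qtype: round(earned[qtype] / possible[qtype], 2) for qtype in possible}
    print(f"Student {student}: {summary}")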

Section 5: Improvement Plan


In order to improve future assessment results, there are several things that could be done. The first is to look at the questions and the question formats. It may be best to remove the multiple choice format altogether, since the type of information being assessed lends itself more to open and closed constructed responses. Next, since teaching problem-solving is one of the main goals of the program, the instructors will have to look at their strategies for teaching problem-solving through troubleshooting. They may need to spend more time on it than they have previously.
