
Peer Evaluation of Student Generated Content

Jared Tritz
University of Michigan Department of Physics, Ann Arbor, MI
jtritz@umich.edu

Nicole Michelotti
University of Michigan Department of Physics, Ann Arbor, MI
michelot@umich.edu

Ginger Shultz
University of Michigan Department of Chemistry, Ann Arbor, MI
gshultz@umich.edu

Tim McKay
University of Michigan Department of Physics, Ann Arbor, MI
tamckay@umich.edu

Barsaa Mohapatra
University of Michigan Department of Ind. Oper. Eng., Ann Arbor, MI
barsaasm@umich.edu

ABSTRACT
We will present three similar studies that examine online peer evaluation of student-generated explanations for missed exam problems in introductory physics. In the first study, students created video solutions using YouTube; in the latter two studies, they created written solutions using Google documents. All peer evaluations were performed using a tournament module that is part of the interactive online coaching system called E2 Coach [4] at the University of Michigan. With the theme of LAK 2014 being the intersection of learning analytics research, theory, and practice, we think this poster will provide an accessible example that combines a classroom experiment with rigorous analysis to understand outcomes.


Categories and Subject Descriptors


K.3.0 [Computers and Education]: General

General Terms
peer evaluation, tournaments, videos, Google Docs, blended learning

1. INTRODUCTION
First inspired by ubiquitous screen capture technology, we conducted an experiment that encourages students to create video solutions to an exam problem they got wrong. Knowing that there would probably be a wide range of quality in the content produced, we also wanted to provide an efficient means of filtering the content using a student-powered peer review system. To accomplish this, we implemented a process whereby students were assigned a problem they got wrong and asked to submit their redo solution to a tournament-style peer review process. Borrowing techniques from cognitive psychology [5] to stimulate learning in the hard sciences, and measuring the effect, is becoming an open area of exploration [2]. To activate meta-cognitive reflection, students were asked to emphasize any mistake(s) they made on the exam in their solution explanation. For the first study (Table 1), in Physics 240 (Electricity and Magnetism for Engineers), students had only one chance to make a video solution and participate in peer review, for the second midterm. In the later parallel studies in Physics 135 (Mechanics for Life Scientists) and Physics 235 (Electricity and Magnetism for Life Scientists), we altered the submission requirement to use written Google documents instead of videos. The study was also expanded to include all midterms in order to explore student participation and learning effects over time.

Course     Term  Mode   Students  Chances  Submits
Phys 240   W12   video  352       1        206
Phys 135   F13   doc    396       3        647
Phys 235   F13   doc    222       3        284

Table 1: Description of the three similar studies.

In all studies, students were given extra credit equivalent to one exam question (5% = 1/20 questions) for each chance in which they participated, where participation meant creating a solution and reviewing two assigned solutions to select the one they thought was better. Appropriately, this extra credit is not included in the exam data analysis.
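To make the tournament mechanics concrete, the sketch below pairs submitted solutions for head-to-head review and tallies reviewer votes. This is a minimal illustration in Python; the random pairing strategy and the names pair_solutions and tally_votes are our own assumptions for exposition, not the actual E2 Coach tournament module.

import random
from collections import Counter

def pair_solutions(solution_ids, seed=0):
    """Randomly pair submitted solutions for head-to-head peer review.

    Hypothetical sketch: the real tournament module may seed and pair
    solutions differently. An odd solution out simply sits this round out.
    """
    rng = random.Random(seed)
    ids = list(solution_ids)
    rng.shuffle(ids)
    return [(ids[i], ids[i + 1]) for i in range(0, len(ids) - 1, 2)]

def tally_votes(votes):
    """Count how often each solution was chosen as the better of its pair.

    `votes` is a list of (winner_id, loser_id) tuples, one per reviewer.
    """
    wins = Counter(winner for winner, _ in votes)
    return wins.most_common()

# Example: six submissions; each reviewer is shown one pair and picks one.
pairs = pair_solutions(["s1", "s2", "s3", "s4", "s5", "s6"])
votes = [(a, b) for a, b in pairs]  # pretend the first of each pair won
print(tally_votes(votes))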

2. PROBLEM ASSIGNMENT

We wanted individual students to revisit the problems that were most salient to them; therefore, most students were assigned the easiest problem they got wrong. This was to help ensure the best opportunity for learning and reflection at the boundary of their understanding. We had to balance this against the somewhat arbitrary goal of assigning each problem an equal number of times. For voting, we tried to assign students the second easiest problem they got wrong, again in an attempt to make the process most useful for them.
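A minimal sketch of this assignment heuristic follows, assuming each student's set of missed problems and a per-problem difficulty score (for example, the fraction of the class that missed it). The function and parameter names are illustrative, not the production logic.

from collections import Counter

def assign_problems(missed, difficulty, cap):
    """Assign each student the easiest problem they missed, while keeping
    the number of assignments per problem roughly balanced.

    missed:     dict mapping student -> set of problem ids answered wrong
    difficulty: dict mapping problem id -> assumed difficulty proxy
                (e.g. fraction of class that missed it; lower = easier)
    cap:        maximum number of students assigned any one problem
    """
    counts = Counter()
    assignment = {}
    for student, problems in missed.items():
        # Walk the student's missed problems from easiest to hardest and
        # take the first one that has not yet hit the balance cap.
        for pid in sorted(problems, key=lambda p: difficulty[p]):
            if counts[pid] < cap:
                assignment[student] = pid
                counts[pid] += 1
                break
    return assignment

# Example: student "u1" missed problems 3 and 7; problem 3 is easier.
# assign_problems({"u1": {3, 7}}, {3: 0.2, 7: 0.6}, cap=20) -> {"u1": 3}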

3. PARTICIPATION AND PERFORMANCE


While data and results from Phys 235 and 240 will be presented on the poster, for the purposes of this abstract we will focus on Phys 135. The Fall 2013 term of Physics 135 also had the most students, the most chances to participate, and the highest fraction of participants, making it the most statistically significant dataset.

Group identifier  Participations per student  Number of students per group
A                 0                           94
B                 1                           69
C                 2                           121
D                 3                           112

Table 2: Physics 135 F13 groups by participation.

There are many ways to analyze this kind of data, but we want to identify any noticeable performance difference between participants and non-participants. Others have established evidence of effective teaching methods using pre/post testing [1]; however, in rapidly changing environments where students make real-time choices about learning resources, a more flexible approach is needed. In general, reducing a noticeable difference to a quantified effect in this kind of data is plagued with sensitivities to hidden factors, abnormal distributions, noise, and over-fitting. Matching techniques have been studied as a way of extracting this kind of information from observational data [3]. Therefore, we cautiously apply and present a simple one-parameter matching technique as a way to diminish selection bias and provide discriminating magnification of a small signal.
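The matching step can be sketched as follows, assuming a per-student record of incoming GPA and final exam score. The 0.1 GPA window mirrors the tolerance reported with Figure 3; the greedy nearest-neighbor pairing is one plausible realization of one-parameter matching, not necessarily the exact procedure used.

def match_on_gpa(participants, nonparticipants, tol=0.1):
    """Greedy one-parameter matching: pair each participant with the
    closest unused non-participant whose GPA is within `tol`.

    Both inputs are lists of (student_id, gpa, exam_score) tuples.
    Returns the mean exam-score difference over matched pairs.
    """
    pool = list(nonparticipants)
    used = set()
    diffs = []
    for sid, gpa, score in participants:
        best = None
        for j, (nid, ngpa, nscore) in enumerate(pool):
            if j in used or abs(ngpa - gpa) > tol:
                continue
            # Keep the candidate with the smallest GPA gap seen so far.
            if best is None or abs(ngpa - gpa) < abs(pool[best][1] - gpa):
                best = j
        if best is not None:
            used.add(best)
            diffs.append(score - pool[best][2])
    return sum(diffs) / len(diffs) if diffs else float("nan")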

Figure 1: Physics 135 F13 final exam scores.
Figure 2: Physics 135 F13 incoming GPAs (35 freshmen not included).

To motivate the matching technique, Figures 1 and 2 illustrate that group A has a slightly lower GPA, and slightly lower final exam scores, than group D. The GPA plot is included to show that the selection bias is visible in the GPA, a number that described the students before they started the class. Ignoring the 15 freshmen in group A and the 9 freshmen in group D, who do not have an incoming GPA, the difference between D and A is (3.54 - 3.29 = 0.25), or 6.3% of the 4 point scale. This is similar to the difference in final exam scores, which is (82.6 - 76.0) = 6.6 points on the 100 point scale.

Figure 3: Comparison of the average difference in final exam scores between two groups: the 112 students who gained credit three times (group D) and the 94 students who never participated (group A). The bell curve represents the probability distribution for randomly drawing two groups of sizes 112 and 94 from the class and getting that difference in exam scores. The pink bar is placed at the actual difference (82.6 - 76.0 = 6.6 points) observed between groups D and A, and the green bar is placed at the estimated difference between the two groups after matching on GPA to remove the selection bias. A 1.5 sigma effect of a 3.6 exam point benefit is observed after matching on GPA (within 0.1 GPA points). The width of the bars represents the error in their position.
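The bell curve in Figure 3 corresponds to a simple permutation test: repeatedly draw two disjoint groups of the observed sizes at random from the class and record the difference in mean final exam scores. A minimal sketch, with illustrative names of our own choosing:

import random

def permutation_null(scores, n_a=94, n_d=112, trials=10000, seed=0):
    """Null distribution of the difference in mean exam scores when
    groups of size n_d and n_a are drawn at random from the class.

    scores: list of final exam scores for the whole class.
    Returns a list of simulated (mean_d - mean_a) differences.
    """
    rng = random.Random(seed)
    diffs = []
    for _ in range(trials):
        # Draw both groups without replacement so they are disjoint.
        sample = rng.sample(scores, n_a + n_d)
        group_d, group_a = sample[:n_d], sample[n_d:]
        diffs.append(sum(group_d) / n_d - sum(group_a) / n_a)
    return diffs

Comparing the observed 6.6 point difference (or the 3.6 point matched estimate) against the spread of this null distribution yields a significance level like the 1.5 sigma quoted in Figure 3.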

4. CONCLUSIONS

We present a novel approach to engaging students creatively in their physics coursework and a reasonable approach to measuring its effectiveness. Participation in this creative process and peer evaluation may have a positive effect on student performance, which we measured to be a 3.6 point gain on the final exam with a 1.5 sigma significance.

5. REFERENCES

[1] C. H. Crouch and E. Mazur. Peer Instruction: Ten years of experience and results. American Journal of Physics, 69(9):970, Sept. 2001.
[2] L. Deslauriers, E. Schelew, and C. Wieman. Improved learning in a large-enrollment physics class. Science, 332(6031):862-864, May 2011.
[3] B. B. Hansen. Full matching in an observational study of coaching for the SAT. Journal of the American Statistical Association, 99(467):609-618, Sept. 2004.
[4] T. McKay, K. Miller, and J. Tritz. What to do with actionable intelligence: E2 Coach as an intervention engine. In Proceedings of Learning Analytics and Knowledge 2012, pages 88-91, 2012.
[5] G. Schraw. Promoting general metacognitive awareness. Instructional Science, 26:113-125, 1998.
