
Meridian: A K-16 School Computer Technologies Journal

ISSN: 1097-9778

Effective Methods for 21st Century Learning:


A Teacher Action Research Project
Aaron Johnson
Virginia Tech

Abstract:
This teacher action research project examines the effectiveness of three different
instructional methodologies over the course of two instructional units in a technology-rich,
middle school social studies classroom. The instructional methods examined in this study
included a student-centered/tech-heavy form of instruction, a teacher-driven or traditional form
of instruction, and a hybrid or blended form of instruction. Pre- and post-assessments were
implemented for both instructional units of this study, in addition to student surveys and
interviews. Findings suggest the hybrid form of instruction was more effective than both the
traditional and student-centered/tech-heavy instructional methods with respect to students'
overall preferences for learning and learning outcomes. Other significant findings to emerge
included that student choice of essay items may be a factor in increasing students' writing
scores, that the use of technology can enhance student learning if appropriately balanced with
teacher involvement, and that, when coupled with student choice, raising expectations for
student writing may yield greater writing outcomes, including higher quality and more detailed
written responses.
Keywords: 21st century learning, middle school social studies, 1:1 environment, effective
strategies for learning

Effective Methods for 21st Century Learning:


A Teacher Action Research Project

Introduction:

Since becoming a classroom teacher six years ago, I have experimented with a variety of
instructional methods and approaches to learning with my students to see what works best. Two
years ago my district adopted several technology-based initiatives, including both the S.T.E.M.
(Science, Technology, Engineering, and Mathematics) and 1:1 computing initiatives that
provided me with an incredible amount of tech-based tools, which opened many new avenues for
instruction. With a classroom full of 21st century technology, I have anecdotally observed an
increase in student engagement with content - apathy has been replaced (in many cases) with
more receptive and open student dispositions toward learning. Increased engagement, however,
does not always equal successful instruction or meaningful educational experiences for students.

With this in mind, I began to ponder my effectiveness as a teacher and to ask whether my
instruction, which has become increasingly reliant on technology, is contributing to student
development in my classroom. This professional reflection and curiosity ultimately led me to my
teacher research question: In a 21st century learning environment, what instructional methods
and practices are most effective with my students?
To address my teacher research question I decided to design a study with the use of three
different instructional methods over the course of two instructional units. The instructional
methods chosen for my study represented methods or aspects of methods I frequently used with
my students - these methods included a student-centered/tech-heavy method of instruction, a
teacher-driven or traditional method of instruction, and a hybrid or blended method of
instruction, which included elements of both the student-centered/tech-heavy and traditional
methods of instruction. The first unit of my study compared the learning outcomes of students
receiving the student-centered/tech-heavy instruction to that of students who received the
traditional method of instruction. The second and final unit of my study featured both groups of
students receiving the hybrid method of instruction.
For the sake of clarity, I will identify the methods and practices I used for this study
beginning with the student-centered/tech-heavy form of instruction. This method, in my
classroom, is usually accompanied with or in conjunction with a presentation or project to
display student content understanding. The idea behind this method is to engage students in
the content first, instead of the intuitive pedagogy of providing foundational knowledge before
application (Parker et al., 2011). The student-centered/tech-heavy method included elements of
both Project Based and Problem Based Learning. Project Based Learning provides an alternative
to lecture and recitation approaches by applying knowledge in novel ways that result in an end
product or project (Hernandez-Ramos & De La Paz, 2009). Problem Based Learning fosters
student engagement in a problem, without preparatory study, requiring students to extend their
learning by pursuing a solution or answer to that problem (Wirkala & Kuhn, 2011). The students
who received this method of instruction, Group A, were presented with the task of digitally
telling the story of America's first permanent British settlement, Jamestown, using the web 2.0
tool Voicethread. Voicethread enables students to insert and arrange images onto a presentation
storyboard. With the use of a microphone, Voicethread then allows students to record narrations
over the images, explaining the significance of the image and how it fits into the larger narrative,
which in this case was the story of Jamestown. A rubric was given to students to establish
expectations, procedures, and guidelines for how their project would be assessed. Before
engaging in the content and putting together the digital narrative, Group A was presented with
two historical questions to help guide their subsequent studies. The questions presented to
students were as follows: "What specific challenges faced the early settlers at Jamestown?" and
"How was the settlement able to survive despite facing overwhelming odds?" With these two
historical questions in mind, and a brief web tool tutorial, Group A then began their Jamestown
digital narratives. The students received no direct content instruction - I (as the instructor)
served only as facilitator as the students independently pursued answers to their guiding
questions. The students were given two weeks to complete their digital narratives.
The students who received the traditional method of instruction, Group B, were also
given two weeks of classroom time to cover the Jamestown unit. Unlike the recipients of the

Volume 16 / 2013

http://www.ced.ncsu.edu/meridian | meridianmail@ncsu.edu | All rights reserved by the authors.
student-centered/tech-heavy method of instruction, Group B received significant amounts of
direct instruction. The instruction consisted of teacher-delivered lectures along with class
discussions, multiple text readings (from both text books and printed articles) with corresponding
comprehension activities, two videos which also featured discussion opportunities and
corresponding assignments, and one graphic organizer which was used in conjunction with two
printed articles. The textbook readings came from the McDougal Littell publication Creating
America (pages 82-89); the online articles came via Learn NC
(http://www.learnnc.org/lp/editions/nchist-colonial/2029) and Powhatan Renape Nation
(http://www.powhatan.org/pocc.html). The two videos were used to support the text readings by
offering a visual component to student learning and included National Geographic's Nightmare
at Jamestown and NOVA's Pocahontas Revealed. Two reading strategies were employed
during this instruction, teacher model reading and silent, independent reading. No significant
assessments occurred during this unit, i.e. no student-created projects or major tests - the grades
collected for this unit were primarily calculated based on assignment completion and class
participation.
For the second unit of my study, the students from both groups (A & B) received the
same method of instruction, which was the hybrid method of instruction. This method of
instruction contained elements of both the student-centered/tech-heavy instruction and the
traditional method of instruction. The topic for this unit of study was Colonial Carolina. The
instruction for the Colonial Carolina unit consisted of teacher led lectures, text readings with
corresponding class discussions, the use of the Cornell Note taking strategy - with the inclusion
of a student reflection piece, and the use of the web 2.0 tool Glogster. Before beginning the
Colonial Carolina unit, students were given a rubric, which featured specific guidelines,
procedures, and expectations for the end product, a Glogster presentation. Glogster is an
engaging tool that functions much like an online scrapbook and allows students to demonstrate
their understandings in a personalized way. The text readings for the Colonial Carolina unit
came from the Pearson/Prentice Hall textbook publication of North Carolina: Land of
Contrasts (pages 86-115). The reading strategies that were used for this unit included the use of
teacher model reading, silent and independent reading, and the use of an on-line, audio version of
the text. After completing each section of readings, both groups of students utilized the Cornell
Note taking strategy and recorded what they considered to be the most significant aspects of that
section. Upon the completion of Cornell Notes, students were to then use Glogster to create a
presentation that would reflect the text readings and their own understandings of the content.
The hybrid form of instruction for this aspect of my study featured the use of technology,
project-based learning, and certain aspects of independent learning that were associated with the
student-driven/tech-heavy method of instruction. Additionally, this unit featured aspects of the
traditional instruction method with the use of teacher led-lectures, class readings, and
discussions. Students were given two weeks of class time for the Colonial Carolina unit.

Methodology

Data Collection
The data for my study was collected in multiple ways and featured both quantitative and

qualitative methods. Before beginning the Jamestown unit of my study, both groups (A & B)
took a Quia pre-assessment, which consisted of five multiple-choice questions and was
administered to gauge their knowledge before beginning instruction. Quia is, quite simply,
test-facilitation software that provides teachers with detailed feedback about student
achievement via multiple-choice items. Multiple-choice formats, as noted by Reich (2009), are
not complete, accurate assessments of student knowledge, and as such, extended written-response
items were
added to the post-assessments. During the entire teacher research project process I kept a
research journal, which housed my daily observations and reflections. My observations were
recorded in real time in an attempt to capture classroom occurrences as they happened. My
reflections typically occurred during my planning period, several hours after the fact, which
allowed me time to process the day's events. The reflection process, in particular, proved to be a
very valuable aspect of my study, as it maintained my focus and often provided direction and
clarity to my study (MacLean & Mohr, 1999).
After completing the Jamestown unit, both groups of students were reassessed using the
same multiple-choice quiz that was used in the pre-assessment. The post-assessment included
three extended essay items, which were a reflection of major themes that were addressed as part
of the Jamestown unit. Upon the completion of the Jamestown post-assessment, both groups
then took a post-survey or exit survey that provided them with an opportunity to voice their
thoughts about the just completed unit. The survey questions were mostly open-ended, which
allowed students to elaborate their likes and dislikes of the instruction they received. The exit
surveys for Group A and Group B were slightly different in light of the different instructional
methodologies employed. Some of the survey questions were specifically tailored toward
instructional methods, which differed between groups. Exactly seven days after the completion
of the Jamestown unit, five students from each group were selected and interviewed. The
interviews took place during class, but were held while other students were busy working on
another assignment. The interviewed students were asked questions about the information we
had learned during the Jamestown unit. The five students that were chosen from each group
represented the diversity that exists in my classroom. These interviews took place a week
after the completion of the Jamestown unit in an effort to see how effectively students
retained content knowledge and to determine whether one method of instruction was more effective in
achieving this goal.
For the Colonial Carolina unit of my study, students from both groups received the
hybrid form of instruction. Before beginning instruction students took a Quia pre-assessment,
which included twenty multiple-choice questions. These questions addressed areas I felt to be of
significance for student learning. My observations and reflections for the Colonial Carolina unit
were consistent with those practiced during the Jamestown unit. Upon the completion of the
two-week unit, students took a post-assessment quiz, which was the same as the pre-assessment,
with the addition of three extended essay items. Unlike the Jamestown post-assessment extended
response items, the Colonial Carolina assessment allowed for student choice. The students were
allowed to choose one of the possible three essay items to write about. Both groups of students
had knowledge of the essay items from the previous class, in which I read them aloud and
discussed them with students in the form of a class discussion. This same strategy of preparing
students for the assessment was also utilized in the Jamestown unit of my study. After
completing the post-assessment, the students took an exit survey, which like the first unit's exit
survey, was primarily presented in an open-ended forum, allowing students to freely elaborate
their thoughts concerning the overall learning experience. Unlike the Jamestown exit surveys,
the Colonial Carolina exit surveys were nearly identical for the two classes, consistent with
the same instructional method having been used for both. Approximately ten days
after the completion of the Colonial Carolina unit, student interviews were conducted. The same
ten students that were chosen for my first session of interviews were also chosen for my study's
final interview sessions. The interviewed students were asked questions about Colonial Carolina
in an effort to ascertain student levels of content retention over extended periods of time.

Data Analysis
As I've previously stated, my teacher research project produced significant amounts of
data, including both quantitative and qualitative categories. My research journal, which
contained both my observations and reflections, proved to be a great source of qualitative
inspiration for this study, often providing patterns and "ah-ha" moments of clarity, which have
added substance to my research narrative. My pre-assessments for both instructional units and
groups of students provided me with the foundational information I needed from which to
determine student growth - these are the easiest numbers to compare due to the simplistic nature
of multiple choice analysis and the efficiency of Quia as an assessment tool. The post-
assessments, which were exactly the same as the pre-assessments in terms of multiple-choice
items, provided comparison data for both classes and the instructional methods employed. For
both units' extended-response portions, student answers were graded with strict adherence to an
established rubric, which ensured consistency in judging response quality. Like the comparison
data afforded by Quia and the multiple-choice items, the extended-response rubrics provided
additional comparison data in terms of students' overall achievement.
Using a mixed-methods analysis, the exit surveys from both instructional units were first
separated and sorted based on items being either sources of quantitative or qualitative data. The
quantitative sources, which represented the survey items limited to "Yes," "No," or "Uncertain"
options, were tallied in order to ascertain a majority consensus on certain items. The
qualitative items, which represented the open-ended response survey items, were then analyzed
and compared to the quantitative scores in an attempt to identify any correlations between the
sources of data. The data was then entered into a spreadsheet in order to better visualize
student responses and make a more valid and authoritative generalization of students' collective
voice.
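The two-pass handling of survey data described above can be sketched in a few lines of Python. The records, item name, and comments below are hypothetical placeholders, not the study's actual survey data:

```python
from collections import Counter

# Hypothetical exit-survey records (illustrative only, not the study's data).
# Each record pairs a closed item ("Yes"/"No"/"Uncertain") with an open-ended comment.
responses = [
    {"enjoyed": "Yes", "comment": "I liked not just answering questions out of a book."},
    {"enjoyed": "Yes", "comment": "The project kept me interested."},
    {"enjoyed": "No", "comment": "I needed more direction from the teacher."},
    {"enjoyed": "Uncertain", "comment": "Some parts were confusing."},
]

# Quantitative pass: tally the closed-response item to find the majority view.
tally = Counter(r["enjoyed"] for r in responses)
majority = tally.most_common(1)[0][0]

# Qualitative pass: collect the open-ended comments for thematic reading.
comments = [r["comment"] for r in responses]

print(majority)  # Yes
```

The closed items yield simple counts that can be dropped into a spreadsheet, while the open-ended comments remain available for the qualitative comparison described above.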
The data from the student interview sessions were analyzed using the same methodology
consistent with the exit surveys. Since the same students were interviewed for both interview
sessions, their responses were compared both independently (over the course of two interview
sessions) and collectively (as a group receiving that specific type of instruction). The
information provided by the surveys and interview sessions proved to be an equally important
source of data as compared to the culminating assessments which concluded each unit of study.




Findings

What follows are groups or sets of data and surveys which represent my findings over the
course of my teacher research project. The data has been appropriately labeled with brief
commentary from both my students and myself. The implications and interpretations of the data
collected will be reserved for the concluding discussion portion of my study.

Table 1: Assessment Scores for Jamestown Unit

                                    Pre-Assessment   Post-Assessment   Extended Response
                                    Mean Score       Mean Score        Mean Score
                                    (% out of 100)   (% out of 100)    (1-12 scale, 12 being best)

Group A: Recipients of student-
driven/tech-heavy instruction       52.8%            88.4%             7.3

Group B: Recipients of
traditional instruction             47%              89.8%             6.8

Little variation exists in the multiple-choice post-assessment scores (1.4 percentage points).
However, a closer look at the data reveals that Group B, which received the traditional method
of instruction, outgained Group A by more than seven percentage points in overall gain from
pre- to post-assessment. Also worth mentioning is the half point (on the 12-point
extended-response scale) by which Group A, which received the student-driven/tech-heavy method
of instruction, outscored Group B.
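The gain comparison above can be checked directly against the Table 1 means, for example:

```python
# Overall pre- to post-assessment gains computed from Table 1 (percentage points).
group_a_gain = 88.4 - 52.8  # student-driven/tech-heavy group
group_b_gain = 89.8 - 47.0  # traditional-instruction group

print(round(group_a_gain, 1))                 # 35.6
print(round(group_b_gain, 1))                 # 42.8
print(round(group_b_gain - group_a_gain, 1))  # 7.2 -> "more than seven points"
```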
After concluding this unit of instruction, both groups of students completed an exit
survey. An abbreviated version of both surveys is as follows. The open-ended survey responses
mentioned are a reflection of the type of feedback I received from students.
Group A Survey (Recipients of Student-Driven/Tech-Heavy Method of Instruction):
1. Did you enjoy this way of learning?
Yes: 70 % No: 17% Uncertain: 13%
One student who responded Yes commented: "I liked it because you don't have to sit in class
and only answer questions out of a book."
2. For the next unit would you prefer the teacher to be more involved?
Yes: 50% No: 42% Uncertain: 8%
One student who responded Yes added: "Yes I would. I feel that I needed more direction with
this." Another student responded: "Yes, so I wouldn't get lost as much."
Group B Survey (Recipients of Traditional Method of Instruction):
Based on collective responses from these students, the following hierarchy represents
Group B's favorite activities completed during the Jamestown unit:
1. "Nightmare at Jamestown" DVD followed by class discussion




2. "Pocahontas Revealed" with corresponding questions


3. Two Pocahontas articles (from different viewpoints), use of Venn diagram, followed
by a class discussion
4. Reading an on-line article, followed by a class discussion
5. Reading from the text book, followed by answering comprehension questions
When asked what types of activities students would like to do in the future, 18 students out of the
26 students surveyed from Group B responded specifically that they would like to do a project.

Table 2: Assessment Scores for Colonial Carolina Unit
(Both groups received the hybrid method of instruction for this unit)

           Pre-Assessment   Post-Assessment   Extended Response
           Mean Score       Mean Score        Mean Score
           (% out of 100)   (% out of 100)    (1-6 scale, 6 being best)

Group A    29%              88%               3.9

Group B    28%              91%               4.4

For the Colonial Carolina unit of my study, Groups A and B showed similar progressions on the
multiple-choice portions of the assessment; Group B did have an overall gain of four more
points than Group A. Group B also shows a .5-point higher score on the extended responses -
Group A had five students score a perfect 6, while Group B had six students score a perfect 6.
The first extended-response assessment, from the Jamestown unit, was on a 12-point scale, while
the Colonial Carolina unit's extended response was on a 6-point scale. Multiplying the Colonial
Carolina extended-response scores by 2 therefore gives a better comparison of any growth that
occurred across assessments. It should be noted that the Colonial Carolina unit's
extended-response portion included the element of choice, which allowed students to select
which essay item to answer.
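The doubling described above can be sketched as follows; it reproduces the scaled Colonial Carolina scores reported in Table 3:

```python
# Extended-response means from Tables 1 and 2.
jamestown = {"Group A": 7.3, "Group B": 6.8}       # 12-point scale
colonial_raw = {"Group A": 3.9, "Group B": 4.4}    # 6-point scale

# Double the 6-point scores to put both units on the same 12-point scale.
colonial_scaled = {g: s * 2 for g, s in colonial_raw.items()}

# Growth across assessments, per group, on the common scale.
growth = {g: round(colonial_scaled[g] - jamestown[g], 1) for g in jamestown}

print(colonial_scaled)  # {'Group A': 7.8, 'Group B': 8.8}
print(growth)           # {'Group A': 0.5, 'Group B': 2.0}
```

On the common scale, both groups grew under the hybrid unit, with Group B showing the larger gain.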




Table 3: Comparison of Extended Response Portions of Assessments

           Jamestown Unit   Colonial Carolina Unit
           Mean Score       Mean Score
           (12 pt. scale)   (score x 2)

Group A    7.3              7.8

Group B    6.8              8.8

Upon completion of the second instructional unit, students from both groups took an exit
survey. What follows is an abbreviated version of that survey.
Group A Survey:
1. Did you like me (the instructor) being more involved than the previous unit?
Yes: 76% No: 16% Uncertain: 8%
One student who responded Yes commented: "I liked you being involved more because it was
more organized." Another student responded: "I liked you being involved more because I felt
like I stayed more focused."
2. Do you think you learned more or less with me (as the instructor) being more
involved?
More: 72% Less: 20% Uncertain: 8%
Two students who responded More added the following comments: "It was easier to pay
attention." "I learned more because I stayed on task more and didn't play around as much."
Group B Survey:
1. Did you like me (the instructor) being involved less than the previous unit?
Yes: 46% No: 46% Uncertain: 8%
One student who responded Yes added: "It let us be more responsible and independent."
2. Do you think you learned more or less with me (as the instructor) being less involved?
More: 38% Less: 38% Same: 24%
One student who responded More commented: "I learned better with the teacher not so involved
because I got to see it in a different way than the teacher." One student who responded Less
added: "I learned better with you more involved because you explain it in a language that we
can understand easier."
Student interviews were conducted at the conclusion of each unit of study. The first
interview session took place seven days after completing the Jamestown unit. The second
interview session took place ten days after the completion of the Colonial Carolina unit. The
same ten students were interviewed for both interview sessions.
For the first interview session students were interviewed individually and were asked the
same questions which were featured as the extended response items for the Jamestown post-
assessment. The following chart shows the distribution of correct responses from both groups
along with the extended response items:

Table 4: Correct Responses from First Interview

                              Why did settlers      List some of the        Why did the colony
                              come to Jamestown?    hardships endured by    survive?
                                                    settlers at Jamestown.

Group A: Recipients of        3 out of 5            5 out of 5              1 out of 5
Student-Driven/Tech-Heavy     answered correctly    answered correctly      answered correctly
Instruction

Group B: Recipients of        3 out of 5            5 out of 5              2 out of 5
Traditional Instruction       answered correctly    answered correctly      answered correctly

For the second interview session the students were asked the same item that they chose to
respond to on the extended response portion of the Colonial Carolina post-assessment. Every
student interviewed was able to correctly respond to his or her chosen essay item, displaying
significant knowledge about the topic - some students even provided detailed answers to other
items they were not responsible for. The responses articulated by students for the second
interview session were substantially more detailed compared to the responses received during the
first interview session. For example, one student from Group A, when asked questions #1 and
#3 from the first interview session, responded that she did not know. When asked about the
Cape Fear region of Colonial Carolina and its significance, she responded, "The colonists and the
Lords Proprietors did not get along well. When Governor Burrington sold those blank patents to
the Moore family, the Lords Proprietors had had enough of the colonists. This forced King
George to take over the colony, and it went from being a proprietary colony to a royal
colony." Another great example came from a student in Group B, who, when asked questions #1 and
#3 from the first interview session, responded with partially correct responses, answering
simply "gold" for question #1 and only the word "tobacco" for question #3. When asked about
the Pamlico region of Colonial Carolina and its significance, he responded "Well in Pamlico you
had Cary's Rebellion which was about when people were forced to go to church. And you also
had the Tuscarora War, which was about land; the Tuscarora were mad at the colonists for taking
their land. The colonists won the war and forced the Tuscarora out; they moved to New York
and lived with the Iroquois. Pamlico also had North Carolina's first towns, Bath and New Bern."
These types of detailed responses were commonplace during the second interview sessions.




Table 5: Correct Responses for Student Chosen Extended Response Items

Group A: Recipients of Hybrid Instruction    5 out of 5 answered correctly

Group B: Recipients of Hybrid Instruction    5 out of 5 answered correctly

Discussion

What follows is a list of conclusions that have emerged from my teacher research project
findings. These findings are followed by their implications for my own classroom practice.
1. A hybrid form of instruction may be most effective at reaching my students in terms
of student preferences for learning and learning outcomes.
2. Student choice regarding the selection of essay items may increase writing scores.
3. My students desire structure and active teacher involvement in their learning.
4. Raising student-writing expectations may be an additional factor to increase writing
scores.
After analyzing the effects of the three instructional methods utilized for this study, the
hybrid method proved most successful in terms of student performance and was also preferred by
most students. Both groups showed gains on their writing assessments after having received the
hybrid method compared to the other two instructional methods. This finding will directly affect
my future planning because of the important role writing plays in my class. Since I teach a non-
tested subject, part of my responsibilities, in addition to teaching my curriculum, include
supporting my English colleague by cultivating and encouraging both reading comprehension
and student writing. The realization that student choice of essay items, when coupled with the
hybrid form of instruction, appears to increase students' writing scores will assist me in
designing future writing assignments that include multiple items for students to choose from.
I was completely blown away by the difference choice made on student writing assessment
scores - Group B increased their writing assessment scores by nearly 25% when the choice
option was included! The fact that students overwhelmingly retained more information about
topics in which they had a choice further emphasizes the importance of choice in student
learning. This realization has allowed me to see the importance of focusing my instruction on
the main points of interest and then allowing students to extend their learning by further
examining a content topic of their choosing.
Probably one of the most surprising things to emerge from my study was my students' desire
for structure and active teacher involvement in their learning. This fact became most
evident after analyzing the results of the exit surveys from both units. After completing the first
unit of study 70% of Group A responded that they enjoyed the student-driven/technology-heavy
method of instruction. When asked if they liked me being more involved after having completed
the second unit, an overwhelming 76% responded Yes, and 72% said they felt they learned
more with me taking a more active role in their learning. This desire for me to be more
directly involved in their learning is both comforting and concerning. While I appreciate and am
flattered by their desires for me to be more directly involved, I also feel that as they grow older
they will need to become more independent in their learning. This realization, I feel, just
reinforces the need for a perfect balance in terms of teacher involvement and the use of
technology in the classroom. The use of technology does allow opportunities for independent
learning, but as my study has shown, direct teacher involvement is also required in order to
maximize learning outcomes.
In addition to allowing my students the choice of which essay item to respond to for the
extended writing portion of the Colonial Carolina unit's post-assessment, I also established a
minimum length requirement for their responses - page was the minimum acceptable length. I
was disappointed by their written responses in terms of length and substance from the first
assessment, so I decided a minimum acceptable standard was necessary. By raising my standards
for response length, my students in turn produced more detailed, intelligent responses; many
of their responses surpassed a full written page.
The following is a list of conclusions that emerged from my study that I feel extend beyond
the walls of my own classroom and can apply to any classroom. Following my list of
conclusions are some of my own thoughts on their potential implications for other practitioners.
1. A delicate balance of technology and teacher involvement is needed in the classroom.
2. Involving students in educational decisions such as selecting an essay topic may be an
effective strategy for student learning.
3. A student-centered/tech-heavy method of instruction may carry with it some
hindrances regarding student performance on multiple choice assessments.
After having completed my study, it is my belief that an equal balance of technology and
direct teacher involvement in classroom instruction produces the greatest results. Too much of
one and not enough of the other, I've discovered, can limit a classroom's potential. Students
who receive too much technology and not enough direct instruction often lack discipline,
structure, and direction, while students who receive too much direct instruction and not enough
technology become easily bored, complacent, and uninspired. Finding the right balance will be
up to the individual classroom teacher, but once attained, I feel its implementation will prove
most effective.
Once the right balance of technology and teacher involvement is attained, it will also be
up to the classroom teacher to determine what technology to use and, more specifically, which
Web 2.0 tools to use. Based on the results of my study, it is my belief that technology can
greatly enhance student achievement. It was upon gaining the use of technology, specifically
the web tool Glogster, coupled with the hybrid method of instruction, that Group B raised their
written assessment scores by 25% and also increased their content retention significantly.
After having utilized three different instructional methods for my study, one of the more
interesting realizations to emerge was the possible impact a student-centered/tech-heavy method
of instruction may have on student performance on multiple-choice assessments. Given the
dominant role multiple-choice assessments play in standardized testing, teachers should consider
the possible implications of methodologies lacking direct instruction, such as the
student-centered/tech-heavy method, especially when multiple-choice assessments are the
primary vehicle for measuring student growth and teacher effectiveness.

Closing Thoughts:
I've gained a great deal of valuable experience and knowledge from my teacher research
project. I have discovered how important and effective student feedback can be in guiding
instructional choices. Giving my students a voice during this teacher research project has been
incredibly beneficial and eye-opening. Prior to this process, I had never given much thought to
how students feel about the way they learn. What I've discovered is how closely linked student
preferences are to what they actually learn. As a teacher I have to be open to suggestion and
allow for student choice. But I have also learned that my students, whether they'll readily
admit it or not, do not want all of the decision-making power to lie solely in their hands; in
fact, in this study students expressed a desire for a teacher who is structured, well-planned, and
highly involved in their learning. When I set out to do this teacher research project I was
determined to find out how my students learn best, and to a certain degree I feel I accomplished
that; equally important, I discovered how I best learn about my students. It's through this
teacher research process that I've learned to better listen to my students and give them an
active voice in their learning. In a 21st century learning environment, technology will and
should have a prominent place alongside an inspired, active teacher, one who places great value
on the thoughts, learning preferences, and individual interests of his or her students.




Effective Methods for 21st Century Learning: A Teacher Action Research Project

Aaron Johnson
Virginia Tech

Virginia Tech School of Education, Teaching and Learning
308 War Memorial
Blacksburg, VA 24061
apjohns3@vt.edu

Aaron Johnson is a doctoral student and graduate assistant at Virginia Tech in Blacksburg,
Virginia. He currently supervises pre-service social studies teachers enrolled in the university's
History and Social Science licensure program. Aaron's research interests include citizenship and
global education, specifically identifying and understanding the emerging complexities of
globalization as it relates to traditional conceptualizations of citizenship. Prior to coming to
Virginia Tech, Aaron taught middle grades social studies for six years in rural North Carolina
while earning an M.Ed. from North Carolina State University. He is a past contributor to the
segment "Ask a Master Teacher" featured at Teachinghistory.org, and has given presentations
to various audiences on topics ranging from promoting written literacy in the social studies
classroom to facilitating new literacies in a 1:1 classroom environment.
