Amanda L. Wilson
Abstract
This prospectus presents a thesis proposal toward completion of the requirements of an
EdS in Higher Education at Appalachian State University. It presents the case for an action
research study of assessment in a beginning Spanish 1 course. This paper presents a review of literature emphasizing a culture shift in
education from teaching to student learning and the way assessment and accreditation procedures
are playing a part in this shift. Student perception is noted as being less of a focus in published
literature and one of the reasons for this proposed study. The proposed purpose of this study
would be to explore the reflections of students in a beginning level Spanish class towards various
forms of traditional and authentic assessment tools. The context, participants, research plan, and
plan for evaluation are explicated in great detail. A timeline is provided for the study that would
have it begin in January of 2011 and end in April of the same year. Possible methods of showing
validity are discussed, along with issues of the ethics and subjectivity of the study. A lengthy
bibliography is provided, followed by appendices that concisely provide the research tools.
There has been a shift in higher education from teaching objectives to student learning
outcomes. Much of this drive has precipitated from a shift in focus by regional accreditors of
schools from how many resources a school has to educational effectiveness. The focus on
student learning calls for a culture of assessment across the institution as a whole that strives to engage the entire academic community. Authentic
assessment measures, those methods of assessment that not only measure achievement as an end
product but are learning experiences as well, are becoming more common, as supplements to
more traditional assessment measures, to provide valuable data needed to inform this culture of
change and improvement. The outcomes assessment movement in foreign language classes has
embraced these authentic assessment measures, mostly in the form of portfolio-style projects for
students. Unfortunately, there is less research on how students are receiving these various forms
of assessment and what they perceive as the benefits or drawbacks of each. The purpose of this
study is to explore the reflections of students in a beginning level Spanish class towards various
forms of traditional and authentic assessment tools. The participating students will come from my beginning Spanish 1 classes. This will be an action research study that will build out of
Classroom Research, as developed through the work of Dr. Patricia Cross. Cross defines
Classroom Research as “ongoing and cumulative intellectual inquiry by classroom teachers into
the nature of teaching and learning in their own classrooms” (1996, p. 2). In her text on action
research, Hendricks states that “the purpose of action research is for practitioners to investigate
HOW STUDENTS PERCEIVE THEIR LEARNING 4
and improve their practice” (2009, p. 3). These two perspectives on research as investigation by
teachers into their practice for the purpose of improved student learning allow me to approach
the issue of student perspectives on assessment in a very localized but very powerful way. By
seeking out the opinions of my own students on the specific assessment tools I have used in their
classes, we will have a concrete context in which we can explore their insights on how they learn
and the way assessment affects their learning. These reflections can then directly impact my
future teaching practice. Cross characterizes Classroom Research as learner-centered, teacher-directed, collaborative, context-specific, scholarly, practical and relevant, and continual (1996, p. 2). My study will focus on my students’ perspectives of learning. It will of
course be facilitated and directed by me, as their former teacher. It is collaborative in the sense
that my students will help direct my research, conclusions, and future behaviors based on their
feedback. This study is context-specific because it focuses on how these assessments are applied
in, not just a foreign language class, but specifically a beginning Spanish class. The project is
scholarly because it builds off ideas and insights researched and presented in the literature review
below. My study is practical and relevant because I will use the data I gather to form conclusions
that will inform my future practice. Finally, my study is continual in the sense that it will serve
as a foundation for future work on student reflections and working with different types of
assessment tools.
Cross comments that “a Classroom Research project is not a one-shot effort that is
completed, published, and assumed to contribute one more brick to building the wall of truth”
(1996, p. 12). This study does not aim to predict how all students will view and appreciate
various assessment techniques. Nor should the conclusions drawn be assumed to show any one
truth. This will be an exploration of a select few students taking a specific course taught a certain
way by me at one given time. The results of this study would most assuredly be different if the
study were duplicated. The purpose of this study is to explore the reflections of specific students
in a specific beginning level Spanish class towards various forms of traditional and authentic
assessment tools. I hope that by gathering and reflecting on their various opinions, I can make an
informed decision about how to make the tools I use more effective for the learning experiences
of my future students.
The process is one of trial and error, and any study will give specific information that may
be difficult to generalize to other populations. Cross says that “Classroom Research is based on
the premise that generalizations across classrooms are, at best, tentative hypotheses, to be tested
within the specific context of a given classroom” (1996, p. 12). This study aims to create another
piece of the puzzle. By collecting student reflections on assessment, it aims to give a tiny insight
into how some students perceive these tools. The hope is that those insights, while very context-
specific, might inspire me, and hopefully others, to continue asking the questions that must be
asked in the classroom: How do students learn? How can teachers facilitate student learning?
What tools can educators employ to create the most effective learning environment possible?
Because at the very foundation of what it is to be an educator must be the hope, desire, and drive to facilitate student learning.
Fundamental Assumptions
I hold a few fundamental assumptions going into this study. The first assumption that has
always rung true for me is that students are experts at being students. I teach university-level
beginning Spanish 1 and my students are primarily freshmen and sophomores, the very youngest
being 17 years old. Said another way, every student in my class has been going to school for at
least 12 years. While few have any knowledge of pedagogy, they have some awareness of how
they learn. I believe that by asking students to reflect on the way they learn and what assessment
tools have benefitted their learning experience in my beginning Spanish 1 courses, I can learn to
see my classroom from their perspective and improve the way I approach and plan future classes.
Students are not teachers and may not always see what a teacher does or understand why they do
it, but by gathering data from these learning experts, I can better understand how to facilitate
their learning.
The second assumption that I hold going into this study is that good assessment is not
simply a measure of student accomplishment but a method to engage and promote learning. I
chose to focus on assessment tools in this study because I worry that the goals of my course may
not align well with the ways I assess learners. I see this as a major worry across various institutions
of learning, one voiced by various colleagues and students. It is too common to hear negative
anecdotes about the business of education. These pessimistic views do not truly represent the
educational system but there is a grain of truth in every story. I chose to pursue this study
because I want to find the places where my assessment practices are aligning with my course
goals and continue to pursue those avenues while discovering weaknesses in my practices that I
can improve upon. I believe that by focusing on studying my current assessment methods, I can
determine where students are simply going through the motions of assessment to demonstrate
accomplishment and where they are really being engaged by these tools. That is not to say that I
believe accomplishment and engagement are mutually exclusive, simply that I want to make sure
I am working to promote learning. If student learning becomes more effective through the use of
the tools, I hope that demonstration of accomplishment and levels of engagement will also
benefit.
By grounding this study in the principles of Classroom Research, and focusing on my specific students and assessment tools, I believe I will gain
valuable insight into student perspectives on various forms of traditional and authentic
assessment tools and lay a foundation for me to continue improving my teaching practice. The
most important objective is to learn to be the most effective facilitator of student learning that I
can be.
Literature Review
As I reviewed the literature on assessment, accreditation, and student perceptions, a shifting picture began to emerge. The educational system, due to many
internal and external factors, is shifting its focus from teaching to learning, from teacher to
student, and from product to process. Instead of starting educational planning with the discipline
being taught, planning is beginning with thinking about what educators want students to learn.
A culture of assessment, reflection, and improvement is being disseminated and accepted all over, much of which is
happening via the regional accrediting agencies. This is sparking collaboration amongst the
entire academic community. Alternative methods of assessing teaching and learning are being
used, and the focus is settling on student learning and continual assessment and improvement.
But what do students think? Is this shifting focus getting to them and making a difference? There
is much less written about this. The review below will highlight this shift and a bit of the force
behind it in the guise of the accreditors. It will touch on the purpose of all this assessment and the
way it is helping to bring some of the various campus populations together to collaborate. It will
explore some of the ways alternative assessments are offering helpful options and the way the
shift to student learning outcomes, specifically in foreign languages, is playing out. To round it
out, I will bring in the little that I was able to find concerning students’ perspective on this shift.
A culture change is occurring in higher education that shifts the primary focus of
educators from teaching the subject matter of specific disciplines to a perspective of student
learning (Allen 2004, p. 1). “As departmental, organizational, and institutional cultures undergo
change, and as the focus of that change is less on teaching and more on learning, a commitment
to sustainable outcomes assessment becomes essential” (Hernon, et al. 2006, p. 1). According to
Allen, this type of assessment occurs when “empirical data on student learning is used to refine
programs and improve student learning” (2004, p. 2). The focus here shifts to how
effective programs are at facilitating student learning, instead of a more traditional
perspective on assessment that focuses solely on whether or not the student is showing
achievement. Palmer, for one, describes assessment as a “tool for enhancing teaching and learning” (2004, p. 194). He goes on, a few pages later, to add
that “Continuous assessment starting early in the semester has the benefit of quickly identifying
those students falling behind and perhaps at risk of dropping out, so remedial action can be
taken” (p. 198). With this focus, assessment becomes more formative than summative.
Allen contrasts classroom assessment, which “examines learning in the day-to-day” classroom, with assessment of learning across the broader “curriculum” (2004, p. 1). In their 2001 work, Ratcliff, et al., point out that “the continuous
improvement cycle should begin with clear departmental goals that identify what a student can
expect to gain as a result of studying a particular field or discipline” (p. 25). Hernon, et al. add
that “programs and institutions need to develop a strong and sustainable commitment to
assessment as a process and as a means to improve learning based on explicit student learning
outcomes” (2006, p. 11). Ratcliff, et al., further point out that “while a college’s or university’s
general goals for student achievement can be measured at the university level,” individual
“programs must contribute to the self-study by assessing their students’ learning” (2001, p. 32).
This process is not, at least at its core, one of displaying big numbers with no real meaning or
impact. Departments must assess student learning so that not only are they able to show
achievement but they are also able to gather necessary information for improvement of student learning.
One of the major external forces behind this shift is accreditation. As this culture shift continues to push the focus towards student learning, accreditors are increasingly examining the educational effectiveness
of programs and schools (Ratcliff, et al. 2001, p. 13). The regional accreditors then become one
of the primary exterior forces helping to drive this shift of focus to student learning outcomes.
Allen explains that accreditation reviews “focus on two major issues: capacity and effectiveness.” She goes on to explain that capacity is
the bean counting of the process: when tallies are taken of the resources any institution has to
support its students such as libraries, technology, physical space, and student support services.
The focus, however, has really turned more towards a long-term commitment to improving
student learning (Hernon, et al. 2006, p. 1). Allen also says that accrediting organizations “expect
campuses to document their impact on student learning” (2004, p. 18) and that “when accrediting
bodies require assessment, campuses pay attention” (2004, p. 2). She cautions, however, that assessment should not be undertaken merely because an “external agency requires it” (Allen 2004, p. 2). The entire purpose behind this drive toward assessment is improved student learning.
On pages 163 and 164, Allen imparts some “friendly suggestions,” one of which is to
“close the loop,” stating that “good assessment has impact” (2004). The foundation of all this
assessment is that it will drive change towards the continual improvement of the quality of the
educational system. As Ratcliff, et al., put it “Assessment and accreditation are both premised on
the importance of quality assurance” (2001, p. 17). If the data collected through the assessment
procedures is not analyzed for methods of improvement and if those methods are not
implemented, the whole process is invalidated. Assessment is only the beginning of the cycle.
The academic community must work together to be sure that the data collected is used
effectively.
To move toward these lofty goals, faculty, institutional research offices, and “everyone in
the educational enterprise, has [the] responsibility for maintaining and improving the quality of
services and programs” (Ratcliff, et al. 2001, p. 17). Assessment of student learning outcomes
“includes all members of the [educational] community as they strive to contribute to and enhance
the educational enterprise” (Ratcliff, et al. 2001, p. 17). The various smaller communities within
colleges and universities have to coordinate their efforts and pool their resources to make sure
the assessment process is as productive and effective as possible. One type of tool that is
becoming more popular amongst these communities is the authentic assessment tool. More
formative in nature than traditional assessment tools, these tools are adding more and richer data to the
assessment process.
To understand what tools will work best for these assessments, it is best to start with an
idea of what assessment should aim to do. On page 47 of their 1999 text, Brown, et al., expound six functions of assessment, including “capturing student time and attention,” “generating appropriate student learning activity,” “helping students to internalize the discipline’s standards and notions of quality,” marking (generating marks or grades which distinguish between students), and quality assurance (providing evidence for others outside the course).
With these purposes in mind, assessment methods can be examined to determine their validity.
On pages 62 and 63, the authors discuss traditional unseen written exams and how they function
as assessments: “In particular, this assessment format seems to be at odds with the most
important factors underpinning successful learning…there is cause for concern that traditional
unseen written exams do not really measure the learning outcomes which are the intended
purposes of higher education” (Brown, et al. 1999). Palmer echoes these concerns on page 194 of
his 2004 paper on authenticity in assessment, stating that “traditional forms of assessment can
encourage surface learning rather than deep learning.” Banta goes a bit further, with her colorful
simile to discourage purchasing more traditional assessment measures for the purposes of
improving student learning: “Just as weighing a pig will not make it fatter, spending millions to
test college students is not likely to help them learn more” (2007, p. 2). Seeing some deficiencies
in traditional forms of assessment, experts start working on describing what might improve
attempts at assessment.
Watson points out “a need for more authentic, learner-friendly methods to encourage
[student] engagement” (2008, p. 1) which seems to align with Brown, et al.’s first and second
functions of assessment listed above of “capturing student time and attention” and “generating
appropriate student learning activity” (1999, p. 47). Watson goes on to point out, a bit further on, concerns about testing being a questionable
method of improving learning, about testing promoting surface, rather than deeper, learning, and
about testing not being aligned with learning objectives, echoing the concerns quoted above from Banta, Palmer, and Brown,
et al., respectively. Banta also notes that “authentic and valid assessment approaches must be” adopted in place of measures of
learning “that do not capture the difficult and demanding intellectual skills that are the true aim of
a college education” (2009, p. 3). She continues on page 4 that the point “is knowledge creation,
not knowledge reproduction” (Banta 2009, p. 4). Her concerns about assessment draw a picture
of a system that needs improvement and, as Brown, et al. also refer to, alignment with learning,
not just an attempt to demonstrate that learning has or has not occurred.
Brown, et al., bring it together nicely when they state that “ultimately, assessment should
be for students…[as] a formative part of their learning experience” and that students who
develop their test-taking skills also “tend to succeed in assessment” most, regardless of whether
or not they are the most qualified in their field (1999, p. 58). In other words, their first two
functions of assessment, “capturing student time and attention” and “generating appropriate
student learning activity,” along with the fourth and sixth, “helping students to internalize the
discipline’s standards and notions of quality” and providing for “quality assurance,” are just as
important as the fifth, “marking” (Brown, et al. 1999, p. 47). Marking tends to get all
the attention, yet it is the function most prone to being partially invalid in the case of many
traditional assessment measures, whereas authentic assessment puts the focus on student learning
outcomes. Because the focus becomes the experience of learning, rather than grading, the onus
shifts toward ensuring that assessments actually measure what they intend to measure, thereby facilitating an increase in the validity of the assessments (Brown, et al.
1999, p. 59).
It is important to note that the point is not to throw away traditional assessment measures.
They can still serve some of the functions of assessment well. As Ratcliff, et al., state on page 28
of their 2001 text, “formative and summative assessment methodologies provide the department
or program with evidence of their students’ learning.” While traditional summative assessments
can, and should, support the functions of assessment processes, their results cannot stand alone to
inform the process of continual improvement of learning (Ratcliff, et al. 2001, p. 28). To really
get at that sixth purpose of “quality assurance,” a balance is needed (Brown, et al. 1999, p. 47).
Authentic assessment measures, matched with more traditional assessment measures, add the
necessary formative piece to the assessment puzzle and provide the necessary data for improved
student learning.
This imbalance, and this lack of alignment between assessment and learning, can be seen in the
field of foreign languages as well. Trends over the last couple of decades in foreign language
education have moved toward communicative goals, yet tests in foreign language courses “typically are pen and paper exercises that single out discrete points of grammar
or vocabulary” (Higgs 1987, p. 1). Higgs raised the warning almost twenty-five years ago that if
foreign language educators really want to set communication as a goal for their students, then,
“assessment procedures must test for communicative function” (1987, p. 1). While these pen and
paper exams are still commonplace, language classes have also seen an influx of authentic
assessment measures.
Most of these assessments have come in the form of portfolio-style projects. Banta
commented in her 2007 article on assessment that portfolio assessments would be the most
authentic because students develop the content themselves (p. 4). Studies of English as a Foreign
Language (EFL) learners have found that portfolio-style assessments have contributed to student
learning, especially when combined with other assessment measures, and that portfolio
assessments help students take ownership of their learning (Barootchi, et al. 2002; Caner
2010). Additionally, one study found that some EFL students in writing courses preferred the
portfolio assessments over more traditional assessments (Caner 2010, p. 1), though research into student perceptions of these assessments remains scarce.
There are many styles of portfolio assessments depending upon the specific needs of the
assessment, but the seemingly most popular version in language learning is the self-assessment
portfolio. The European Language Portfolio (ELP) was the model for the American adaptations:
LinguaFolio and the Global Language Portfolio (Cummings, et al. 2009, p. 1). These portfolios
center on student self-assessment (Cummings, et al. 2009, p. 1). Moeller points toward these self-assessment models as more valid forms of assessment than
more traditional assessment models (Moeller 2010 Self-assessment in the foreign language
classroom). She describes LinguaFolio, in particular, as, “a portfolio that focuses on student self-
assessment, goal setting and collection of evidence of language achievement” (Moeller 2010
LinguaFolio). The students set their language goals and determine when their goals are met
based on the evidence that they collect of their own work (Fasciano 2010, slide 9). Moeller
points out that if language educators are using LinguaFolio effectively, it will necessitate moving
away from teacher-centered methodologies and toward learning-centered outcomes because it is,
by its very definition, a learner-centered self-assessment tool that facilitates the processes of goal
setting and self-reflection and establishes intrinsic motivation in students (Moeller 2010
LinguaFolio). Brown, et al., also mention that a major advantage of these types of assessments is
that they promote intrinsic motivation through personal involvement because the student is
personally invested in the work being assessed.
Self-assessment portfolios are not the ultimate answer to the issues that traditional
assessment has raised. Student self-assessment comes with its own set of new issues. Moeller
points out three main disadvantages to self-assessment in her paper, Self-assessment in the
foreign language classroom: it can be unreliable because students are not experts on assessment,
students can cheat, and few students engage in it (2010, p. 3). Still, these types of portfolios can
add a valuable formative dimension to the assessment process.
It is essential to remember that there is no one ultimate assessment, but by using various
methods and blending authentic and traditional assessment measures, the necessary data can be
collected to inform change and promote student learning. The last piece of the puzzle is to
involve the most important part of the academic community in the discussion: the students.
While the focus has apparently shifted from teachers to students, the assessment process
is still very much a top-down one. “At present, students often feel that they are excluded from
the assessment culture, and that they have to use trial and error to make successive
approximations towards the performances that are being sought in assessed work” (Brown, et al.
1999, p. 58). Because of the reflections gathered from their students, Brown, et al. encourage
“innovation in assessment” (1999, p. 81). They also reported that, to some students, conventional
assessment seems disconnected from their learning (Brown, et al. 1999, p. 81). Instead they found that “students appreciate assessment
tasks which help them to develop knowledge, skills and abilities which they can…use in other
contexts” and they encourage “assessment which incorporates elements of choice” because “it
can give students a greater sense of ownership and personal involvement in the work and avoid
the demotivating perception that they are simply going through routine tasks” (Brown, et al.
1999, p. 81). From the little bit from Brown, et al., here (1999) and the bit from Caner on EFL
students preferring portfolio assessments to traditional ones (2010, p. 1), it seems that students
prefer having some form of authentic assessment so that they can be involved in the process. Unfortunately,
due to the lack of research in this area, anything more would be merely idle conjecture.
The students are the untapped resource here. Their reflections could provide valuable
information on the assessment process. To that end, the purpose of this study is to gather some of
those reflections towards various forms of traditional and authentic assessment tools.
Research Questions
As previously stated, the purpose of this action research study is to explore the reflections
of students in a beginning level Spanish class towards various forms of traditional and authentic
assessment tools. To this end, I plan to focus my research on the overarching question: What are
the perceptions of undergraduate students related to traditional and authentic assessments used in
an introductory Spanish course? Both the interviews and surveys, each of which will make up the
methods of data collection in this study and will be described in detail in the methodology
section below, will attempt to solicit information related to this question. To support this
overarching question, I will pursue the following sub-questions:
1. What do students think are the benefits or limitations of each type of assessment on their learning? Do students think these assessments reflect their learning?
2. How do students feel that these assessments can enhance or detract from their
learning experience?
3. What factors do students feel affect the impact of each type of assessment on their
learning?
4. What preferences do students express toward each type of assessment? What are their reasons for these preferences?
5. What recommendations do students have for enhancing their perceived effectiveness of each type of assessment?
The first set of sub-questions: “What do students think are the benefits or limitations of each type
of assessment on their learning?” and “Do students think these assessments reflect their
learning?” aim to organize the reflections of students on what is working and what is not in
relation to the effect, if any, these assessments have on their learning, as well as whether or not
they believe each type of assessment is able to capture and demonstrate what they are learning.
There are both interview and survey questions that will attempt to get students to reflect on these issues.
The second sub-question: “How do students feel that these assessments can enhance or
detract from their learning experience?” is meant to get at how, and whether, students perceive
these assessments as having any effect on their entire process of learning. Since Brown, et al.,
listed one of the functions of assessment as “generating appropriate student learning activity”
(1999, p. 47), it seemed important to ask students to reflect on whether they consider these
assessments part of their learning. Since this question is asking only for open-ended responses, it
will only be addressed in the interview questions, not those of the survey.
The third sub-question: “What factors do students feel affect the impact of each type of
assessment on their learning?” will focus the data gathered from student reflections on how the
specific elements of each type of assessment changed or influenced their learning. Like the
previous one, this sub-question calls for open-ended feedback that will most likely come almost exclusively from the interviews.
The fourth sub-question set: “What preferences do students express toward each type of
assessment?” and “What are their reasons for these preferences?” will explore where student
preference falls among the types of assessments and their reasons for those preferences. This will
help to show if it is the type of assessment that is preferred by the students or only specific
aspects of the specific assessments the students have been exposed to that they prefer. Some of
this data will come from the survey but the majority of it will be collected via the interviews.
The final sub-question: “What recommendations do students have for enhancing their
perceived effectiveness of each type of assessment?” aims to have students hypothesize about
ways to make each type of assessment more useful to them. This question will be almost
exclusively answered during the interviews but there is also the possibility that some of this
information may be volunteered at the end of the survey when students are asked for any
additional comments.
Figure 1.1 below summarizes which data collection methods, described in the
methodology section of this work, will most likely address each research question listed above.
Figure 1.1
Research Questions and Related Data Collection Methods Matrix
Overarching question: What are the perceptions of undergraduate students related to traditional and authentic assessments used in an introductory Spanish course? (Interview questions; survey statements)
Sub-Q 1: What do students think are the benefits or limitations of each type of assessment on their learning? Do students think these assessments reflect their learning? (Interview questions; survey statements)
Sub-Q 2: How do students feel that these assessments can enhance or detract from their learning experience? (Interview questions only)
Sub-Q 3: What factors do students feel affect the impact of each type of assessment on their learning? (Interview questions only)
Sub-Q 4: What preferences do students express toward each type of assessment? What are their reasons for these preferences? (Interview questions; survey statements)
Sub-Q 5: What recommendations do students have for enhancing their perceived effectiveness of each type of assessment? (Interview questions only)
Methodology
Context/Setting
I will conduct my research by gathering data from former students of my fall 2010
beginning Spanish college courses. I teach at a public, state-funded institution, and although we
are moving towards being a more research-focused institution, the current focus is aligned more
with teaching. I am fortunate to have a great deal of freedom and control in my classroom. While
it is true that my general curriculum and my textbook are mandated by the tenured faculty of my
department, I am free to choose whatever path I believe will best help my students achieve the
course goals. That is to say that while I am not free to choose what I teach, I am free to
determine how to facilitate student learning in my classes. While I also receive feedback from
peers once a year, I feel no other demands from any supervisors on my teaching methods. This
allows me to constantly experiment with ways to improve how I teach my classes. I am currently
in my sixth semester teaching these courses, and I can say with certainty that no two semesters
have held very much in common outside of my general teaching philosophy. I am constantly
trying to improve my methods based on what I have perceived as being effective. This hands-off
situation created by the administration allows me to be fluid in my methods. While the freedom
to teach the way I feel is best has many advantages, it also carries heavy responsibility; I have to
rely on my perceptions of my students’ learning with little feedback from anyone else.
The classes I teach are capped at twenty-eight students. This is a moderate number of
students for a beginning foreign language class. While it would be ideal to have a smaller
number because it would allow for more individualized attention, there are advantages to this
class size as well. With this many students it is easier to employ group learning strategies,
allowing students to facilitate their own, as well as each other’s, learning processes. These large
classes make it easy to use traditional assessment measures because they are easy to administer
and assess, even in so large a group. Authentic assessments, like portfolio-style projects, are
more challenging to administer and evaluate because they take longer to facilitate, collect, and
assess but can provide richer, more detailed feedback to students. During the semester that my
students’ reflections will be based on, they were exposed to various assessment measures, both
traditional and authentic; therefore, these students will have a concrete context on which to base
their reflections.
Participants
For this project, there will be two primary participant groups. First, I will invite the
seventy-nine students who are taking my beginning Spanish 1 courses in the fall semester of
2010 to participate in a broad attitude survey (see Appendix B). These students run the gamut in class rank, from freshmen to seniors; in age, from seventeen to over thirty; and in educational experiences and majors. I hope to see at least forty of these students respond to the survey.
The second group of students I plan to solicit for this study will ideally be a group of
nine. I would like to ask three students from each of the three sections to participate in individual
interviews. I will choose these participants based on their performance levels in class. Ideally, I
will find one high-, one mid-, and one low-performing student to ensure a broader range of
perspectives on the assessment measures. I will use a semi-structured interview guide (see Appendix A).
Research Plan
I will complete my research by two methods: individual interviews and a broad attitude
survey. First, I will employ a semi-structured interview method to ask the nine students,
described above, to explore their reflections on various forms of traditional and authentic
assessment tools, which they have encountered in their previous semester of beginning level
Spanish 1. I will use the semi-structured interview guide (see Appendix A) to solicit their
opinions on these assessments. I will record the interviews digitally, after having each student
sign an informed consent form (see Appendix C). When I begin analyzing their reflections, I
will create anonymity for my students by giving each student a pseudonym, known only to me. Second, I will invite all seventy-nine of these students to complete the broad attitude survey (see Appendix B), which will be housed online. This survey will ask these
former students to comment on their engagement and motivation levels as well as the perceived
effectiveness on their learning experiences with the various assessment tools used throughout the
course. I hope at least thirty-two of the surveyed students will respond, which would be an approximate response rate of 40%. Based on previous experience with polling students, this
seems to be a realistic goal. I plan to send out an invitation to these students in early January,
after grades have posted for the semester, asking them to complete the survey by February 1,
2011. On January 30, 2011, I plan to send another email reminder asking students to complete
the survey. If my results are still under 50% participation, I will send a final email request during
the second week of February 2011. This survey will be completely anonymous because it will collect no identifying information.
The data collected from the individual interviews and online survey will be coded and
evaluated in light of the research questions they are intended to answer. I will use the
following color scheme, as illustrated in the figures below, to code information related to each of
the respective research questions: the Overarching question will be yellow, the Sub-Q 1 will be
green, the Sub Q-2 will be teal, the Sub Q-3 will be pink, the Sub-Q 4 will be blue, and the Sub-
Q 5 will be red. This color coding will allow me to find and separate out comments and
information that will enable me to reflect on each of my research questions. Figure 1.2
demonstrates the relationship between each set of questions posed in the semi-structured interview guide (see Appendix A) and the related research questions. The opening question sets are designed to help students become comfortable with the conversation and begin to reflect on their learning. To begin, I will ask students to tell me about their first language learning experiences, why they want to study
Spanish, and what their goals are or what they want to do with language. This will provide a
context to their comments to allow for a deeper understanding of where they are coming from as
students and what they want from learning before delving into how they learn. The second set of
interview questions will then ask students to reflect on the way they learn and what tools and
methods they use. This will help to set up an understanding of how they view learning and what works best for them.
The third interview question brings the focus of the conversation from learning in general
to the specific context of my beginning Spanish 1 course that they will have just completed. This
question may include reflections related to the overarching question of my research, depending
on what the student decides to focus on in answering. I do not want to lead them into any
specifics with this question, just get a general sense of their learning experience in that course to
help determine how to proceed with the next few, more specific questions about the particular
assessments.
Interview question sets four and five will ask students to reflect specifically on the two
particular examples of traditional assessments used in the course: the four chapter tests and the
cumulative final exam. To help students reflect on these two tools, I will at this point provide
them with two index cards to hold, point to, or just look at as they think. One card will read
“chapter test” and the other will say “final exam.” These cards will hopefully help them focus
their reflections on traditional assessment measures, but I will at points also ask them about similar assessment measures in other contexts for comparisons and further depth of reflection. It is my hope that data collected from these two sets of questions will help to answer my overarching research question while also providing insight on sub-questions one, three, and four, related to benefits and limitations of
these assessments, factors they feel affect the impact of these assessments on their learning, and
their preferences for these assessments, respectively. During data analysis, comments specific to
each question will be color coded as indicated above and in Figure 1.2 to help sort this data for
reflection.
Question sets six and seven, in contrast to four and five described above, will ask
students to reflect specifically on the two particular examples of authentic assessments used in
the course: the culture blog and the eLinguaFolio self-assessment tool. Just as I did with the
previous question sets, I will at this point provide students with two additional index cards to
hold, point to, or just look at as they think. One card will read “culture blog” and the other will
say “eLinguaFolio self-assessment.” These cards will offer students tangible focal points for
their reflections on authentic assessment measures but, just as previously, I will at points also ask
them about similar assessment measures in other contexts for comparisons. The data collected
from these two sets of questions will help to answer my overarching research question as well as provide insight on sub-questions one, three, and four, related to benefits and limitations of these assessments, factors they feel
affect the impact of these assessments on their learning, and their preferences for these
assessments, respectively. During data analysis, comments specific to each question will be color
coded as indicated previously and in Figure 1.2 to help sort this data for reflection.
To sum up this part of the interview, if students have not already sufficiently covered this
topic, I will ask questions from sub-set eight, giving the student all four of the assessment name
cards: chapter tests, final exam, culture blog, and eLinguaFolio self-assessment, to hold and look
at while sorting their thoughts on comparing each method to the others. This question set will
provide more detail on the same research questions mentioned above: the overarching question,
plus sub-questions one, three, and four, and the data will be coded in the same manner.
Question set nine from the interview guide asks students to reflect on how completing
these various assessments affected their learning experience. This set aims to get reflections on
research sub-question two related to how students feel these assessments enhance or detract from
their learning experience. Information and reflections that pertain to this research question will be color coded in teal for marking and analysis.
Finally, interview question set 10 will intentionally ask for student recommendations for
improvement on any of these assessments or any additional comments on testing in this course.
This set of questions aims to answer research sub-question five, which deals directly with student
recommendations for enhancing the effectiveness of each type of assessment. The information
related to this research question will be color coded in red for marking and analysis.
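To summarize the mapping just described, the following Python sketch pairs each interview question set with the research questions it is expected to inform. The mapping is illustrative only: the assignment of set 4 to the chapter tests and set 5 to the final exam (and likewise for sets 6 and 7) is my assumption about the ordering, and the function name is hypothetical.

```python
# Hypothetical summary of which research questions each interview
# question set (sets 4-10) is expected to inform, per the prose above.
SET_TARGETS = {
    4:  ["Overarching", "Sub-Q 1", "Sub-Q 3", "Sub-Q 4"],  # chapter tests (assumed order)
    5:  ["Overarching", "Sub-Q 1", "Sub-Q 3", "Sub-Q 4"],  # final exam (assumed order)
    6:  ["Overarching", "Sub-Q 1", "Sub-Q 3", "Sub-Q 4"],  # culture blog (assumed order)
    7:  ["Overarching", "Sub-Q 1", "Sub-Q 3", "Sub-Q 4"],  # eLinguaFolio (assumed order)
    8:  ["Overarching", "Sub-Q 1", "Sub-Q 3", "Sub-Q 4"],  # comparison of all four tools
    9:  ["Sub-Q 2"],                                        # effects on learning experience
    10: ["Sub-Q 5"],                                        # student recommendations
}

def sets_informing(question):
    """Return the question sets expected to inform a given research question."""
    return [s for s, targets in SET_TARGETS.items() if question in targets]

print(sets_informing("Sub-Q 5"))  # [10]
```

Inverting the mapping this way makes it easy to confirm, during analysis, that every research question is covered by at least one question set.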
Figure 1.3 demonstrates the relationship between each set of questions or statements
posed in the broad attitude survey (see Appendix B) and the related research questions. The same
color coding scheme will be employed to facilitate the organization of this data: the Overarching
question will be yellow, the Sub-Q 1 will be green, the Sub Q-2 will be teal, the Sub Q-3 will be
pink, the Sub-Q 4 will be blue, and the Sub-Q 5 will be red.
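As an illustration of how this scheme could be applied during analysis, the sketch below (the dictionary and helper are my own illustrative names, not part of the study's instruments) maps each research question to its highlight color and tags a transcribed comment accordingly.

```python
# Sketch of the color-coding scheme used to sort interview and survey
# comments by the research question they address.
CODING_SCHEME = {
    "Overarching": "yellow",
    "Sub-Q 1": "green",
    "Sub-Q 2": "teal",
    "Sub-Q 3": "pink",
    "Sub-Q 4": "blue",
    "Sub-Q 5": "red",
}

def tag_comment(question, comment):
    """Attach the highlight color for the given research question to a comment."""
    return {"question": question, "color": CODING_SCHEME[question], "comment": comment}

# Example: a comment about assessment preferences falls under Sub-Q 4.
tagged = tag_comment("Sub-Q 4", "I preferred the culture blog to the exams.")
print(tagged["color"])  # blue
```

Storing the scheme in one place like this mirrors the intent of the color key: every coded comment can later be filtered by research question.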
Figure 1.3 groups the survey items by category (effectiveness, fairness, additional comments) and maps each to a research question. For example:
Statement 12, "Please leave any additional comments here. Recommendations for improvement are welcome and appreciated," relates to the Overarching Question: What are the perceptions of undergraduate students related to traditional and authentic assessments used in an introductory Spanish course?
Statement 13, "I understand that by completing this anonymous survey I am consenting to allow all information provided to be used as part of an action research study for a thesis," serves as informed consent: the participant must agree to this statement in order to submit the survey.
The survey statements above are intended to create a larger overview of student opinion by
asking the entire population of the three sections of this course from the fall semester of 2010 to
choose the best description of their opinions on assessment from an attitude scale ranging from
strongly disagree to strongly agree. One final area will also be provided for any comments participants wish to share.
The first set of statements is organized by the students’ perceived motivation to complete
the assessments. The second set of statements is organized by the students’ perceived
effectiveness of the assessment tools. These statements will be sequenced as the first four odd
and the first four even-numbered questions, respectively. The reason behind sequencing one motivation-directed item followed by one effectiveness-directed item is to allow students to reflect on the perceived motivation and effectiveness of each assessment tool discretely.
Therefore, statement one is directed toward perceived motivation and statement two is directed
toward perceived effectiveness but they are both concerned with the first traditional assessment
tool, the chapter tests. This is the pattern for the first eight statements. This means that statements
one, three, five, and seven will focus on perceived motivation of the four assessment tools, while
two, four, six, and eight will focus on perceived effectiveness of the four assessment tools. All
eight of these statements are aimed at providing reflections on research sub-question four: What preferences do students express toward each type of assessment, and what are their reasons for these preferences?
Statements nine, ten, and eleven aim to gather data on perceived fairness to partially answer research sub-question one concerning whether or not students feel that these assessments reflect their learning. Statement twelve asks for student recommendations, and participants will be provided an open textbox in which to provide as much or as
little data as they wish. This is the only statement on the survey that participants may elect to not
answer entirely. However, the first eleven statements do have an option they may choose for
neutral/no opinion. This statement could potentially provide data for any of the research
questions but will most likely provide general feedback, if any, that will contribute to the
overarching research question concerning more general perceptions of these types of assessments
in this context.
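The survey's item sequence described above can be sketched programmatically. This is illustrative only: the item wording here is paraphrased placeholder text, not the actual survey language from Appendix B.

```python
# Sketch of the survey's thirteen-item sequence: statements 1-8 alternate
# motivation/effectiveness for each of the four assessment tools, 9-11
# address perceived fairness, 12 collects open comments, 13 is consent.
TOOLS = ["chapter tests", "final exam", "culture blog", "eLinguaFolio self-assessment"]

def build_items():
    items = []
    for tool in TOOLS:
        items.append(f"I was motivated to complete the {tool}.")   # odd: motivation
        items.append(f"The {tool} helped me learn effectively.")   # even: effectiveness
    items += [f"Fairness statement {n} (placeholder wording)" for n in (9, 10, 11)]
    items.append("Please leave any additional comments here.")
    items.append("I consent to the use of my responses in this study.")
    return items

items = build_items()
print(len(items))  # 13
```

Generating the list from the tool order makes the odd/even pairing explicit: items 1, 3, 5, and 7 probe motivation while 2, 4, 6, and 8 probe effectiveness, exactly as described.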
The last statement on the survey will cover informed consent. Students will be required to
check a box to consent that the research be used as part of this study before the survey can be
submitted.
Time Line
The later milestones of this time line are summarized below:
Mar 15-20  Final trips to UWC and final revisions
Mar 21     Full final draft due to committee
Apr 4-7    Defense
Apr 7-17   Final revision
Apr 18     Final thesis submitted to graduate school
Apr 20     Graduate school's deadline for final thesis
On Monday, January 10, 2011, I will email the initial invitation to the seventy-nine students I am asking to participate in the survey, requesting they complete it by Tuesday, February 1, 2011.
Since this is the first day of classes for spring semester and I know students will be very busy, I plan to wait one week before sending out solicitations via email, on Monday, January 17, 2011, to my nine ideal interview candidates to start setting up interview slots. I will ask students to pick a time slot for their interview. These time slots will fall between Tuesday, January 18, 2011, and Friday, February 18, 2011. If any of the nine ideal interview candidates have not responded by Friday, January 21, 2011, I will email as many alternate candidates as necessary to fill those spaces.
The week of January 17-20, 2011, I will make three appointments at the University Writing Center on three separate days to have a consultant work through one of the following three prospective chapters with me: Chapter 4 (Methodology), Chapter 5 (Validity), and Chapter 6 (Ethics). I will make corrections after each session and send Chapters 4, 5, and 6 to the committee by Monday, January 24, 2011. The week of January 24-28, 2011, I plan to make an appointment with the University Writing Center to revise my bibliography and any appendices.
On Sunday, January 30, 2011, I will send an email reminder to the survey participants to remind them to complete the survey by Tuesday, February 1, 2011, if they have not already done so. This reminder will have to go out to all participants because the survey is anonymous and I will have no way to identify which participants have already completed the survey. If on Monday, February 7, 2011, I have a response rate of less than 50% on the survey, I will send out
one more email reminder, asking students to please complete the survey. I believe this will be
more than sufficient to get the 40% response rate I am hoping for.
In the first week of February 2011, I plan to visit the University Writing Center for a consultation. Ideally, I would like to make the appointment on the 4th in case there is not sufficient time to get through the entire chapter in one session.
All interviews should be complete by Friday, February 18, 2011. Therefore, I plan to complete the transcription and coding of all data, including that of the survey, by Sunday, February 27, 2011. The week of February 28-March 4, 2011, I plan to make at least two appointments at the University Writing Center to review Chapter 3 (Research Details) and Chapter 7 (Data Representation). I will submit Chapters 3 and 7 to the committee by Monday, March 7, 2011.
The week of March 7-11, 2011, I plan to make at least two, but most likely three, appointments with the University Writing Center to work on Chapter 1 (Introduction) and Chapter 8 (Conclusion). I plan to submit Chapters 1 and 8 to the committee by Monday, March 14, 2011.
During the week of March 15-20, 2011, I will make any necessary final visits to the University Writing Center and finalize any revisions the committee has previously advised me about throughout the course of the semester. By Monday, March 21, 2011, I will submit a full final draft to my committee.
I would like to schedule the defense of my thesis for the week of April 4-7, 2011. This will allow me one and a half to two weeks to make any necessary adjustments before I submit my final thesis to the graduate school on Monday, April 18, 2011, which is two days before the day
the graduate school requires receipt, Wednesday, April 20, 2011, to allow for any additional issues that may arise.
Validity
I chose to focus on four types of qualitative validity, described by Hendricks in her text
on action research, to validate this study: democratic, outcome, process, and catalytic (2009, p.
112). Hendricks, paraphrasing Anderson et al., defines democratic validity as, “the extent to
which stakeholders have collaborated in the research process and/or the extent to which the
researcher has taken into account their various viewpoints” (2009, p. 112). I chose democratic
validity as the first way to validate my study because I will choose particular students to voice
their opinions. I will choose students to interview based on which of the following three performance levels I attribute to them: high-, mid-, or low-achieving, as demonstrated in the previous semester. Since my research questions seek to solicit the opinions of students on
assessment, I feel it is crucial to solicit the opinions of students who fall at all ranges of assessed
performance. This will hopefully allow me to understand both strengths and weaknesses of these assessment tools.
The second measure I chose is outcome validity because it speaks to how I will use the
results for continued planning, ongoing reflection, and deepening my personal understanding of
the topics I am exploring. Hendricks, paraphrasing Anderson et al., defines outcome validity as,
“the degree to which there has been a successful resolution to the research problem” and she
emphasizes that “successful resolution” may not mean an end to the research but a place from
which to start the cycle again (2009, p. 112). Through this research I hope to learn which of my
assessment tools are working well and which need improvement. I hope to also learn about new
ideas while conducting the interviews and from the open-ended portion of the survey that will
help shape how I move forward in my practice. I will take what I learn, reflect on what it means
to me, and begin to brainstorm ways to apply my understanding to the way I approach my
classroom. I also plan to continue to solicit feedback from my students in future research to
continue the cycle and work toward continual improvement of my practice. I hope that by taking
their opinions into consideration, I will continue to improve the way I teach and thereby help my
students engage and participate in class and generally get more out of their learning experience
with me. There is so much more to learn and try as I continue my path to become the best teacher
I can be.
The third measure I chose is process validity. Hendricks, paraphrasing Anderson et al., describes it as using an "appropriate process for studying the research questions," and she continues to say that it "relies
on the researcher’s commitment to carry out the study in a way that allows the research to
engage ‘in a way that develops a depth of understanding or change’” (2009, p. 112). I chose
process validity because I need to ensure I look deeply at the problem so I can understand the ways context and processes have influenced my results and how this information can carry me forward. To ensure I look deeply and critically at these issues, I will rely on two main methods: asking open-ended questions during the interviews that leave plenty of room for the students to tell me what they really think, and writing out my reflections on the interviews and the broad attitude survey so I remain very conscious of conclusions I have drawn. All of these notes and reflections will become part of the data I analyze.
Hendricks, paraphrasing Lather, defines catalytic validity as, "the extent to which the research transforms or changes the researcher's views and/or practices" (2009, p. 112). I chose
catalytic validity because it will allow me to be aware of the ways my processes and outcomes
will change my practices. This is the most important part of my study. If the results do not
change the way I approach my classroom, I will need to revise my research and try again. While
I believe that I am a proficient instructor, my primary goal is to improve. Any insight I can gain
from this study will help reshape my perspective and improve my practice.
Ethics
This study will gather observations from students who have taken beginning Spanish 1
with me in the fall semester of 2010. During this semester these students were exposed to various
types of traditional and authentic assessment measures. While some of these assessment
techniques are new, they were not introduced into the course for the purpose of studying their
specific function, only to serve the purposes of assessment in my classroom. My students are
unaware, at this juncture, that this research project is being proposed. They are only being
presented with these assessment measures in the context of their use in the classroom. While
these measures will receive the benefit of any insight gained during the course of this study, it is
important to note that they are not the primary focus of the study. The purpose of this study is to
explore the reflections of students in a beginning level Spanish class towards various forms of
traditional and authentic assessment tools in general. The specific tools used in the course, which
these participants will have completed, will serve as specific examples of the larger context of
traditional and authentic assessment. Participants in the interviews will be given plenty of
latitude to include other examples of these assessment tools to which they have been exposed in
other contexts. To clarify, the tests and portfolios these students will have completed for me by
the time this research begins, while benefiting from this study, are not the focus. The focus will be on traditional and authentic assessment tools in a more general sense. This distinction is important to make because I want it to be clear that
administering these particular assessments for the purposes of trying to validate them is not an aim of this study.
Participants in the anonymous survey will be asked to check a box indicating that they
are aware that by completing it they are consenting to allow all information provided to be used
as part of an action research study for a thesis before the survey can be submitted. No identifying
information will be gathered during the survey. The survey will be emailed out to seventy-nine students but
there will be no way to tell which will choose to participate or which submission belongs to any
particular student. This does make unintentional bias possible. There is every likelihood that only
particularly motivated students, whether they are satisfied with the course or not, will complete
the survey. This makes it possible that the survey results could be polarized. The results might
then only reflect either end, or perhaps only one end, of the spectrum of opinions. In my previous
experience with similar student polling, responses have been more or less balanced but this will
be something to watch for and will affect any conclusions drawn from this data.
The interview participants will be selected according to their achievement levels throughout the course. By choosing students at different achievement levels in the course,
I hope to provide a more balanced perspective of student opinions across the board. There are
two concerns here, however. First, I am sure that I would select students from each of the three
achievement levels who I perceive as being the most capable of deep reflection and interest in
these issues. Because I am intentionally choosing these students, there is the chance that my
selections will be faulty in some way. The students I choose may or may not be representative of
the others. I hope this will not be the case, but there is always this risk when sampling from a
larger population and this will be something to address, as the data is being collected and
analyzed. The second concern is that I may have difficulty finding mid- and low-achieving
students who are willing to participate. These students may be upset or embarrassed because of
their achievement in the course. While these students would have some very valuable input to the
study, they may not be willing to participate in such an open way. I hope I can stress to these students that their perspectives are valuable and that I am genuinely interested in knowing how the experience could have been more successful for them. I hope that this will
convince students in these groups to participate and that they will be able to do so in a deep and
meaningful way with honest reflection. It will be crucial that I remain open and receptive to all of their feedback.
Subjectivity
I am very invested in my research topic because my work and I are my research topic.
The first step for me will be to distance myself a bit and try to view the focus of my study as my
teaching practice rather than myself. By trying to put a little intellectual separation between my
practice and myself, I hope to be able to be honest with myself about improving the way I do my job.
I will try to be as clear, honest and open-minded as I can while I ask these students to
dissect my assessment tools and my performance as their instructor. I will endeavor to make it as
explicit as possible that I am not interested in having my ego stroked. While I do want and need
to know what worked well for them, it is also imperative for them to be honest about the aspects
of the class and the assessment tools that were not particularly helpful. I believe this attempt will help students feel free to respond honestly.
Beyond assuring them that I am open to criticism, I must genuinely be open. It
is essential that I be neither overtly nor subtly defensive in my responses or my body language.
The last thing I want to do is shut them down or squelch their opinions. This is another reason for
including the broad attitude survey. While I know who will have access to the survey, I will not
ask for any identifying information on the survey itself, thereby making it anonymous to allow
for more freedom and honesty. I am open to getting some negative or even potentially hurtful
comments in exchange for the chance of obtaining some useful feedback to help me be a more
effective teacher.
Conclusion
The purpose of this study is to explore the reflections of students in a beginning level
Spanish class towards various forms of traditional and authentic assessment tools. These reflections will be gathered through individual interviews and a broad attitude survey. Conclusions will then be drawn to inform my future practice and to add one small, and very
focused, piece of the puzzle to the literature surrounding assessment practices in higher
education. Additionally, it will serve to inspire me, and hopefully others, to continue asking the
questions that must be asked in the classroom: How do students learn? How can teachers
facilitate student learning? What tools can educators employ to create the most effective learning environments?
Bibliography
Ainsworth, Larry, & Viegut, Donald (2006). Common formative assessments: How to connect
Allen, M.J. (2004). Assessing academic programs in higher education. Boston, MA: Anker
Angelo, Thomas A. (1993). Classroom assessment techniques: a handbook for college teachers.
Improvement?. Peer Review, 9(2), 9-12. Retrieved from Academic Search Complete
database.
Banta, T. W., Griffin, M., Flateby, T.L., & Kahn, S. (2009, December). Three promising
alternatives for assessing college students' knowledge and skills. (NILOA Occasional
Paper No.2). Urbana, IL: University of Illinois and Indiana University, National Institute
Barootchi, N., & Keshavarz, M. (2002). Assessment of achievement through portfolios and
doi:10.1080/00131880210135313.
Bers, T. (2008). The role of institutional assessment in assessing student learning outcomes. New
Bers, T. (2004). Assessment at the program level. New Directions for Community Colleges,
Bers, T., & Smith, K. (1990). Assessing assessment programs: The theory and practice of
examining reliability and validity of a.. Community College Review, 18(3), 17. Retrieved
Bers, T., Davis, B., & Taylor, B. (2000). The Use of Syllabi in Assessments: Unobtrusive
Indicators and Tools for Faculty Development. Assessment Update, 12(3), 4. Retrieved
Blackburn, B., Dewalt, M., & Vare, J. (2003). A Case of Authentic Redesign: Collaborating with
Research in Middle Level Education Online, 26(2), 45-56. Retrieved from Education
Brimi, H. (2010). Darkening the Ovals of Education. Clearing House, 83(5), 153-157.
doi:10.1080/00098650903505472.
Brint, S., Proctor, K., Murphy, S., Turk-Bicakci, L., & Hanneman, R. (2009). General Education
Complete database.
Brown, S. A., & Glasner, A. (1999). Assessment matters in higher education: Choosing and
using diverse approaches. Buckingham [England: Society for Research into Higher
Carr, Judy F & Harris Douglas E. (2001). Succeeding with standards linking curriculum,
assessment, and action planning. Alexandria, Virginia: Association for Supervision and
Curriculum Development.
Cauley, K., & McMillan, J. (2009). Formative Assessment Techniques to Support Student
Motivation and Achievement. Clearing House, 83(1), 1-6. Retrieved from Education
Choate, J. S. (1995). Curriculum-based assessment and programming. Boston: Allyn and Bacon.
Cross, K. Patricia. (1996). Classroom research: implementing the scholarship of teaching. San
Francisco: Jossey-Bass.
Cummins, P., & Davesne, C. (2009). Using Electronic Portfolios for Second Language
4781.2009.00977.x.
Davis, N., Kumtepe, E., & Aydeniz, M. (2007). Fostering Continuous Improvement and
Earl, L., & Torrance, N. (2000). Embedding Accountability and Improvement Into Large-Scale
Assessment: What Difference Does It Make?. PJE. Peabody Journal of Education, 75(4),
Ellis, Arthur K. (2001). Teaching, learning, & assessment together, The reflective classroom,
Eubanks, D. (2008). Assessing the General Education Elephant. Assessment Update, 20(4), 4-16.
Ewell, P. (2008). Assessment and accountability in America today: Background and context.
Ewell, Peter. (2009). Assessment, Accountability, and Improvement: Revisiting the Tension.
https://sites.google.com/site/nclfpilot/home/presentations/FormativeAssessmentandaBala
ncedAssessmentSystem.pptx?attredirects=0&d=1.
Felner, R., Bolton, N., Seitsinger, A., Brand, S., & Burns, A. (2008). Creating a statewide
information and assessment system for making evidence-based change at school, district,
and policy levels. Psychology in the Schools, 45(3), 235-256. Retrieved from Academic
Gulikers, J., Bastiaens, T., Kirschner, P., & Kester, L. (2008). Authenticity Is in the Eye of the
Vocational Education and Training, 60(4), 401-412. Retrieved from ERIC database.
Hendricks, C. (2009). Improving schools through action research: A comprehensive guide for
Hernon, P., Dugan, R.E., & Schwartz, C. (Eds.) (2006). Revisiting outcomes assessment in
Higgs, T. (1987). Oral Proficiency Testing and Its Significance for Practice. Theory Into
Israel, J. (2007). Authenticity and the assessment of modern language learning. Journal of
Kuh, G., & Ikenberry, S. (2009). More than you think, less than we need: Learning outcomes
assessment in American higher education. Urbana, IL: National Institute for Learning
Outcomes Assessment.
Moeller, A. (2010, August 3). Autonomy and self-regulation in language learning. Paper
https://sites.google.com/site/nclfpilot/home/presentations/JigsawPieceSelfRegulation.pdf
?attredirects=0&d=1.
Moeller, A. (2010, August 3). Goal setting. Paper presented at LinguaFolio Institute,
https://sites.google.com/site/nclfpilot/home/presentations/JigsawPieceGoalSetting.pdf?att
redirects=0&d=1.
https://sites.google.com/site/nclfpilot/home/presentations/JigsawPieceLinguaFolio.pdf?at
tredirects=0&d=1.
Moeller, A. (2010, August 3). Self-assessment in the foreign language classroom. Paper
https://sites.google.com/site/nclfpilot/home/presentations/JigsawPieceSelfAssessment.pdf
?attredirects=0&d=1.
https://sites.google.com/site/nclfpilot/.
http://www.ncssfl.org/links/index.php?linguafolio.
HOW STUDENTS PERCEIVE THEIR LEARNING 43
Neagu, M.-I. (2009). The influence of the teacher's experience and of the institution.
Case Studies in Accreditation & Assessment, 1-11. Retrieved from Education Research
Complete database.
Ratcliff, J. L., Lubinescu, E. S., & Gaffney, M. A. (Eds.). (2001). How accreditation influences
assessment (New Directions for Higher Education, 113). San Francisco, CA: Jossey-Bass.
Ross, S. J. (2005). The impact of assessment method on foreign language proficiency
database.
Sandoval, P., & Wigle, S. (2006). Building a unit assessment system: Creating quality
database.
Sehlaoui, A. (2008). Language Learning in the United States of America. Language, Culture &
4781.2006.00466_6.x.
Tileston, D. W. (2005). 10 best teaching practices: How brain research, learning styles, and
standards define teaching competencies (2nd ed.). Thousand Oaks, CA: Corwin Press.
Watson, D., & Robbins, J. (2008). Closing the chasm: reconciling contemporary understandings
of learning with the need to formally assess and accredit learners through the assessment
doi:10.1080/02671520701755408.
Williams, J., & Kane, D. (2009). Assessment and feedback: Institutional experiences of student
doi:10.1111/j.1468-2273.2009.00430.x.
1. Tell me a little bit about your first experience with Spanish. How old were you? What
happened? Why did you decide to take Spanish? What are your goals in regards to
Spanish? What do you want to do with the language?
2. Tell me about how you learn. What situations and tools help you learn? What tools and
strategies do you seek out and employ?
3. Tell me a bit about your experience in beginning Spanish I this past fall of 2010.
4. There were four chapter tests that included sections on grammar, vocabulary, listening,
writing, and speaking. Tell me about your experience taking these tests. What did you
think the point of taking the tests was? Do you think they measured your ability to
understand and use Spanish? Did you feel you could better understand and use Spanish as
a result of preparing for and completing them?
5. The final exam was similar in structure to the chapter tests except it was cumulative,
covering all five chapters. Tell me about your experience taking this exam. What did you
think the point of taking the exam was? Do you think it measured your ability to
understand and use Spanish? Did you feel you could better understand and use Spanish as
a result of preparing for and completing it?
6. You were required to create a culture blog this semester that asked you to reflect on
cultural artifacts of your choosing and how they related to you personally and what you
were learning in the class. Tell me about your experience in creating this blog. What did
you think the point of creating the blog was? Do you think it measured your ability to
understand and use Spanish? Did you feel you could better understand and use Spanish as
a result of researching and completing it?
7. Throughout the semester you were required to keep up with the eLinguaFolio self-
assessment project that asked you to reflect on your own learning and provide samples
that demonstrated your best efforts. Tell me about your experience working on this
project. What did you think the point of working on the eLinguaFolio project was? Do
you think it measured your ability to understand and use Spanish? Did you feel you could
better understand and use Spanish as a result of working on it?
8. Considering these four ways (give student four index cards, each with the name of one of
the above assessments to help them concentrate on each one individually and in relation
to the others as they talk about them) in which you were tested during the semester, tell
me how you feel they compared to each other. Do you have a favorite?
9. How did completing these assignments affect the way you learned in the course?
10. What suggestions for improvement would you make about any or all of these
assignments? Any other comments on testing in this course?
Respond to the following questions about your experience in beginning Spanish I this past fall
of 2010.
1. I felt motivated to study and learn to prepare for the four chapter tests.
2. I felt like the four chapter tests helped to demonstrate what I learned.
3. I felt motivated to study and learn to prepare for the final exam.
4. I felt like the final exam helped to demonstrate what I learned.
5. I felt motivated to work on the culture blog/portfolio.
6. I felt like the culture blog/portfolio helped to demonstrate what I learned.
7. I felt motivated to work on the eLinguaFolio project.
8. I felt like the eLinguaFolio project helped to demonstrate what I learned.
9. Overall, I felt like the individual grades I received in this course were a fair assessment of
my learning.
10. Overall, I felt like my final grade in this course was a fair assessment of my learning.
11. Overall, I felt like I understood the point behind the tests and projects in this course.
12. Please leave any additional comments here. Recommendations for improvement are
welcome and appreciated.
13. I understand that by completing this anonymous survey I am consenting to allow all
information provided to be used as part of an action research study for a thesis.
(participant must agree to submit survey)
Please consider the following points before signing this form. Your signature confirms that you
agree to participate in this study.
Your contribution to the research will take the form of an interview, which will be audio-
recorded and transcribed (pseudonyms will be used).
The transcriptions (excluding names and other identifying details) will be retained by the
instructor and analyzed as part of the study.
The findings of the research will be written up as an action research thesis and used to improve
the instructor's teaching practice. The thesis will be published by the graduate school. The
written work may include quotations from the interviews, but individuals will never be named.
I confirm that I have freely agreed to participate in this action research project. I have been
briefed on what this involves and I agree to the use of the findings as described above. I
understand that the material is protected by a code of professional ethics.
Participant signature:__________________________________________________
Name:________________________________________________________________
Date:_________________________________________________________________
Researcher signature:___________________________________________________
Name:__________________________________________________________________
Date:___________________________________________________________________