
Assessment

Overview

We are tested throughout life.

Why do we need to be tested?

Mia 2008

Assessment
Test: a specific form of measurement
Measurement: involves systematic quantification of data; systematic in that it is collected in an organized manner according to rules
Evaluation: places a judgment on the collection of data
Assessment: the integration of both quantitative and qualitative data collected to provide information on the nature of the learner, what is learned, and how it is learned
Assessment = Measurement + Evaluation

Assessment
Why do teachers need to know about assessment?

To diagnose student strengths and weaknesses
To monitor student progress
To assign grades
To determine instructional effectiveness
To influence public perceptions of educational effectiveness
To evaluate teachers
To clarify instructional intentions

Assessment
What is it that teachers really need to know about assessment?

How to develop classroom assessments for different purposes
How to interpret standardized test scores for instruction and for clarification to parents
How to prepare students to take assessments

Discussion

What are the advantages and disadvantages of examining a pupil's school record folder before the start of class?
How much must teachers really know about a pupil's home and family background?
Why do teachers rely so heavily on informal observation when sizing up pupils?
What kinds of learning are best assessed by observation?

Types of Assessments
Norm-Referenced

Used to make comparisons to the performance of other similar students
Content domain is often very broad, with few items per category
Scores are interpreted relative to the norm group's performance
Ex. Standardized Tests (UPSR, PMR, SPM, etc.)

Types of Assessments

Criterion-Referenced

Used to compare performance to a specified domain; typically some mastery level is identified
Domain is more explicitly defined and sampled with more items
Interpretation involves the percentage of the domain mastered (see the sketch below)
Ex. Classroom Tests
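As a small illustration of criterion-referenced interpretation, the sketch below reports the percentage of a defined domain mastered and compares it with a mastery cutoff; the item counts and the 80% cutoff are hypothetical, not values from these notes.

```python
# Hypothetical sketch: criterion-referenced interpretation as percent of domain mastered.
# The item counts and the 80% mastery cutoff are illustrative assumptions.

def percent_mastered(items_correct: int, items_in_domain: int) -> float:
    """Return the percentage of the sampled domain answered correctly."""
    return 100.0 * items_correct / items_in_domain

score = percent_mastered(items_correct=17, items_in_domain=20)
MASTERY_CUTOFF = 80.0  # assumed mastery level

print(f"{score:.0f}% of domain mastered ->",
      "mastery reached" if score >= MASTERY_CUTOFF else "not yet at mastery")
```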

Norms
Norms are scores obtained from the individuals who were tested during standardization of the assessment instrument
Norms are empirically derived and change when the group changes
Norms are NOT standards and should not be confused as having the same meaning
Norms are simply summarized data from the group of students who were in the standardization sample
Norms are determined by defining a population, then drawing a sample from that population
Norms are necessary to interpret scores (see the sketch below)
Norms must be recent, representative, and relevant
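A minimal sketch of how a norm is used to interpret a score, assuming a small hypothetical standardization sample; the percentile-rank convention used here (percent of the norm group scoring below, plus half of any ties) is one common choice, not something specified in these notes.

```python
# Minimal sketch: interpreting a raw score against a norm group.
# The norm_sample scores below are invented for illustration only.

def percentile_rank(raw_score: float, norm_sample: list[float]) -> float:
    """Percent of the norm group below the score, plus half of any ties."""
    below = sum(1 for s in norm_sample if s < raw_score)
    ties = sum(1 for s in norm_sample if s == raw_score)
    return 100.0 * (below + 0.5 * ties) / len(norm_sample)

norm_sample = [35, 42, 47, 50, 50, 53, 55, 58, 61, 66]  # hypothetical standardization data
print(percentile_rank(55, norm_sample))  # 65.0 -> pupil scored as well as or better than ~65% of the norm group
```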

Standardized Tests
Developed by experts in content areas and measurement specialists
Specific content covered (fixed number of items)
Specific guidelines for administration and scoring
Administered to a specific group (norm group)
Types of standardized tests:

Battery: a collection of subtests standardized on the same group
Diagnostic: to isolate strengths and weaknesses
Single Subject Matter: tests in specialized areas

Reliability and Validity


Reliability: consistency in scores over time

Reliability refers to the consistency of the scores derived from the test, not to the test itself

Validity: the process by which scores from measurements take on meaning

Are you measuring what you think you are measuring?
One does not validate a scale or test; what is validated is an interpretation of the scores derived from the scale

Reliability and validity depend upon the purpose of the test
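Because reliability is defined here as consistency of scores over time, one common estimate is test-retest reliability: correlate the same pupils' scores from two administrations of the test. The sketch below uses invented scores and ordinary Pearson correlation; it is an illustration, not a procedure prescribed by these notes.

```python
# Sketch: test-retest reliability as the Pearson correlation between two administrations.
# The score lists are invented for illustration.
from statistics import correlation  # available in Python 3.10+

test_1 = [70, 65, 88, 92, 55, 73, 80, 61]   # first administration
test_2 = [72, 63, 85, 95, 58, 70, 82, 64]   # same pupils, second administration

r = correlation(test_1, test_2)
print(f"test-retest reliability estimate: r = {r:.2f}")  # values near 1.0 suggest consistent scores
```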

Classroom Assessment
Principles of Effective Classroom Assessment

Promoting learning
Summative and formative
Ongoing, on a regular basis
Using multiple sources of information
Providing fair, valid, and reliable information

Classroom Assessment
Key issues in planning

What do we want students to understand and be able to do?
Why are we assessing, and how will the assessment information be used?
For whom are the results intended?
What methods of assessment will be used, and why?

Traditional or Alternative?

Traditional Assessment

Response Selecting
Matching
Alternate Choice/True-False
Multiple Choice

Response Supplying
Simple Recall/Sentence Completion
Essay Items

Performance Assessment
Constructed Response (short answer, flow work, etc.)
Product Responses (essays, reports, projects, models, etc.)
Process-Focused Assessments (observations, interviews, think-alouds)

Teacher-Made Tests
What instructional situation needs evaluating?
What was taught? For what level of learning?
What were the instructional objectives (content/level)?
**The assessment format chosen should be linked to the answers to the above questions.

Teacher-Made Tests
When would be a good time to use selected response?

Useful in a grading situation that uses multiple sources of information
As a pre-test to get information on what each student knows

When would be a good time to use supply-type response?

To assess integration of knowledge
To study the processes used to solve problems
To assess cooperation

Table of Specifications
Test Blueprint Development

1. List the content to be taught/tested
2. Estimate the amount of instructional time per area (this tells the proportional emphasis; see the sketch after this list)
3. At what level is the instruction, and what level of understanding is expected? (Bloom's Taxonomy)
4. How much time is there for testing, and what is appropriate for the students' age/level?
5. As items are written, an ideal answer must be developed
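As a rough illustration of step 2, the sketch below allocates test items in proportion to instructional time. The content areas, hours, and the 40-item test length are invented for the example, not taken from these notes.

```python
# Hypothetical sketch: allocating test items in proportion to instructional time.
# Content areas, hours, and the 40-item total are assumptions for illustration.

instructional_hours = {"Fractions": 6, "Decimals": 4, "Percentages": 2}
total_items = 40

total_hours = sum(instructional_hours.values())
blueprint = {area: round(total_items * hours / total_hours)
             for area, hours in instructional_hours.items()}

print(blueprint)  # {'Fractions': 20, 'Decimals': 13, 'Percentages': 7}
```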

Bloom's Taxonomy
Knowledge
Comprehension
Application
Analysis
Synthesis
Evaluation

Grading Practices
Normative (relative to a specified group)

Normal Curve (a fixed percentage will get certain grades) DON'T DO!!!
Distribution Gap Method (rank-order composite scores; where gaps occur, make grade cuts) DON'T DO!!!

Competency-Based (some absolute standard set a priori)

Fixed Percentage (a fixed percentage set a priori)
Content-Based Method (give a grade to each component, multiply each component by its weight, then produce the final grade; see the sketch below)
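A minimal sketch of the content-based method described above: each component gets a score, each score is multiplied by its weight, and the weighted sum gives the final grade. The component names, weights, and letter-grade cutoffs are assumptions for illustration, not a scheme prescribed by these notes.

```python
# Sketch of a content-based (weighted) final grade. Component names, weights,
# and letter cutoffs are illustrative assumptions, not a prescribed scheme.

components = {          # score (0-100), weight
    "quizzes":     (78, 0.20),
    "assignments": (85, 0.30),
    "final exam":  (70, 0.50),
}

final = sum(score * weight for score, weight in components.values())

def letter(grade: float) -> str:
    cutoffs = [(80, "A"), (70, "B"), (60, "C"), (50, "D")]
    return next((g for cut, g in cutoffs if grade >= cut), "F")

print(f"final grade: {final:.1f} -> {letter(final)}")  # final grade: 76.1 -> B
```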

Normal Curve Grading

A = 2.27% (1 student)
B = 13.59% (4 students)
C = 68% (20 students)
D = 13.59% (4 students)
F = 2.27% (1 student)
n = 30
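These counts simply follow from applying the fixed percentages to a class of n = 30 and rounding; a quick arithmetic check (the percentages are from the slide, the code is only a verification):

```python
# Quick check of the slide's counts: fixed normal-curve percentages applied to n = 30.
n = 30
percentages = {"A": 2.27, "B": 13.59, "C": 68.0, "D": 13.59, "F": 2.27}

counts = {grade: round(n * pct / 100) for grade, pct in percentages.items()}
print(counts)                 # {'A': 1, 'B': 4, 'C': 20, 'D': 4, 'F': 1}
print(sum(counts.values()))   # 30
```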
