
Submitted to the Director of Far West Laboratory Management

Submitted by CompassPoint Consulting




CompassPoint Consulting-Response to Evaluation Proposal
FWL Spring 1975
Introduction
The purpose of this proposal is to support Far West Laboratory (FWL) in the research phases
of the product development process by providing a plan for moving its new product, the
Determining Instructional Purposes (DIP) training program, through preliminary, main, and
operational field testing, so that the data available at the end of the lab's research and
development process can be used to determine whether the program meets its objectives.
The proposed evaluation is both formative and summative in nature and uses a goal-based
evaluation model. Based on the information gathered and analyzed, and on the
recommendations our firm makes to FWL's program development team, the Director of FWL
can decide whether further resources should be committed to marketing the DIP training
package to a wider constituency. If the decision is positive, this evaluation will also provide
the data and advice, specifically aimed at school administrators, that can be used to market
the product and expand FWL's business.

Description of Program Being Evaluated
Building on what FWL learned in developing its Teacher Education Program from 1966
through the early 1970s, and more specifically in developing the minicourse model, FWL's
Department of Educational Improvement and Policy Support is now following suit and
embarking on the development of a new training product, this time aimed at school
administrators and graduate students in educational administration. The purpose of the DIP
training program is to train school administrators and graduate students in educational
administration programs in skills related to the planning of effective school programs.

Based on the already successfully trialed minicourse instructional model, the DIP program
consists of three training units that can be used individually or in combination. Each unit
contains four to six modules that provide training on a defined set of instructional objectives
related to goal setting, problem analysis, and deriving objectives. The modules consist of
reading material related to the skills taught, individual or small-group activities in which
trainees practice those skills, and feedback on the practice activities. A coordinator, who does
not have to be an expert in program planning, oversees the training; the coordinator's role is
to organize, guide, and monitor activities, and a handbook guides the implementation.

Evaluation Method
Far West Laboratory, sponsored by the US Department of Education, has the goal of helping
individuals of all ages obtain more and better learning opportunities as a result of its research,
development, dissemination, evaluation, and technical assistance activities. Consequently,
the lab's primary objective is to determine the effectiveness of its educational products. It is
not FWL's objective to produce a product with the technical quality and polish of a
commercial product, as this would extend the research and development process beyond
finding out whether the product meets its objectives, an expense that is not justifiable for a
non-profit, government-funded organization focused on research and development.

Therefore, the purpose of this evaluation is to determine whether FWL should commit
resources to marketing the DIP program, based solely on the program's effectiveness.
Because of the intertwined, multi-step research and development process used by FWL, the
evaluation will be both formative and summative in nature. The core question about the future
of DIP will be answered through the summative outcome of this evaluation, the results of
which will be reported to the FWL Director as well as to the Director of the Department of
Educational Improvement and Policy Support, so that they have the information and
recommendations they need to make their decision. Throughout the formative evaluation
phase, which consists of preliminary, main, and operational field testing, the data gathered
will be passed to the program development team so that they may use it to make as many
improvements to the program as possible prior to any future mass distribution. Additionally,
the data collected, along with existing data from the successful Teacher Education Program,
which used the same instructional model, can be used to market the product to school
administrators, should the outcome be positive.

The information required to assess whether the program objectives are being met includes
whether the knowledge, attitudes, and skills of the participants changed as a result of
participating in the program. This information will be collected using a mixed-methods
approach. Quantitative data will be collected through pre- and post-tests, surveys (participant
evaluations of the course), and nominal data about the backgrounds of coordinators and
participants, as well as data about the course materials.
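As a minimal illustration only, the sketch below shows one way the pre- and post-test results
described above might be tabulated to summarize participant gains for a single field-test
phase. The participant records, field names, and score scale are hypothetical and are not part
of the proposed instruments.

```python
# Illustrative sketch: summarizing pre-/post-test results for one field-test phase.
# The records and score scale below are hypothetical examples, not real data.
from statistics import mean

records = [
    {"participant": "A", "pre": 12, "post": 18},
    {"participant": "B", "pre": 10, "post": 16},
    {"participant": "C", "pre": 15, "post": 17},
]

# Gain for each participant is simply post-test score minus pre-test score.
gains = [r["post"] - r["pre"] for r in records]

print("Mean pre-test score: ", mean(r["pre"] for r in records))
print("Mean post-test score:", mean(r["post"] for r in records))
print("Mean gain:           ", mean(gains))
```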

Qualitative data will largely come from observations (one module per testing session, at one
school), particularly of the small-group activities, during the three phases of field testing, as
well as from a focus group made up of a sample of participants from all three field trials at
the end of the research and development process. The coordinators will also be interviewed,
as well as observed during course implementation, to find out more about the actual
implementation process. Further information about the course content will be gathered from
an initial document analysis of the course materials, and data from the Teacher Education
Program with respect to the minicourse instructional model will also be considered.

When analyzing these data, the evaluators will focus on three main formative questions and
related sub-questions pertaining to the instructional model, the content and materials, and the
implementation. This structure will also be used to organize and analyze the collected data.
The analysis will attempt to summarize the major themes that arose across the different data
sources throughout testing and to determine whether these were addressed in the revisions.
The final focus group interview will use these themes as a framework and delve deeper into
the issues they raise in order to inform the summative evaluation. The summative report will
focus on answering whether the program was effective and educationally sound and on
making any further recommendations beyond those already made during the formative
evaluation phases. (See Appendix A for details of the questions.)

Task Schedule

Task | Purpose | Timeline
Meet with FWL Director and Director of EIPS | To clarify the purpose of the evaluation, get input on the evaluation design, and gather information about the minicourse model and FWL's mission and practices. | June 1975
Collect data already available from the Teacher Education Program | To find out about the research and development process used, what was learned from it, and how successful it was. | June 1975
Review program materials | To collect data about the content and structure of the course. | June 1975
Meet with DIP Program Development Team | To find out what they have done so far and why, and to feed back any data from the initial research. | June 1975
Conduct pretesting of preliminary field testing participants (4-12 from 3 schools) | To find out what participants already know and can do, as well as their prior experiences. | July 1975
Observe preliminary field testing implementation of DIP | To collect observational data about the course implementation and anecdotal evidence of participants' learning. The evaluator will observe group and coordinator interactions and take notes on conversations in the group activities. | August-September 1975 (Module 1: 3 days)
Conduct evaluation survey of participants | To get evaluative feedback from the participants themselves about the effectiveness of the course content, materials, and implementation. | September 1975
Conduct post-testing of preliminary field testing participants and coordinator | To find out what new skills and knowledge participants gained as a result of taking the course. | Early October 1975
Meet with DIP Program Development Team | To give feedback in the form of a formative evaluation report, with recommendations for course improvements. | Late October 1975
Time for the Development Team to make improvements and re-print materials (Nov.-Feb.)
Conduct pretesting of main field testing participants (30-60 from 5 schools) | To find out what participants already know and can do, as well as their prior experiences. | January 1976
Observe main field testing implementation of DIP | To collect observational data about the course implementation and anecdotal evidence of participants' learning. The evaluator will observe group and coordinator interactions and take notes on conversations in the group activities. | February-March 1976 (Module 2: 4 days)
Conduct evaluation survey of participants | To get evaluative feedback from the participants themselves about the effectiveness of the course content, materials, and implementation. | March 1976
Conduct post-testing of main field testing participants and coordinator | To find out what new skills and knowledge participants gained as a result of taking the course. | Early April 1976
Meet with DIP Program Development Team | To give feedback in the form of a formative evaluation report, with recommendations for course improvements. | Late April 1976
Meet with FWL Director and Director of EIPS | To give an update on progress and discuss any issues arising. | Late April 1976
Time for the Development Team to make improvements and re-print materials (May-Aug.)
Conduct pretesting of operational field testing participants (1 school) | To find out what participants already know and can do, as well as their prior experiences. | July 1976
Observe operational field testing implementation of DIP | To collect observational data about the course implementation and anecdotal evidence of participants' learning. The evaluator will observe group and coordinator interactions and take notes on conversations in the group activities. | August-September 1976 (Module 3: 3 days)
Conduct evaluation survey of participants | To get evaluative feedback from the participants themselves about the effectiveness of the course content, materials, and implementation. | September 1976
Conduct post-testing of operational field testing participants and coordinator | To find out what new skills and knowledge participants gained as a result of taking the course. | Early October 1976
Meet with DIP Program Development Team | To give feedback in the form of a formative evaluation report, with recommendations for course improvements. | Late October 1976
Time for the Development Team to make any last-minute improvements
Conduct focus group meeting | To gain deeper insight into issues that arose over the course of the implementation, with a focus on the instructional model, content, materials, and implementation in relation to the stated objectives. | November 1976
Meet with FWL Director and Director of EIPS | Presentation of summative results and recommendations in the form of a final report. | December 1976



Project Personnel
CompassPoint Consulting recruits experts from the fields of education and research,
and forms teams best suited to our clients' needs.

Leroy Brown
Expert in program development and instructional design
If there is one thing Leroy Brown excels at, it is putting practices into action. Having received
his undergraduate degree in Education Studies from UCLA and then traveled abroad for a
Master's in Curriculum Pedagogy and Assessment from the Institute of Education at the
University of London, Leroy is an expert in the development of educational programs and
instructional design. His passion is improving teaching and learning through the development
of innovative programs. He joins CompassPoint Consulting as an external expert for projects
that focus on the complexity and feasibility of putting plans into action in educational settings.
His past experience includes planning and evaluating various school-based projects focused
on developing resources to support student achievement, as well as advising schools on
current educational trends and on new evidence-based programs and practices that show
promising results.

Harriet Welsch
Expert in evaluation
With an undergraduate degree in Statistics from Columbia University and a Ph.D. in
Educational Administration and Research from the University of California Riverside,
Harriet is our most experienced program evaluator. She is a seasoned educator and
researcher with a deep understanding of what constitutes scientifically valid research.
Her analytical and organizational skills enable her to conduct rigorous research studies
and analyze large and small data sets from multiple perspectives. She is a trusted advisor to
public schools in California and Nevada and maintains strong, ongoing connections to the
University of California. Her most notable project was the production
of an annual report series documenting progress in achieving the goals put forward by
the 1965 Elementary and Secondary Education Act.

Budget
As the evaluation team is based nearby, travel costs will be kept to a minimum, and all
surveys and pre- and post-tests will be distributed by mail rather than in person to further
reduce costs.

Personnel:
  Harriet Welsch: $500/day x 40 days             $20,000.00
  Leroy Brown: $400/day x 45 days                $18,000.00
Travel and per diem:
  Min. 12 round trips, Sacramento/San Francisco   $1,500.00
  Per diem (hotel and meals)                      $3,000.00
Supplies and materials:
  Paper and postage for surveys and tests         $1,000.00
==========================================================
Total                                            $43,500.00
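For reference, the total is the sum of the line items above: $20,000.00 + $18,000.00 +
$1,500.00 + $3,000.00 + $1,000.00 = $43,500.00.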
Appendix A


Formative Questions:
Is DIP effective in helping clients to develop skills in planning effective school programs?

- Does the instructional model help to meet the objectives?
  o Was there an appropriate balance of knowledge building and application?
  o Should the workshop format or content be modified?
  o Did the structure of the model support learning?
- Do the materials help to meet the objectives?
  o Were the needed materials available?
  o Was the workshop content accurate and up to date?
- Does the implementation process help to meet the objectives?
  o Was the workshop implementation schedule and staffing effective?
  o Was the full range of topics included in the design actually covered?
  o Was communication adequate?

Summative Questions:
- Was the program effective and educationally sound?
- Was there evidence of an increase in knowledge and skill as a result of project participation?
  Did this vary by teachers' or students' characteristics?
- Were the workshops of high quality (accuracy of information, depth of coverage,
  implementation, etc.)?
