
Diosdado M. San Antonio
DepED Region 4A (CALABARZON)

PROMOTING A CULTURE OF RESEARCH IN AID OF POLICY FORMULATION

TALK OUTLINE

Clarifying key concepts
The need for evidence-informed policy
Reasons why education research is not used extensively in policy formulation
Linking educational research to policy
Ways research can support educational practices
Research synthesis
The policy cycle
Social impact analysis

RESEARCH

THREE REALMS OF RESEARCH IN EDUCATION

TWO CATEGORIES OF RESEARCH WORKERS

EDUCATION VS EDUCATIONAL RESEARCH?

Educational research: studies for education, consciously geared towards improving policy and practice.

Education research: studies of education, with additional substantive value independent of its policy relevance.

Clark, C. (2011). Education(al) research, educational policy-making and practice. Journal of Philosophy of Education, 45(1), 37-57.

EVIDENCE-INFORMED POLICY

An approach which helps people make well-informed decisions about policies, programs and projects by putting the best available evidence at the heart of policy development and implementation (Davies 1999 in Gough et al., 2011).

Gough, D., Tripney, J., Kenny, C., & Buk-Berge, E. (2011). Evidence informed policy in education in Europe: EIPEE final project report. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London.

McColskey & Lewis (n.d.). Making informed decisions about programs, policies, practices, strategies, & interventions.

EVIDENCE-BASED DECISION MAKING CYCLE


McColskey & Lewis (n.d.), Making Informed Decisions About Programs, Policies, Practices, Strategies, & Interventions

FACTORS DRIVING NEED FOR EVIDENCE-INFORMED POLICY

a greater concern with student achievement outcomes;
a related explosion of available evidence due to a greater emphasis on testing and assessment;
more explicit and vocal dissatisfaction with education systems, nationally and locally;
increased access to information via the Internet and other technologies; and
resulting changes in policy decision-making.

Tracey Burns and Tom Schuller, OECD Centre for Educational Research and Innovation, http://www.oecd.org/edu/ceri/47435459.pdf

THE IMPORTANCE OF MAXIMIZING RESEARCH USE

Economic imperative to justify public spending;
Moral imperative to ensure those providing services do so informed by the best possible evidence (e.g. Oakley, 2000);
Academic imperative.

Maximising research use in policy and practice in education. Judy Sebba, Professor of Fostering and Education, University of Oxford Department of Education, judy.sebba@education.ox.ac.uk


WHAT IS THE PROBLEM? THE LACK OF EVIDENCE-INFORMED POLICY AND PRACTICE

Policy makers rank academic research well below special advisers (media background), experts and think tanks as sources of evidence (Campbell et al. 2007; Rich 2004; Rigby 2005);
Policy makers often regard research findings as impenetrable, ambiguous, conflicting, insignificant, untimely or only partially relevant. In turn, they display confusion about what constitutes evidence and its role (Brown, 2012; Rickinson, Sebba & Edwards 2011);
Confusion about evidence is rife among the public.

Maximising research use in policy and practice in education. Judy Sebba, Professor of Fostering and Education, University of Oxford Department of Education

WHAT STOPS EVIDENCE BEING USED?

Numbers to be influenced by evidence? More than half a million teachers in the Philippines;
Practitioners are too busy, cannot locate relevant and accessible evidence, and lack confidence to judge research;
Expert systems such as EBP [evidence-based practice] are attempts to manufacture trust as a legitimating exercise for the mandate of professional authority in social work (Webb, 2002);
What counts as evidence, the nature of evidence, and how it is used in decision-making are highly contested.

Maximising research use in policy and practice in education. Judy Sebba, Professor of Fostering and Education, University of Oxford Department of Education, judy.sebba@education.ox.ac.uk


COMMON ISSUES AGAINST EDUCATION(AL) RESEARCH

Lack of rigor
Failure to produce cumulative research
findings
Theoretical incoherence
Ideological bias
Irrelevance to schools
Lack of involvement of teachers
Inaccessibility and poor dissemination
Poor cost effectiveness
Whitty, G. (2006). Education(al) research and education policy making: is conflict inevitable? British Educational Research Journal, 32(2), 159-176.

RESEARCH TO POLICY AND PRACTICE: SOME ISSUES

On the processes and mechanisms through which research-based knowledge may be transferred into policy and practice;

On the question of appropriate relationships between research, policy and practice.

Ozga, J. (2004). From research to policy and practice: some issues in knowledge transfer. Accessed 12 April 2015 from http://www.ces.ed.ac.uk/PDF%20Files/Brief031.pdf

ON TRANSFERRING RESEARCH-BASED KNOWLEDGE INTO POLICY AND PRACTICE

Ozga, J. (2004). From research to policy and practice: some issues in knowledge transfer. Accessed 12 April 2015 from http://www.ces.ed.ac.uk/PDF%20Files/Brief031.pdf

ON RELATIONSHIPS BETWEEN RESEARCH, POLICY AND PRACTICE

Ozga, J. (2004). From research to policy and practice: some issues in knowledge transfer. Accessed 12 April 2015 from http://www.ces.ed.ac.uk/PDF%20Files/Brief031.pdf

LINKING EDUCATIONAL RESEARCH TO POLICY

Research does not logically or psychologically provide a basis or starting point for policy.
Re-frame political and educational expectations of the research/policy relationship in favor of more realistic and sophisticated models of how policy is developed.
Values, normativity and ideology are legitimately central to policy making.
There is a role for research in refining, critiquing, and developing these elements within a structure of intelligent argumentation.
Policy can and should be informed by the full range of intellectual resources available in the research community and not just a narrowly empiricist selection.
There is no simple algorithm for translating research into policy: impact depends on social practices which bring political, democratic and research voices together in a shared conversation and process of mutual influence.
Influence the development of ways in which the high-quality work of scholarship can inform policy, and work to remove current restrictions on what is admitted.
Create conversational communities around central policy issues as a vehicle for mutual information and influence, and not necessarily for decision-making or even agreed understanding.

Bridges, D. (2009). Evidence-based policy: what evidence? What basis? Whose policy? Teaching and Learning Research Briefing No. 74, www.tlrp.org

OPTIMIZING USE OF RESEARCH IN CRAFTING EDUCATIONAL POLICIES

Make use of the best available evidence a requirement in professional standards and build it into the infrastructure of policy-making;
Improve access to synthesized, quality-assured evidence in priority areas (open access);
Support practitioners to use research (and in some cases to engage in research through closer collaboration of researchers and professionals);
Most importantly, interrogate research use and evaluate any initiatives designed to increase impact; only then can we really know what is achieved.

Maximising research use in policy and practice in education. Judy Sebba, Professor of Fostering and Education, University of Oxford Department of Education, judy.sebba@education.ox.ac.uk

PROMOTING A CULTURE OF RESEARCH

encouraging an active community of educational researchers;
promoting cooperation and discussion with policy makers and practitioners, as well as national and international associations in education and related subject areas;
encouraging and supporting debate about the quality, purpose, content and methodologies of educational research;
developing and defending an independent research culture committed to open inquiry and the improvement of education.

Whitty, G. (2006). Education(al) research and education policy making: is conflict inevitable? British Educational Research Journal, 32(2), 159-176.

MODELS OF RESEARCH IMPACT

1. Push: incentivize producers (researchers) to undertake relevant, robust research;
2. Pull: incentivize users/practitioners; better articulation of benefits to funders (e.g. value-added, prestige); research training for policy officials; role of insider-researchers in government; two-way secondments;
3. Networks & brokerage: bring together researchers, users and policy makers; influence on design, research questions, verifying findings, on-going dialogue without losing research integrity.

But not all research shows us the way forward, e.g. the attainment gap.

(Lavis et al. 2003, Levin 2011, Nutley et al. 2007, etc.)

BUILDING A HIGHER QUALITY EVIDENCE BASE FOR THE FUTURE

Weaknesses in the quality of research in education and the reporting of it: descriptive validity (Farrington 2003).

Features of high quality research:
clear questions (that address a need)
methods selected that are fit for purpose
methods executed properly, e.g. reliability
use of multiple sources of data (integration of quantitative & qualitative?)
multidisciplinary research needed for complex questions

These are all characteristics assessed through systematic reviewing.

Maximising research use in policy and practice in education. Judy Sebba, Professor of Fostering and Education, University of Oxford Department of Education, judy.sebba@education.ox.ac.uk


IMPROVING THE FUTURE EVIDENCE BASE

Randomized controlled trials
Interrogating large databases, e.g. on educational outcomes and longer-term employment, health, etc.
Longitudinal studies
Mixed methods to inform us of what and how
Quality assurance, synthesis and scaling up of practitioner inquiry.

Maximising research use in policy and practice in education. Judy Sebba, Professor of Fostering and Education, University of Oxford Department of Education, judy.sebba@education.ox.ac.uk

ASSESSING RESEARCH AND ITS IMPACT

Research Excellence Framework (REF), UK;
Research publications assessed on quality, originality & significance (impact);
Impact separately assessed through case studies;
Knowledge mobilization work;
Research Supporting Practice in Education (OISE) - interrogating research impact.

Maximising research use in policy and practice in education. Judy Sebba, Professor of Fostering and Education, University of Oxford Department of Education, judy.sebba@education.ox.ac.uk

INTERROGATING RESEARCH USE EMPIRICALLY: RESEARCH SUPPORTING PRACTICE IN EDUCATION (RSPE), OISE, UNIVERSITY OF TORONTO (http://www.oise.utoronto.ca/rspe/)

Research use in secondary schools & districts (LAs): used knowledge claims as the basis for intervention, mediated head teacher study groups, and resources on the web; had little impact;
KM in universities: interviewed 18 education faculties in leading research universities worldwide regarding the role of KM; modest in most faculties, done by individual faculty members rather than at institutional level;
Survey of 500 grant-holders to determine the extent and nature of their KM efforts: tools and techniques used, mediators, linkage activities, project funding earmarked for KM.

Maximising research use in policy and practice in education. Judy Sebba, Professor of Fostering and Education, University of Oxford Department of Education, judy.sebba@education.ox.ac.uk


RESEARCH SUPPORTING PRACTICE IN EDUCATION (CONTINUED)

Website analysis: developed a metric for assessing organizational KM strategies (different types, ease of use, accessibility, focus of audience) across more than 100 education organisations in Canada, the UK, the US & Australia: national/local government departments, universities, funders & knowledge brokers. Limited evidence of activities that build the interpersonal connections known to lead to the greatest research impact.

Facts in Education: a service to counter press reporting and correct significant factual errors about education that appear in various news media across Canada, providing the source & empirical evidence base, e.g. class size.

The Education Media Centre in England is a brokering service between journalists and researchers offering timely evidence & access.

Maximising research use in policy and practice in education. Judy Sebba, Professor of Fostering and Education, University of Oxford Department of Education, judy.sebba@education.ox.ac.uk


THE ROLE OF RESEARCH MEDIATION IN MAXIMIZING RESEARCH USE

Mediation is undertaken by funders, media, policy analysts, educators, lobby groups, think tanks, policy advisers, etc.;
Knowledge brokering links decision makers and researchers, facilitating their interaction to better understand each other's goals and professional cultures, influence each other's work, forge new partnerships, and promote the use of research (Canadian Health Services Research Foundation, n.d.);
Mediators have multiple positions as trustees for each other's organizations, sit on each other's councils, and write, speak and appear on platforms at each other's events (Ball & Exley 2010, p. 155);
Dedicated individual liaison between policy makers and researchers during commissioning/reporting (Martinez and Campbell, 2007);
Problem definition, expansion of public debate, innovation & knowledge brokerage (McNutt and Marchildon 2009);
Linking researchers with users throughout the research process increases research impact (e.g. Rickinson et al., 2011; Ward et al., 2009).

THE MEDIA AND THINK TANKS

Media presented all the think tanks as credible sources of research, facts, and figures on education, regardless of the extent to which each think tank emphasized policy and political advocacy over the professional norms of academic research, e.g. peer-reviewing (Haas 2007).


TYPES OF RESEARCH SYNTHESIS

Research synthesis is a collective term for the family of methods for summarizing, integrating and, where possible, cumulating the findings of different studies on a topic or research question.
Narrative reviews
Vote counting reviews
Meta-analysis
Best evidence synthesis
Meta-ethnography

Davies, P. (2000). The relevance of systematic reviews to educational policy and practice. Oxford Review of Education, 26(3-4), 365-378.

NARRATIVE REVIEWS

Attempt to identify what has been written on a subject or topic, using which methodologies, on what samples or populations, and with what findings.
There is usually no attempt to seek generalization or cumulative knowledge from what is reviewed. Rather, the task is to identify the range and diversity of the available literature, much of which will be inconclusive, and to find a gap which new research might attempt to fill.
The traditional qualitative literature review.

Davies, P. (2000). The relevance of systematic reviews to educational policy and practice. Oxford Review of Education, 26(3-4), 365-378.

VOTE COUNTING REVIEWS

Attempt to accumulate the results of a collection of relevant studies by counting how many results are statistically significant in one direction, how many are neutral (i.e. no effect), and how many are statistically significant in the other direction (Cook et al., 1992, p. 4).
The category that has the most counts, or votes, is taken to represent the modal or typical finding, thereby indicating the most effective means of intervention (see the illustrative sketch below).

Davies, P. (2000). The relevance of systematic reviews to educational policy and practice. Oxford Review of Education, 26(3-4), 365-378.
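
To make the counting step concrete, here is a minimal sketch of a vote-counting tally; it is not from Davies (2000), and the study results and the 0.05 threshold are hypothetical, purely for illustration:

from collections import Counter

# Hypothetical (direction, p_value) pairs, one per primary study.
studies = [
    ("positive", 0.01),
    ("positive", 0.04),
    ("positive", 0.02),
    ("negative", 0.03),
    ("negative", 0.40),  # not significant at the 0.05 level
]

# A study casts a "vote" for its direction only when its result is significant;
# otherwise it is counted as showing no effect.
votes = Counter(
    direction if p < 0.05 else "no effect"
    for direction, p in studies
)

print(votes)                    # Counter({'positive': 3, 'negative': 1, 'no effect': 1})
print(votes.most_common(1)[0])  # ('positive', 3): the modal category, taken as the typical finding

The weakness noted in the literature is visible even here: the tally ignores sample sizes and effect magnitudes, which is exactly what meta-analysis (next slide) addresses.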

META-ANALYSIS

The statistical analysis of a large collection of analysis results from individual studies for the purpose of integrating the findings.
It combines the individual study treatment effects into a pooled treatment effect for all studies combined, and/or for specific subgroups of studies or patients, and makes statistical inferences (Morton, 1999); a small illustrative sketch follows below.

Davies, P. (2000). The relevance of systematic reviews to educational policy and practice. Oxford Review of Education, 26(3-4), 365-378.
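
As an illustration of what pooling means in the simplest (fixed-effect) case, the sketch below computes an inverse-variance weighted pooled estimate, where each study is weighted by the reciprocal of its variance; it is not from Davies (2000), and the effect sizes and variances are hypothetical:

def pooled_effect(effects, variances):
    """Fixed-effect meta-analysis: inverse-variance weighted mean of study effects."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_variance = 1.0 / sum(weights)  # variance of the pooled estimate
    return pooled, pooled_variance

# Three hypothetical studies of the same intervention
# (standardized mean differences and their variances).
effects = [0.30, 0.10, 0.25]
variances = [0.02, 0.05, 0.01]

estimate, variance = pooled_effect(effects, variances)
print(round(estimate, 3), round(variance, 3))  # 0.247 0.006

Random-effects models add a between-study variance component to each weight, but the weighting idea is the same: precise studies count for more than imprecise ones.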

BEST EVIDENCE SYNTHESIS

Reviewers apply consistent, well-justified, and clearly stated a priori inclusion criteria for the studies to be reviewed.
Uses guiding principles for choosing a priori criteria, including that primary studies should be germane to the issue at hand, should be based on a study design that minimizes bias, and should have external validity.

Davies, P. (2000). The relevance of systematic reviews to educational policy and practice. Oxford Review of Education, 26(3-4), 365-378.

META-ETHNOGRAPHY

Attempts to summarize and synthesize the findings of qualitative studies, especially ethnographies and interpretive studies.
The primary studies may be ethnographic, interactive, qualitative, naturalistic, hermeneutic, or phenomenological.
They seek an explanation for social or cultural events based upon the perspectives and experiences of the people being studied.

Davies, P. (2000). The relevance of systematic reviews to educational policy and practice. Oxford Review of Education, 26(3-4), 365-378.

SOCIAL IMPACT ASSESSMENT (SIA)

SIA includes the processes of analyzing, monitoring and managing the intended and unintended social consequences, both positive and negative, of planned interventions (policies, programs, plans, projects) and any social change processes invoked by those interventions.
Its primary purpose is to bring about a more sustainable and equitable biophysical and human environment.

http://www.socialimpactassessment.com/

SOCIAL IMPACT ASSESSMENT TOOLS AND METHODS

Analytical tools
Community-based methods
Consultation methods
Observation and interview tools
Participatory methods
Workshop-based methods

http://www.unep.ch/etu/publications/EIA_2ed/EIA_E_top13_hd1.PDF

ANALYTICAL TOOLS

Stakeholder Analysis addresses strategic questions, e.g. who are the key stakeholders? What are their interests in the project or policy? What are the power differentials between them? What relative influence do they have on the operation?
Gender Analysis focuses on understanding and documenting the differences in gender roles, activities, needs and opportunities in a given context.
Secondary Data Review of information from previously conducted work is an inexpensive, easy way to narrow the focus of a social assessment.

COMMUNITY-BASED METHODS

Participatory Rural Appraisal (PRA) covers a family of participatory approaches and methods which emphasize local knowledge and action. It uses group animation and exercises to facilitate stakeholders in sharing information and making their own appraisals and plans.
SARAR is an acronym for five attributes (self-esteem, associative strength, resourcefulness, action planning and responsibility for follow-through) that are important for achieving a participatory approach to development. It seeks to optimize people's ability to self-organize, take initiatives, and shoulder responsibilities.

CONSULTATION METHODS

Beneficiary Assessment (BA) is a systematic investigation of the perceptions of a sample of beneficiaries and other stakeholders to ensure that their concerns are heard and incorporated into project and policy formulation.
The purposes are to (a) undertake systematic listening, which "gives voice" to poor and other hard-to-reach beneficiaries, highlighting constraints to beneficiary participation, and (b) obtain feedback on interventions.

OBSERVATION AND INTERVIEW TOOLS

Participant Observation is based on looking, listening, asking questions and keeping detailed field notes.
Semi-structured Interviews are a low-cost, rapid method for gathering information from individuals or small groups.
Focus Group Meetings are brief meetings (usually one to two hours) with many potential uses, e.g. to address a particular concern; to build community consensus about implementation plans; to cross-check information with a large number of people; or to obtain reactions to hypothetical or intended actions.
Village Meetings allow local people to describe problems and outline their priorities and aspirations.

PARTICIPATORY METHODS

Role Playing helps people to be creative, open their perspectives, understand the choices that another person might face, and make choices free from their usual responsibilities.
Wealth Ranking (also known as well-being ranking or vulnerability analysis) is a visual technique to engage local people in the rapid data collection and analysis of social stratification in a community (regardless of language and literacy barriers).
Access to Resources is a tool to collect information and raise awareness of how access to resources varies according to gender, age, marital status, parentage, and so on.
Analysis of Tasks clarifies the distribution of domestic and community activities by gender and the degree of role flexibility associated with each task.

PARTICIPATORY METHODS (CONTINUED)

Mapping is useful for collecting baseline data on a number of indicators as part of a beneficiary assessment or rapid appraisal, and can lay the foundation for community ownership of development planning by including different groups.
Needs Assessment draws out information about people's needs and requirements in their daily lives.
Pocket Charts are investigative tools which use pictures as stimuli to encourage people to assess and analyze a given situation.
Tree Diagrams are multi-purpose, visual tools for narrowing and prioritizing problems, objectives or decisions. Information is organized into a tree-like diagram.

WORKSHOP-BASED METHODS

Objectives-Oriented Project Planning is a method that encourages participatory planning and analysis throughout the project life cycle. A series of stakeholder workshops is held to set priorities and integrate them into planning, implementation and monitoring. Building commitment and capacity is an integral part of this process.
TeamUP was developed to expand the benefits of objectives-oriented project planning and to make it more accessible for institution-wide use. PC/TeamUP is a software package which automates the basic step-by-step methodology and guides stakeholders through research, project design, planning, implementation, and evaluation.

The research we do at the local level, collaboratively, is what makes formal, outside research work. Outside research cannot be installed like a car part; it has to be fitted, adjusted, and refined for the school contexts we work in.
Mike Schmoker

THANK YOU AND MABUHAY!!!

diosdado.sanantonio@deped.gov.ph
