
SDP DIAGNOSTIC

INTRODUCTION AND BACKGROUND

A core strategy of the Strategic Data Project is to execute diagnostic analyses using our partner agencies' data. We conduct these diagnostic analyses in two core areas: human capital (primarily focusing on teacher effectiveness) and postsecondary success and attainment for students. We focus on these two areas precisely because of the leverage they offer. Research consistently shows that among all factors controlled by school systems, a teacher's effectiveness matters the most for student learning. Similarly, a student's successful graduation from high school (and beyond) can dramatically shape the opportunities for his or her future.
WHAT IS THE DIAGNOSTIC?



THE DIAGNOSTICS: WHAT THEY ARE

1. Standardized analyses designed to help agencies better understand their current performance, uncover issues, and strategically plan responses
2. Illustrations of how existing data can be used to improve decision making
3. A starting point for district-led, deep-dive explorations of specific strategic issues
4. A way to explore how agencies perform relative to others in the covered areas

The diagnostic is a set of analyses that frames actionable questions for education leaders. For example, an education leader might ask: How successfully are we recruiting effective teachers? How long do these teachers stay, and where do they teach in our system? Similarly, how many of our students stay on track for high school graduation? Of those who stay on track and those who do not, how many transition to postsecondary education?

Our analyses were developed after a close examination of the data that is available in many districts and states, while working closely with our partners to understand their core questions. To this end, we designed both the human capital and college-going diagnostics to offer insight into an agency's current performance in these areas and to demonstrate how that performance varies (for different schools, within schools, and across time). A primary goal is to reveal opportunities for management or policy changes that may improve outcomes.

As their names suggest, our analyses are meant to be diagnostic in nature. Just as a blood test does not reveal the underlying cause of high cholesterol, our diagnostics do not try to explain the root cause of our findings.

WHAT THEY ARE NOT


1. Root-cause analyses of specific issues uncovered
2. A set of specific recommendations of actions agencies should take to improve performance
3. A comprehensive collection of all that can be done with existing data
4. A ranking of agency or departmental performance

For instance, we may find that high-poverty schools have the greatest proportion of novice teachers, but we do not try to provide an exact reason for this finding. Rather, our aim is to open a conversation about our findings, much as a patient and a doctor might discuss the results of various diagnostic tests: carefully working to understand what the results reveal about any underlying health problems and how those problems might be alleviated by changing certain choices in the patient's life (more exercise, fewer french fries, etc.).


HUMAN CAPITAL DIAGNOSTIC PATHWAY: FIVE STAGES IN A TEACHER'S CAREER

RECRUIT: Does teacher effectiveness vary by certification pathway (traditional vs. alternative)? Where do agencies recruit their most effective teachers?

PLACE: Where are highly effective teachers placed (which schools and with which students)? Does teacher effectiveness vary across type of school (grades, poverty level, geography)?

DEVELOP: Are teachers with advanced degrees more effective? Do teachers become more effective as they gain experience?

EVALUATE: How predictive is novice teacher effectiveness of future teacher effectiveness?

RETAIN/TURNOVER: Are more effective teachers retained at higher rates than less effective teachers? Are there patterns in the movement to different jobs (into/out of teaching jobs; to/from higher-needs schools, etc.)?

Human Capital

We organize each diagnostic around a framework or pathway. The human capital diagnostic includes 20 to 30 analyses, each falling under one of five stages in a teacher's career. The chart above identifies each stage in the human capital pathway and lists an example or two of the kind of questions our analyses are designed to answer.1

We measure a teacher's effect on student achievement by using a statistical model that is often referred to as value-added. When we employ the value-added model, we measure a teacher's effect on student achievement growth by isolating the part of that growth attributable to a particular teacher. We obtain this isolated value-added measure by controlling for other factors in the student's background (including prior performance, income status, demographics, and peers in the classroom) to account for other differences in performance unrelated to the teacher.2 For all of the diagnostic analyses, we examine average teacher effectiveness rates for groups of teachers; in no case do we report teacher effectiveness measures for individual teachers.
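To make this concrete, a stylized value-added specification takes the form below. This is a generic sketch of the approach, not necessarily the exact model CEPR estimates (footnote 2 links to the technical appendix); the notation is illustrative.

```latex
% A_{it}: student i's test score in year t
% A_{i,t-1}: the student's prior-year score
% X_{it}: student background controls (income status, demographics)
% \bar{X}_{c(i,t)}: average characteristics of the student's classroom peers
% \mu_{j(i,t)}: effect of teacher j assigned to student i in year t
A_{it} = \beta A_{i,t-1} + \gamma' X_{it} + \delta' \bar{X}_{c(i,t)} + \mu_{j(i,t)} + \varepsilon_{it}
```

The estimated teacher effects (the mu terms) are the value-added measures; in the diagnostic they are averaged over groups of teachers (by experience level, school type, and so on) rather than reported for individuals.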

1 The analyses we are able to complete for each agency depend on data availability and quality.
2 For a technical explanation of the model we use, please see http://www.gse.harvard.edu/~pfpie/pdf/CMS_Technical_Appendix_Teacher_Effects.pdf.


College-Going

For the college-going diagnostic, we focus on student trajectories from high school into college. In this diagnostic we repeat several analyses, using various student samples to track postsecondary outcomes for each stage in the pathway. For example, for most of the stages below, we examine high school completion and postsecondary attainment rates: by high school, by student ethnic group, by student socioeconomic status, and by prior student achievement.

For example, one might ask: What are the college enrollment rates for students who score in the lowest quartile on their eighth-grade math and English tests? What if we explore these rates by high school? (To date, we see a wide range here, indicating that some high schools significantly outperform others in ensuring that lower-performing students successfully matriculate to college.) Below we list the four stages in the college-going pathway, along with several of the factors and questions our analyses seek to answer for each stage.

COLLEGE-GOING DIAGNOSTIC PATHWAY

9TH TO 10TH GRADE TRANSITION: How do transition rates vary for different schools and different groups of students? When do students fall off track in terms of credit accumulation, and when do they get back on track?

HIGH SCHOOL GRADUATION: What are high school graduation rates? By high school? Controlling for prior student achievement? For how much of high school are students on track vs. off track?

COLLEGE ENROLLMENT: What are two- and four-year college enrollment rates? How do enrollment rates vary by seamless enrollers (students who go to college the fall after graduating from high school) vs. delayed enrollers? (A short sketch of this classification follows the pathway.)

COLLEGE PERSISTENCE: How do persistence rates vary for specific groups of students and types of colleges? By seamless enrollers vs. delayed enrollers? By two-year vs. four-year school?
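The seamless vs. delayed distinction above can be made concrete with a short rule. Below is a minimal sketch, assuming hypothetical graduation and first-enrollment dates from National Student Clearinghouse-matched records; the file name, column names, and cutoff logic are one plausible operationalization, not necessarily SDP's exact definition.

```python
import pandas as pd

# Hypothetical graduate records joined to the first NSC enrollment, if any.
# Columns (illustrative): hs_graduation_date, first_college_start_date.
df = pd.read_csv("graduates_nsc.csv",
                 parse_dates=["hs_graduation_date", "first_college_start_date"])

def classify_enroller(row):
    """Label each graduate as a seamless enroller (starts college the fall
    after high school graduation), a delayed enroller, or a non-enroller."""
    start = row["first_college_start_date"]
    grad = row["hs_graduation_date"]
    if pd.isna(start):
        return "non-enroller"
    # Treat any start in the calendar year of graduation (i.e., by that
    # fall) as seamless; the exact cutoff is a judgment call.
    if start.year == grad.year and start >= grad:
        return "seamless"
    return "delayed"

df["enroller_type"] = df.apply(classify_enroller, axis=1)
print(df["enroller_type"].value_counts())
```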


HOW ARE THE DIAGNOSTIC ANALYSES COMPLETED?

CEPR commits a three-person research team for four to five months to complete both diagnostics (human capital and college-going). Along the way, we report interim findings to internal audiences in our partner agency to ensure we are interpreting and using the data accurately. We find these conversations invaluable, both in grounding our team in the history, politics, and priorities of the agency and in helping agency leaders understand what the analyses convey and how they may be used for meaningful decision making. We also work with agency leaders to plan an appropriate public release of the findings, which we then publish on our web page.

HOW DID WE SELECT THE ANALYSES THAT COMPRISE THE DIAGNOSTIC?

In our pilot study, our CEPR-based research team worked closely with Charlotte-Mecklenburg Schools Superintendent Peter C. Gorman and members of his leadership team to: 1) sift through the available data to determine which analyses were possible, and 2) sort through the myriad possible analyses to home in on those that are most salient and most actionable. We also attempted to identify analyses that would be replicable given data quality and availability in other education agencies.

Our goal is to present new analyses, though certainly some of our work (particularly in the college-going diagnostic) is fairly well understood given No Child Left Behind high school graduation accountability requirements and the excellent work of the National Student Clearinghouse. We attempt to take this information and dive deeper, examining, for example, not just student graduation rates by high school or by race and ethnicity, but also rates controlling for prior student achievement or family income.

With each agency, we work with leadership to understand the most pressing issues and to ensure the diagnostic analyses address them whenever possible. In partner agencies where we have also placed Data Fellows (another core element of the Strategic Data Project), we often integrate these Fellows onto the diagnostic research team in order to equip them to conduct deeper, agency-specific analyses stemming from the diagnostic work. Because we strive to develop a body of work with a standard set of analyses that can successfully be used across an array of agencies, we must strike a balance in the level of customization we can provide each agency. We often find, however, that customization can be enhanced by a greater commitment from the agency, such as allocating staff research time to become part of the diagnostic work team.

WHY DO WE EXECUTE THESE ANALYSES?

The diagnostics are a core element of the Strategic Data Project for four reasons. We complete them in order to:
1. Convey actionable and salient information about teacher effectiveness and postsecondary transitions to our partners.
2. Build a body of work that relays the findings of similar analyses across a growing number of agencies, allowing for a means of comparing and identifying best practices.
3. Demonstrate to agencies what can be done with the growing trove of data they are collecting (primarily for compliance purposes, but which can nonetheless be used to deeply understand performance).
4. Develop an industry-standard set of metrics that become need-to-know information about any education agency, not unlike widely recognized measures such as the price-to-earnings ratio for a publicly traded company.

We also believe that the process of executing and delivering the diagnostics builds the analytic capacity (and appetite) of an organization. When a team sees that teachers with advanced degrees are not, on average, more effective than those without such degrees, they want to know more: Does it matter if a high school math teacher has a master's degree in math? Does it matter from which institution the teacher earns the advanced degree? These questions require deeper digging and may often lead districts to realize that such essential data elements are not even collected. Additionally, the analyses often help confirm what leaders may know intuitively. One agency recently learned that its high school that uses block scheduling is remarkably successful at getting students who fall behind on course credits back on track for on-time high school graduation.


SAMPLE ANALYSES AND THE KINDS OF MANAGEMENT AND POLICY QUESTIONS THEY PROMPT

Our stated goal is that the diagnostic findings help form the basis of management and policy changes intended to improve student results. But what does this really look like? Below we present two sample analyses, one each from the human capital diagnostic and the college-going diagnostic, along with key questions district leaders may want to explore based on the analytic results.

Human Capital: Teacher Placement

The graphs below show how teachers are placed with students across the district (left graph) and within schools (right graph). The bars indicate the prior-year average student achievement of students taught by novice teachers, teachers with one year of experience, two years of experience, and three years of experience, compared to the prior average achievement of students whose teachers have four or more years of experience. Both across the district and within schools, we see that novice teachers are disproportionately placed with lower-performing students. This pattern is consistent across several districts for which we have conducted this analysis. Many district leaders may know this to be true, at least anecdotally, across the district. But we find they are quite surprised to learn how systemic this can be within schools. Other parts of our analysis indicate that novice teachers are, on average, less effective than teachers with several years of experience, a finding widely replicated in other research.

Thus, this analysis shows that the agency is systematically placing lower-performing teachers with the students who need to learn the most, not just across schools but within schools. One could imagine several key questions agency leaders (and principals) would want to answer after learning such findings:
1. What are the current methods for placing teachers?
2. How are student rosters created?
3. What student and teacher data is used to determine teacher placement?
4. What strategies are used to place teachers in ways that are intended to increase student achievement? How could this be tracked?
5. Are principals held accountable for teacher placement in any way?
6. What data is collected (and how is it analyzed) about teacher placement?

In this case, as in many, any decision would probably require more analysis and evidence-gathering. Significant changes would clearly require groups within the agency to discuss next steps and options to consider. The goal of the analysis is to help these decision makers approach their decisions with more knowledge.
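To show what this placement computation might look like in practice, here is a minimal pandas sketch. The file name and columns (school_id, teacher_experience, prior_math_score) are illustrative assumptions, not CEPR's actual data model: districtwide gaps come from simple group means, while within-school gaps demean prior scores by school first.

```python
import pandas as pd

# Hypothetical student-teacher link file; one row per student-teacher pair.
# Columns (illustrative): school_id, teacher_experience (years),
# prior_math_score (standardized prior-year math score).
df = pd.read_csv("student_teacher_links.csv")

# Group teachers into the experience bands used in the graphs.
df["exp_band"] = pd.cut(df["teacher_experience"],
                        bins=[-1, 0, 1, 2, 3, 100],
                        labels=["Novice", "1 Year", "2 Years", "3 Years", "4+ Years"])

# Districtwide comparison: average prior achievement of students in each
# band, relative to students taught by teachers with 4+ years of experience.
district_means = df.groupby("exp_band", observed=True)["prior_math_score"].mean()
district_gap = district_means - district_means["4+ Years"]

# Within-school comparison: demean prior scores by school first, so the
# gaps reflect placement differences inside the same building.
df["prior_within"] = (df["prior_math_score"]
                      - df.groupby("school_id")["prior_math_score"].transform("mean"))
school_means = df.groupby("exp_band", observed=True)["prior_within"].mean()
within_gap = school_means - school_means["4+ Years"]

print(district_gap.drop("4+ Years"))
print(within_gap.drop("4+ Years"))
```

A full analysis would also test whether each gap differs significantly from zero, which is presumably what the asterisks in the graphs indicate.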

[Figure: Average Prior Elementary Student Performance for Teachers with 3 or Fewer Years of Experience Relative to Teachers with 4 or More Years of Experience. Two bar charts (Districtwide; Within Specific Schools); y-axis: Prior Year Math Score; bars for Novice, 1 Year, 2 Years, and 3 Years teachers. Labeled values: Districtwide panel -0.274*, -0.192*, -0.072, 0.040; Within Specific Schools panel -0.169*, -0.076*, -0.046, -0.002.]


College-Going: College Enrollment Rates by Prior Student Achievement

Many education leaders know roughly what percent of their high school graduates enroll in college; however, few know this rate by high school or with any degree of granularity about different groups of students. By matching student records with National Student Clearinghouse data, we are able to present K-12 leaders with a fine-grained analysis of the college enrollment and college persistence of their high school graduates. We created the analysis below to reveal variability that may otherwise go undetected.

For this analysis, we first divide all students in the sample into quartiles according to their eighth-grade composite math and English test scores. Then we report the college enrollment rates for each high school for the students in each quartile (a sketch of this computation follows the questions below). Each blue bar of the graph represents one high school. Notice that there is nearly a 30-point gap between the top high school and the bottom high school in the college enrollment rates for high school graduates who perform in the top two quartiles on eighth-grade test scores. The gaps are even larger for the bottom two quartiles. Remarkably, a greater share of bottom-quartile students from one high school enroll in college (70 percent) than do top-quartile students from another high school (64 percent). This introduces a number of questions:
1. What are the top-producing high schools doing to prepare students for college? What are they doing to encourage students to enroll in college?

2. Do high schools have the same success in sending students from all quartiles to college, or are some high schools more successful with certain groups of students?
3. Are there high school course enrollment patterns that are evident for students who enroll in college at high rates?
4. Are principals or guidance counselors aware of this variation?

This particular analysis might also be highly meaningful for parents and students. It shows rather clearly that prior student achievement does not always predict whether a student goes to college. It at least appears that particular high schools have significant influence on this success.
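As referenced above, here is a minimal pandas sketch of the quartile computation, assuming a hypothetical file of graduates matched to National Student Clearinghouse records; the file name and columns (high_school, grade8_composite, enrolled_college) are illustrative.

```python
import pandas as pd

# Hypothetical file of high school graduates; one row per graduate.
# Columns (illustrative): high_school, grade8_composite (math + English
# score), enrolled_college (True if matched to an NSC enrollment record).
grads = pd.read_csv("graduates_with_nsc_match.csv")

# Divide graduates into quartiles on the eighth-grade composite score.
grads["quartile"] = pd.qcut(grads["grade8_composite"], q=4,
                            labels=["Bottom", "2nd", "3rd", "Top"])

# College enrollment rate for each high school within each quartile;
# each rate corresponds to one blue bar in the chart.
by_school = (grads.groupby(["quartile", "high_school"], observed=True)
                  ["enrolled_college"].mean() * 100)

# Districtwide average for each quartile.
by_quartile = grads.groupby("quartile", observed=True)["enrolled_college"].mean() * 100

# Spread between the top and bottom high school within each quartile.
spread = by_school.groupby(level="quartile", observed=True).agg(["min", "max"])
print(by_quartile)
print(spread)
```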
CONCLUSION

The diagnostic analyses are a starting point, not an ending point. They create methods for analyzing data and are intended to help agencies better understand current performance and uncover issues. We hope this also allows leaders to strategically plan responses. The Strategic Data Project will publish summary findings for each partner's diagnostic results, working carefully with each partner on the public release of the findings. The Charlotte-Mecklenburg Schools human capital findings and the Fulton County Schools findings for both diagnostics are currently available on the website of the Center for Education Policy Research at Harvard University. SDP will also release a compilation of diagnostic findings across SDP partners in summer 2011 (please refer to our website at www.gse.harvard.edu/cepr).

[Figure: Distribution of College Enrollment Rates by Prior Student Achievement. Bar chart of the percent of high school graduates enrolling in college, by 8th Grade Composite Score Quartile (Bottom Quartile, 2nd Quartile, 3rd Quartile, Top Quartile); within each quartile, one bar per individual high school, with the average for the quartile marked. Labeled values include 32, 47, 51, 64, 70, 73, 85, 90, 91, and 92 percent.]


© 2011 by the Center for Education Policy Research at Harvard University. All rights reserved.
