
Identification and Estimation of Causal Effects in Economics and other Social Sciences
Syllabus
Stanislao Maldonado
Department of Agricultural and Resource Economics
University of California at Berkeley

I. Overview

This course discusses conceptual and technical issues related to the identification and
estimation of causal effects in economics and other social sciences. The course is designed to
introduce modern econometric techniques for applied researchers and includes an important
empirical portion devoted to the implementation of the estimators discussed in class using real
data. In addition, empirical papers will be discussed in order to get a better understanding of
the econometric techniques introduced in class. Consequently, the emphasis is more on research
design and applications than on theoretical proofs, although some of the latter will be discussed.
Familiarity with basic econometrics is assumed. The readings for the course are available on the
following webpage:

http://are.berkeley.edu/~stanislao/

II. Program

The course is organized in four sections as follows:

1. Introduction: Structural Modeling versus Potential Outcomes Framework.
2. Experimental Designs: Randomized Control Trials (RCT).
3. Non-experimental Designs.
3.1. Selection on Observables Designs.
3.2. Selection on Non-observables Designs.
4. Other issues.

III. Textbooks and General Readings

There is no single text for the course, but parts of the following texts will be used
extensively. The reading list is organized using the abbreviations that follow each text:

Introductory level:
Stock, James and Mark Watson (2007). Introduction to Econometrics. Pearson/Addison
Wesley. (SW)

Intermediate/Advanced:
Cameron, A. Colin and Pravin Trivedi (2005). Microeconometrics. Cambridge University
Press. (CT)

 

Wooldridge, Jeffrey (2001). Econometric Analysis of Cross Section and Panel Data. MIT Press.
(JW)
Lee, Myoung-Jae (2005). Micro-Econometrics for Policy, Program, and Treatment Effects. Oxford
University Press. (ML)
Millimet, Daniel; Jeffrey Smith and Edward Vytlacil (2008). Modelling and Evaluating
Treatment Effects in Econometrics. Advances in Econometrics Vol. 21. Elsevier Ltd. (MSV)

Other relevant texts:


Angrist, Joshua and Jorn-Steffen Pischke (2009). Mostly Harmless Econometrics: An Empiricist’s
Companion. Princeton University Press. (AP)
Morgan, Stephen and Christopher Winship (2007). Counterfactuals and Causal Inference:
Methods and Principles for Social Research. Cambridge University Press. (MW)
Rubin, Donald (2006). Matched Sampling for Causal Effects. Cambridge University Press. (DR)
Shadish, William; Thomas Cook and Donald Campbell (2002). Experimental and Quasi-
Experimental Designs for Generalized Causal Inference. Houghton Mifflin Company. (SCC)
Pearl, Judea (2000). Causality: Models, Reasoning, and Inference. Cambridge University Press.
(JP)

Other relevant readings:


Angrist, Joshua and Alan Krueger (1999). “Empirical Strategies in Labor Economics”.
Handbook of Labor Economics, Vol. 3. Elsevier Science.
Reiss, Peter and Frank Wolak (2007). “Structural Econometric Modelling: Rationales and
Examples from Industrial Organization”. Handbook of Econometrics, Volume 6A. Elsevier
Science.
Ravallion, Martin (2008). “Evaluating Anti-poverty Programs”. Handbook of Development
Economics, Volume 4. Elsevier Science.
Heckman, James and Edward Vytlacil (2007). “Econometric Evaluation of Social Programs,
Part I: Causal Models, Structural Models and Econometric Policy Evaluation”. Handbook of
Econometrics, Vol. 6B. Elsevier Science.
Imbens, Guido and Jeffrey Wooldridge (2009). “Recent Developments in the Econometrics
of Program Evaluation”. Journal of Economic Literature, March 2009.

Texts for empirical sessions:


Baum, Christopher (2006). An Introduction to Modern Econometrics Using Stata. Stata Press.
Cameron, A. Colin and Pravin Trivedi (2009). Microeconometrics Using Stata. Stata Press.

IV. Detailed program and Readings

1. Introduction: Structural Modeling versus Potential Outcomes Framework.

Basic readings:
Holland, P. W. (1986). “Statistics and Causal Inference”, Journal of the American Statistical
Association, 81, 945-970. Read also the comments on the article.
Heckman, James (2000). “Causal Parameters and Policy Analysis in Economics: A Twentieth
Century Perspective”, Quarterly Journal of Economics, 115, 45-97.
Hoover, Kevin (2006). “Causality in Economics and Econometrics”. Paper prepared for the
New Palgrave Dictionary of Economics.
CT, Chapter 2.
MW, Chapter 2.

Supplementary readings:
Heckman, James (2005). “A Scientific Model of Causality”, Sociological Methodology, 35, 1-
150.
JP, Chapter 5.
Holmes, Thomas (2009). “Structural, Experimentalist and Descriptive Approaches to
Empirical Work in Regional Economics”. Mimeo.
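A short numerical illustration may help fix ideas before the readings. The following sketch (Python with simulated data; illustrative only and not part of the course materials, which use Stata for the empirical sessions) encodes the core of the potential-outcomes framework: each unit reveals only one of its two potential outcomes, and random assignment lets the difference in observed means recover the average treatment effect (ATE).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Potential outcomes: y0 if untreated, y1 if treated.
y0 = rng.normal(0.0, 1.0, n)
y1 = y0 + 2.0                     # constant treatment effect of 2

# Random assignment makes treatment independent of (y0, y1).
d = rng.integers(0, 2, n)
y = np.where(d == 1, y1, y0)      # only one potential outcome is ever observed

ate = (y1 - y0).mean()                            # true ATE (unobservable in practice)
diff_means = y[d == 1].mean() - y[d == 0].mean()  # experimental estimate of the ATE
```

Under selection into treatment, by contrast, the same difference in means would mix the ATE with selection bias, which is the central theme of Section 3.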

2. Experimental Designs: Randomized Control Trials (RCT).

Basic readings:
Duflo, Esther; Rachel Glennerster and Michael Kremer (2008). “Using Randomization in
Economic Development Research: A Toolkit”. Handbook of Development Economics, Vol. 4,
Elsevier Science.
Banerjee, Abhijit and Esther Duflo. “The Experimental Approach to Development
Economics”. NBER Working Paper 14467.
SW, Chapter 13.
AP, Chapter 2.

Supplementary readings:
LaLonde, Robert (1986). “Evaluating the Econometric Evaluations of Training Programs
with Experimental Data”, American Economic Review, 76, 604-620.
Heckman, James and Jeffrey Smith (1995). “Assessing the Case for Social Experiments”,
Journal of Economic Perspectives, 9, 85-110.
Bruhn, Miriam and David McKenzie (2008). "In Pursuit of Balance." World Bank Policy
Research Working Paper 4752.
SCC, Chapter 1.

Examples:
Schultz, T. Paul (2004). “School Subsidies for the Poor: Evaluating the Mexican Progresa
Poverty Program”. Journal of Development Economics. June, 199-250.
Olken, Ben (2007). “Monitoring Corruption: Evidence from a Field Experiment in
Indonesia.” Journal of Political Economy, 115, 200-249.
Chattopadhyay, Raghabendra and Esther Duflo (2004). “Women as Policy Makers: Evidence
from a Randomized Policy Experiment in India.” Econometrica, 72, 1409-1443.
Gerber, Alan S., and Donald P. Green (2000). “The Effects of Canvassing, Direct Mail, and
Telephone Contact on Voter Turnout: A Field Experiment”. American Political Science Review,
94, 653-63.
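The basic estimator behind these papers is the difference in means with a Neyman-style standard error. A minimal sketch with simulated data (Python here for brevity; the course's empirical sessions use Stata, and all names and parameter values below are mine):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
d = rng.integers(0, 2, n)                    # random assignment to treatment
y = 1.0 + 0.5 * d + rng.normal(0.0, 1.0, n)  # true treatment effect = 0.5

y_t, y_c = y[d == 1], y[d == 0]
ate_hat = y_t.mean() - y_c.mean()
# Conservative (Neyman) standard error for the difference in means.
se = np.sqrt(y_t.var(ddof=1) / y_t.size + y_c.var(ddof=1) / y_c.size)
t_stat = ate_hat / se
```

In practice the papers above add covariates, stratification, and clustering; see Bruhn and McKenzie (2008) and Section 4.1.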

3. Non-experimental Designs.

3.1. Selection on Observables Designs.

3.1.1. Causality and Regression.

Basic readings:
AP, Chapter 3.1-3.2.
MW, Chapter 5.
CT, Sections 4.1-4.4.
JW, Chapter 4.

Examples:
Krueger, Alan (1993). “How Computers Have Changed the Wage Structure: Evidence
from Microdata, 1984-1989” Quarterly Journal of Economics, 108, 33-60.
DiNardo, John and Jorn-Steffen Pischke (1997). "The Returns to Computer Use
Revisited: Have Pencils Changed the Wage Structure Too?" The Quarterly Journal of
Economics, 112, 291-303.
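Under selection on observables, regression with the relevant controls removes the bias that a raw comparison of means suffers. An illustrative simulated example (assumptions and names are mine, not drawn from the readings):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
x = rng.normal(0.0, 1.0, n)                          # observed confounder
d = (x + rng.normal(0.0, 1.0, n) > 0).astype(float)  # treatment selected on x
y = 2.0 * d + 1.5 * x + rng.normal(0.0, 1.0, n)      # true effect = 2

# OLS of y on [1, d, x]: conditioning on x removes the selection bias.
X = np.column_stack([np.ones(n), d, x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]

naive = y[d == 1].mean() - y[d == 0].mean()  # biased upward: omits x
```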

3.1.2. Matching Methods.

Basic readings:
AP, Chapter 3.1-3.2.
MW, Chapter 4.
DR, Chapters 10 and 12.
ML, Chapter 4.
Imbens, Guido (2004). “Nonparametric Estimation of Average Treatment Effects Under
Exogeneity: A Review”, Review of Economics and Statistics, 86, 4-29.

Supplementary readings:
Dehejia, Rajeev and Sadek Wahba (1999). “Causal Effects in Non-Experimental Studies:
Reevaluating the Evaluation of Training Programs”, Journal of the American Statistical
Association, 94, 1053-1062.
Smith, Jeffrey and Petra Todd (2005). “Does Matching Overcome LaLonde’s Critique of
Non-experimental Methods?”, Journal of Econometrics, 125, 305-353.

 

Arceneaux, Kevin; A. S. Gerber and D. P. Green (2006). “Comparing Experimental and
Matching Methods Using a Large-Scale Voter Mobilization Experiment”, Political
Analysis, 14, 37-62.

Examples:
Jalan, Jyotsna and Martin Ravallion (2003). “Does Piped Water Reduce Diarrhea for
Children in Rural India?”. Journal of Econometrics. January, 153-173.
Gilligan, Michael J. and Ernest J. Sergenti (2008) "Do UN Interventions Cause Peace?
Using Matching to Improve Causal Inference", Quarterly Journal of Political Science, 3, 89-
122.
Persson, Torsten and Tabellini, Guido (2007). “The Growth Effect of Democracy: Is It
Heterogenous and How Can It Be Estimated?”. NBER Working Paper 13150.
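The simplest version of these methods is one-nearest-neighbor covariate matching with replacement, which pairs each treated unit with the most similar control. A hedged sketch on simulated data (illustrative only; real applications typically match on an estimated propensity score and check overlap and balance, as the readings discuss):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4_000
x = rng.normal(0.0, 1.0, n)                        # observed covariate
d = (x + rng.normal(0.0, 1.0, n) > 0).astype(int)  # selection on x
y = 2.0 * d + 1.5 * x + rng.normal(0.0, 1.0, n)    # true effect = 2

treated = np.flatnonzero(d == 1)
controls = np.flatnonzero(d == 0)

# For each treated unit, find the control with the closest covariate value.
dist = np.abs(x[treated][:, None] - x[controls][None, :])
matches = controls[dist.argmin(axis=1)]

att_hat = (y[treated] - y[matches]).mean()  # average effect on the treated
```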

3.2. Selection on Non-observables Designs.

3.2.1. Fixed effects, “Natural Experiments” and Differences-in-Differences.

Basic readings:
AP, Chapter 5.
CT, Chapter 21.
JW, Chapter 10.
Meyer, Bruce (1995). “Natural and Quasi-Experiments in Economics”, Journal of
Business and Economic Statistics, 13, 151-161.

Supplementary readings:
ML, Sections 4.5-4.6.
Rosenzweig, Mark and Kenneth Wolpin (2000). “Natural "Natural Experiments" in
Economics”. Journal of Economic Literature, 38, 827-874.
Bertrand, Marianne; Esther Duflo and Sendhil Mullainathan (2004). “How Much Should
We Trust Differences-in-Differences Estimates?” Quarterly Journal of Economics, 119, 249-
275.

Examples:
Card, D. and A. Krueger (1994). “Minimum Wages and Employment: A Case Study of
the Fast Food Industry”, American Economic Review, 84, 772-793.
Galiani, Sebastian; Paul J. Gertler and Ernesto Schargrodsky (2005). “Water for Life:
The Impact of the Privatization of Water Services on Child Mortality”. Journal of Political
Economy, 113, 83-120.
Levitt, Steven (1994). "Using Repeat Challengers to Estimate the Effect of Campaign
Spending on Election Outcomes in the U.S. House." Journal of Political Economy, 102, 777-
98.

 

Di Tella, Rafael, and Ernesto Schargrodsky (2004). "Do Police Reduce Crime? Estimates
Using the Allocation of Police Forces after a Terrorist Attack." American Economic Review,
94, 115–133.
Di Tella, Rafael, Sebastian Galiani, and Ernesto Schargrodsky (2007). “The Formation of
Beliefs: Evidence from the Allocation of Land Titles to Squatters”. Quarterly Journal of
Economics, 122, 209–41.
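The two-by-two differences-in-differences estimator behind these designs nets out fixed group differences and common time shocks. A minimal simulated sketch (illustrative only, not from the readings):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 8_000
group = rng.integers(0, 2, n)   # 1 = treated group
post = rng.integers(0, 2, n)    # 1 = post-treatment period

# Group and time effects would bias simpler comparisons; DiD differences them out.
y = 0.5 * group + 0.3 * post + 1.0 * group * post + rng.normal(0.0, 1.0, n)

cell_mean = lambda g, t: y[(group == g) & (post == t)].mean()
did = (cell_mean(1, 1) - cell_mean(1, 0)) - (cell_mean(0, 1) - cell_mean(0, 0))
```

The same number comes from OLS on group, post, and their interaction; with serially correlated panel data the standard errors need the corrections discussed in Bertrand, Duflo and Mullainathan (2004).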

3.2.2. Instrumental Variables, Weak Instruments and Heterogeneous Treatment Effects.

Basic readings:
JW, Chapter 5.
AP, Chapter 4.
MW, Chapter 7.
Angrist, Joshua; Guido Imbens and Donald Rubin (1996). “Identification of Causal
Effects Using Instrumental Variables”, Journal of the American Statistical Association, 91,
444-455.
Heckman, James (1997). “Instrumental Variables: A Study of Implicit Behavioral
Assumptions in One Widely Used Estimator”. Journal of Human Resources, 32, 441-462.

Supplementary readings:
Imbens, Guido W. and Joshua Angrist (1994). “Identification and Estimation of Local
Average Treatment Effects”, Econometrica, 62, 467-475.
Angrist, Joshua (2004). “Treatment Effect Heterogeneity in Theory and Practice”,
Economic Journal, 114, C52-C83.
Bound, J., D. Jaeger, and R. Baker (1995). “Problems with Instrumental Variables
Estimation when the Correlation between the Instruments and the Endogenous
Explanatory Variables is Weak”, Journal of the American Statistical Association, 90, 443–450.

Examples:
Angrist, Joshua (1990). "Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence
from Social Security Administrative Records," American Economic Review, 80, 313-336.
Miguel, Edward; S. Satyanath and E. Sergenti (2004). “Economic Shocks and Civil
Conflict: An Instrumental Variables Approach”, Journal of Political Economy, 112, 725-753.
Acemoglu, Daron, Simon Johnson and James Robinson (2001). “The Colonial Origins of
Comparative Development: An Empirical Investigation”, American Economic Review, 91,
1369-1401.
Chay, Kenneth and Michael Greenstone (2005). “Does Air Quality Matter? Evidence
from the Housing Market”, Journal of Political Economy, 113, 376-424.
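With a single instrument and a single endogenous regressor, the IV estimator reduces to a ratio of covariances (the Wald form). A simulated sketch showing how IV removes the bias OLS suffers from an unobserved confounder (all names and parameter values are mine, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20_000
z = rng.integers(0, 2, n).astype(float)           # binary instrument (e.g. a lottery)
u = rng.normal(0.0, 1.0, n)                       # unobserved confounder
d = 0.5 * z + 0.5 * u + rng.normal(0.0, 1.0, n)   # endogenous regressor
y = 1.0 * d + u + rng.normal(0.0, 1.0, n)         # true effect = 1

iv_hat = np.cov(y, z)[0, 1] / np.cov(d, z)[0, 1]  # Wald / IV estimator
ols_hat = np.cov(y, d)[0, 1] / np.var(d, ddof=1)  # biased upward by u
```

When the first-stage covariance in the denominator is small, the instrument is weak and this ratio behaves badly; that is the problem studied by Bound, Jaeger and Baker (1995).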

3.2.3. Regression Discontinuity Design.

Basic readings:
CT, Section 25.6
AP, Chapter 6.
Hahn, J., P. Todd, and W. van der Klaauw (2001). “Estimation of Treatment Effects with
a Regression-Discontinuity Design”, Econometrica, 69, 201-209.
Imbens, Guido and Thomas Lemieux (2008). “Regression Discontinuity Designs: A
Guide to Practice”, Journal of Econometrics, 142, 615-635.

Supplementary readings:
Lee, David and Thomas Lemieux (2009). “Regression Discontinuity Designs in
Economics”. NBER Working Paper 14723.

Examples:
Manacorda, Marco, Edward Miguel, and Andrea Vigorito (2009). “Government
Transfers and Political Support”, unpublished working paper.
Angrist, Joshua and Victor Lavy (1999). “Using Maimonides' Rule to Estimate the Effect
of Class Size on Scholastic Achievement”, Quarterly Journal of Economics, 114, 533-575.
Lee, David; Enrico Moretti and Matthew Butler (2004). "Do Voters Affect or Elect
Policies? Evidence from the U.S. House", Quarterly Journal of Economics, 119, 807–859.
Dell, Melissa (2008). “The Persistent Effects of Peru's Mining Mita”, Mimeo.
Pettersson-Lidbom, Per and Björn Tyrefors (2008). “The Policy Consequences of Direct
versus Representative Democracy: A Regression Discontinuity Approach”, Mimeo.
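The sharp RD estimator compares local linear fits on each side of the cutoff, evaluated at the cutoff itself. A simulated sketch with an arbitrary fixed bandwidth (illustrative only; the readings treat principled bandwidth choice and specification checks):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 50_000
r = rng.uniform(-1.0, 1.0, n)                    # running variable, cutoff at 0
d = (r >= 0).astype(float)                       # sharp assignment at the cutoff
y = 0.5 * r + 2.0 * d + rng.normal(0.0, 1.0, n)  # discontinuity of 2 at r = 0

h = 0.2  # bandwidth (chosen arbitrarily here)

def boundary_fit(side):
    """Local linear fit within the bandwidth, evaluated at r = 0 (the intercept)."""
    m = side & (np.abs(r) < h)
    X = np.column_stack([np.ones(m.sum()), r[m]])
    return np.linalg.lstsq(X, y[m], rcond=None)[0][0]

rd_hat = boundary_fit(r >= 0) - boundary_fit(r < 0)  # estimated jump at the cutoff
```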

4. Other issues.

4.1. Clustering, Standard Errors and Sampling Issues.

CT, Chapter 24.
AP, Chapter 8.
Wooldridge, Jeffrey (2003), “Cluster-Sample Methods in Applied Econometrics”,
American Economic Review, 93, 133-139.
Moulton, B.R. (1990). “An Illustration of a Pitfall in Estimating the Effects of Aggregate
Variables on Micro Units”, Review of Economics and Statistics, 72, 334-38.
Deaton, Angus (1997). The Analysis of Household Surveys: A Microeconometric Approach to
Development Policy. World Bank Publications. Chapters 1 and 2.
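When a regressor varies only at the group level and errors share a group component, the naive OLS standard error badly overstates precision (the Moulton problem). A simulated sketch contrasting the cluster-robust (Liang-Zeger) variance with the naive one (illustrative only; the readings give the formal treatment):

```python
import numpy as np

rng = np.random.default_rng(7)
G, ng = 50, 40                                  # 50 clusters of 40 units each
cluster = np.repeat(np.arange(G), ng)
x = rng.normal(0.0, 1.0, G)[cluster]            # regressor varies only by cluster
e = rng.normal(0.0, 1.0, G)[cluster] + rng.normal(0.0, 1.0, G * ng)  # clustered error
y = 1.0 + 0.5 * x + e

X = np.column_stack([np.ones(G * ng), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
u = y - X @ beta
bread = np.linalg.inv(X.T @ X)

# Cluster-robust "meat": sum of outer products of within-cluster score vectors.
meat = sum(np.outer(X[cluster == g].T @ u[cluster == g],
                    X[cluster == g].T @ u[cluster == g]) for g in range(G))

se_cluster = np.sqrt((bread @ meat @ bread)[1, 1])
se_naive = np.sqrt((u @ u / (G * ng - 2)) * bread[1, 1])  # ignores clustering
```

With effectively one independent observation per cluster on x, the cluster-robust standard error is several times the naive one, which is exactly Moulton's (1990) point.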

4.2. External/Internal Validity Issues.

Basic readings:
SCC, Chapters 2 and 3.

 

Roe, Brian and David Just (2009). “Internal and External Validity in Economics
Research: Tradeoffs between Experiments, Field Experiments, Natural Experiments and
Field Data”. Forthcoming in the American Journal of Agricultural Economics.

Examples:
Miguel, Edward and Michael Kremer (2004). “Worms: Identifying Impacts on Education
and Health in the Presence of Treatment Externalities”, Econometrica, 72, 159-217.

4.3. Recent Controversies.

Keane, Michael (2005). “Structural vs. Atheoretic Approaches to Econometrics”. Mimeo.
Deaton, Angus (2009). “Instruments of Development: Randomization in the Tropics, and
the Search for the Elusive Keys to Economic Development”. NBER Working Paper
14690.
Heckman, James and Sergio Urzua (2009). “Comparing IV With Structural Models: What
Simple IV Can and Cannot Identify”. NBER Working Paper 14706.
Imbens, Guido (2009). “Better LATE Than Nothing: Some Comments on Deaton (2009)
and Heckman and Urzua (2009)”. NBER Working Paper 14896.
