Volume 15, Number 4, Fall 2001
Articles
Symposium on Econometric Tools by Alan B. Krueger
Nonparametric Density and Regression Estimation by John DiNardo and Justin L. Tobias
We provide a nontechnical review of recent nonparametric methods for estimating density and regression functions. The methods we describe make it possible for a researcher to estimate a regression function or density without having to specify in advance a particular, and hence potentially misspecified, functional form. We compare these methods to more popular parametric alternatives (such as OLS), illustrate their use in several applications, and demonstrate their flexibility with actual data and generated-data experiments. We show that these methods are intuitive and easily implemented, and in the appropriate context may provide an attractive alternative to "simpler" parametric methods.
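As a rough illustration of the approach (not the authors' code), the sketch below hand-rolls a Gaussian kernel density estimate and a Nadaraya-Watson kernel regression on simulated data; the bandwidth h = 0.3 and the sine-shaped "true" regression function are illustrative assumptions.

```python
# A minimal sketch of kernel density estimation and kernel regression.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = np.sin(2 * x) + rng.normal(scale=0.3, size=500)  # unknown "true" regression

def kde(grid, data, h):
    """Gaussian kernel density estimate evaluated on `grid`."""
    u = (grid[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def nadaraya_watson(grid, x, y, h):
    """Kernel-weighted local average: estimates E[y|x] with no functional form."""
    w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

grid = np.linspace(-3, 3, 61)
f_hat = kde(grid, x, h=0.3)                  # density estimate
m_hat = nadaraya_watson(grid, x, y, h=0.3)   # regression estimate
```

In practice the bandwidth would be chosen by a data-driven rule rather than fixed at 0.3.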
Semiparametric Censored Regression Models by Kenneth Y. Chay and James L. Powell
When data are censored, ordinary least squares regression can provide biased coefficient estimates. Maximum likelihood approaches to this problem are valid only if the error distribution is correctly specified, which can be problematic in practice. We review several semiparametric estimators for the censored regression model that do not require parameterization of the error distribution. These estimators are used to examine changes in black-white earnings inequality during the 1960s based on censored tax records. The results show that there was significant earnings convergence between black and white men in the American South after the passage of the 1964 Civil Rights Act.
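For a concrete sense of one such estimator, here is a minimal sketch of a Powell-style censored least absolute deviations (CLAD) estimator computed by iterated median regression; the simulated design, censoring at zero, and stopping rule are illustrative assumptions, not taken from the paper.

```python
# A sketch of CLAD for data censored from below at zero: iterate median
# regression, trimming observations predicted to be censored.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=n)
X = sm.add_constant(x)
y_star = 1.0 + 2.0 * x + rng.standard_t(df=3, size=n)  # non-normal errors
y = np.maximum(y_star, 0.0)                            # left-censoring at 0

keep = np.ones(n, dtype=bool)
for _ in range(50):
    beta = sm.QuantReg(y[keep], X[keep]).fit(q=0.5).params
    new_keep = X @ beta > 0          # keep observations with positive index
    if np.array_equal(new_keep, keep):
        break
    keep = new_keep

print("CLAD estimates:", beta)                   # robust to the error distribution
print("naive OLS:", sm.OLS(y, X).fit().params)   # biased by the censoring
```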
Binary Response Models: Logits, Probits and Semiparametrics by Joel L. Horowitz and N.E. Savin
A binary-response model is a mean-regression model in which the dependent variable takes only the values zero and one. This paper describes and illustrates the estimation of logit and probit binary-response models. The linear probability model is also discussed. Reasons for not using this model in applied research are explained and illustrated with data. Semiparametric and nonparametric models are also described. In contrast to logit and probit models, semi- and nonparametric models avoid the restrictive and unrealistic assumption that the analyst knows the functional form of the relation between the dependent variable and the explanatory variables.
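The contrast between the linear probability model and the logit and probit alternatives can be seen in a few lines of code; the logit data-generating process below is an illustrative assumption.

```python
# A minimal sketch comparing LPM, logit, and probit on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1000
x = rng.normal(size=n)
X = sm.add_constant(x)
p = 1 / (1 + np.exp(-(0.5 + 1.5 * x)))   # true logit probabilities
y = rng.binomial(1, p)

lpm = sm.OLS(y, X).fit()                 # linear probability model
logit = sm.Logit(y, X).fit(disp=0)
probit = sm.Probit(y, X).fit(disp=0)

# One familiar objection to the LPM: it can predict "probabilities"
# outside [0, 1], which logit and probit rule out by construction.
print("LPM fitted range:", lpm.fittedvalues.min(), lpm.fittedvalues.max())
print("logit coefs:", logit.params, " probit coefs:", probit.params)
```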
Mismeasured Variables in Econometric Analysis: Problems from the Right and Problems from the Left by Jerry Hausman
In the most straightforward regression analysis with a single mismeasured regressor, least squares yields a coefficient estimate that is biased toward zero in magnitude. I begin by reviewing classical issues involving mismeasured variables. I then consider three recent developments in the econometrics of mismeasurement. The first involves difficulties in using instrumental variables. The second involves consistent estimators that have recently been developed for nonlinear regression models with mismeasured regressors. Finally, I return to mismeasured left-hand-side variables, focusing on issues in binary choice models and duration models.
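The attenuation result is easy to verify by simulation; the variances below (signal and noise both equal to one, so a reliability ratio of one half) are illustrative assumptions.

```python
# Classical measurement error in a single regressor: OLS is attenuated
# toward zero by the reliability ratio var(x) / (var(x) + var(u)).
import numpy as np

rng = np.random.default_rng(3)
n, beta = 100_000, 2.0
x = rng.normal(scale=1.0, size=n)       # true regressor, variance 1
u = rng.normal(scale=1.0, size=n)       # measurement error, variance 1
x_obs = x + u                           # mismeasured regressor
y = beta * x + rng.normal(size=n)

b_ols = (x_obs @ y) / (x_obs @ x_obs)   # slope through the origin
print(b_ols)   # ~ 1.0: beta = 2 times reliability 1 / (1 + 1) = 0.5
```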
Instrumental Variables and the Search for Identification: From Supply and Demand to Natural Experiments by Joshua D. Angrist and Alan B. Krueger
The method of instrumental variables was first used in the 1920s to estimate supply and demand elasticities and later to correct for measurement error in single-equation models. Recently, instrumental variables have been widely used to reduce bias from omitted variables in estimates of causal relationships. Intuitively, instrumental variables methods use only a portion of the variability in key variables to estimate the relationships of interest; if the instruments are valid, that portion is unrelated to the omitted variables. We discuss the mechanics of instrumental variables and the qualities that make for a good instrument, devoting particular attention to instruments derived from "natural experiments."
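A minimal simulated example of the mechanics, with one endogenous regressor and one instrument (an illustrative design, not from the paper):

```python
# Two-stage least squares by hand: project x on the instrument, then
# regress y on the fitted values.
import numpy as np

rng = np.random.default_rng(4)
n = 10_000
z = rng.normal(size=n)                  # instrument: relevant and excludable
v = rng.normal(size=n)                  # omitted confounder
x = 0.8 * z + v + rng.normal(size=n)    # endogenous regressor
y = 1.0 * x + 2.0 * v + rng.normal(size=n)

Z = np.column_stack([np.ones(n), z])
X = np.column_stack([np.ones(n), x])

x_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]       # first stage
beta_2sls = np.linalg.lstsq(x_hat, y, rcond=None)[0]   # second stage
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

print("2SLS:", beta_2sls[1])   # ~ 1.0, uses only the z-driven variation in x
print("OLS: ", beta_ols[1])    # biased upward by the omitted variable v
```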
Applications of Generalized Method of Moments Estimation by Jeffrey M. Wooldridge
I describe how the method of moments approach to estimation, including the more recent generalized method of moments (GMM) theory, can be applied to problems using cross section, time series, and panel data. Method of moments estimators can be attractive because in many circumstances they are robust to failures of auxiliary distributional assumptions that are not needed to identify key parameters. I conclude that while sophisticated GMM estimators are indispensable for complicated estimation problems, it seems unlikely that GMM will provide convincing improvements over ordinary least squares and two-stage least squares (by far the most common method of moments estimators used in econometrics) in settings faced most often by empirical researchers.
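To make the point that OLS and 2SLS are themselves method of moments estimators concrete, the sketch below solves the corresponding just-identified sample moment conditions directly; the data-generating process is an illustrative assumption.

```python
# OLS and IV as method of moments: both solve E[z(y - x'b)] = 0 in the
# sample, differing only in the choice of instrument vector z.
import numpy as np

rng = np.random.default_rng(5)
n = 5000
z = rng.normal(size=n)
x = 0.7 * z + rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
Zx = X                                  # instruments = regressors -> OLS
Zz = np.column_stack([np.ones(n), z])   # outside instruments      -> IV

def mom_solve(Z, X, y):
    """Solve the just-identified sample moment conditions Z'(y - Xb)/n = 0."""
    return np.linalg.solve(Z.T @ X, Z.T @ y)

print("OLS as MoM:", mom_solve(Zx, X, y))
print("IV  as MoM:", mom_solve(Zz, X, y))
```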
Vector Autoregressions by James H. Stock and Mark W. Watson
This paper critically reviews the use of vector autoregressions (VARs) for four tasks: data description, forecasting, structural inference, and policy analysis. The paper begins with a review of VAR analysis, highlighting the differences between reduced-form VARs, recursive VARs and structural VARs. A three-variable VAR that includes the unemployment rate, price inflation and the short-term interest rate is used to show how VAR methods are used for the four tasks. The paper concludes that VARs have proven to be powerful and reliable tools for data description and forecasting, but have been less useful for structural inference and policy analysis.
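A minimal sketch of the reduced-form workflow with statsmodels, using simulated series standing in for the paper's unemployment, inflation, and interest-rate data (the VAR(1) coefficient matrix below is an illustrative assumption):

```python
# Fit a reduced-form VAR, choose the lag length by AIC, and forecast.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(6)
T = 300
A = np.array([[0.5, 0.1, 0.0],
              [0.0, 0.6, 0.2],
              [0.1, 0.3, 0.4]])        # illustrative, stable VAR(1) dynamics
y = np.zeros((T, 3))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.5, size=3)

data = pd.DataFrame(y, columns=["unemp", "infl", "ffr"])
res = VAR(data).fit(maxlags=4, ic="aic")   # lag length chosen by AIC

print(res.summary())
print(res.forecast(data.values[-res.k_ar:], steps=8))   # 8-step forecast
# res.irf(12) gives impulse responses; how the shocks are identified
# (recursive vs. structural) determines how they should be interpreted.
```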
The New Econometrics of Structural Change: Dating Breaks in U.S. Labor Productivity by Bruce E. Hansen
We have seen the emergence of three major innovations in the econometrics of structural change in the past fifteen years: (1) Tests for a structural break of unknown timing; (2) Estimation of the timing of a structural break; and (3) Tests to distinguish unit roots from broken time trends. These three innovations have dramatically altered the face of applied time series econometrics. In this paper, we review these three innovations, and illustrate their application through an empirical assessment of U.S. labor productivity in the manufacturing/durables sector.
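Innovation (2) can be illustrated in a few lines: the least-squares break-date estimator picks the candidate date that minimizes the residual sum of squares. The simulated mean shift and the 15 percent trimming below are illustrative assumptions.

```python
# Estimate the timing of a break in the mean of a series by least squares.
import numpy as np

rng = np.random.default_rng(7)
T, true_break = 200, 120
y = np.r_[rng.normal(1.0, 1.0, true_break),       # mean 1 before the break
          rng.normal(2.0, 1.0, T - true_break)]   # mean 2 after the break

def ssr_at(k):
    """Residual sum of squares with separate means before and after date k."""
    return ((y[:k] - y[:k].mean())**2).sum() + ((y[k:] - y[k:].mean())**2).sum()

lo, hi = int(0.15 * T), int(0.85 * T)             # trim the sample ends
k_hat = min(range(lo, hi), key=ssr_at)
print("estimated break date:", k_hat)             # close to 120
```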
The Bootstrap and Multiple Imputations: Harnessing Increased Computing Power for Improved Statistical Tests by David Brownstone and Robert Valletta
The bootstrap and multiple imputations are two techniques that can enhance the accuracy of estimated confidence bands and critical values. Although they are computationally intensive, relying on repeated sampling from empirical data sets and associated estimates, modern computing power enables their application in a wide and growing number of econometric settings. We provide an intuitive overview of how to apply these techniques, referring to existing theoretical literature and various applied examples to illustrate both their possibilities and their pitfalls.
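A minimal sketch of a pairs bootstrap for a regression slope's confidence interval; the simulated data and the 95 percent percentile interval are illustrative choices.

```python
# Bootstrap a regression slope by resampling (x, y) pairs with replacement.
import numpy as np

rng = np.random.default_rng(8)
n = 200
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + rng.standard_t(df=4, size=n)   # heavy-tailed errors
X = np.column_stack([np.ones(n), x])

def slope(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

boot = np.empty(2000)
for b in range(2000):
    idx = rng.integers(0, n, size=n)   # resample observations with replacement
    boot[b] = slope(X[idx], y[idx])

print("point estimate:", slope(X, y))
print("95% bootstrap CI:", np.percentile(boot, [2.5, 97.5]))
```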
Quantile Regression by Roger Koenker and Kevin F. Hallock
Quantile regression, as introduced by Koenker and Bassett (1978), may be viewed as an extension of classical least squares estimation of conditional mean models to the estimation of an ensemble of models for several conditional quantile functions. The central special case is the median regression estimator which minimizes a sum of absolute errors. Other conditional quantile functions are estimated by minimizing an asymmetrically weighted sum of absolute errors. Quantile regression methods are illustrated with applications to models for CEO pay, food expenditure, and infant birthweight.
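The asymmetrically weighted absolute-error ("check") loss and a quantile regression fit can be sketched as follows; the simulated food-expenditure data are an illustrative assumption, not the paper's.

```python
# Quantile regression: minimize the check loss rho_tau over residuals.
import numpy as np
import statsmodels.api as sm

def check_loss(u, tau):
    """rho_tau(u): weights positive residuals by tau, negative by (1 - tau)."""
    return u * (tau - (u < 0))

rng = np.random.default_rng(9)
n = 1000
income = rng.uniform(1, 10, size=n)
food = 0.5 * income + income * rng.normal(scale=0.1, size=n)  # heteroskedastic
X = sm.add_constant(income)

for tau in (0.1, 0.5, 0.9):
    fit = sm.QuantReg(food, X).fit(q=tau)
    loss = check_loss(food - X @ fit.params, tau).sum()  # objective minimized
    print(f"tau={tau}: slope={fit.params[1]:.3f}, loss={loss:.1f}")
# With heteroskedastic errors the slopes fan out across quantiles,
# which a single conditional mean regression cannot reveal.
```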
GARCH 101: The Use of ARCH/GARCH Models in Applied Econometrics by Robert Engle
ARCH and GARCH models have become important tools in the analysis of time series data, particularly in financial applications. These models are especially useful when the goal of the study is to analyze and forecast volatility. This paper gives the motivation behind the simplest GARCH model and illustrates its usefulness in examining portfolio risk. Extensions are briefly discussed.
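A minimal sketch of the GARCH(1,1) recursion and its Gaussian quasi-likelihood, estimated with scipy on simulated returns (an illustrative stand-in for portfolio data):

```python
# Simulate GARCH(1,1) returns, then recover the parameters by maximizing
# the Gaussian quasi-likelihood (written up to an additive constant).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(10)
T, omega, alpha, beta = 2000, 0.1, 0.1, 0.8
r = np.empty(T)
h = omega / (1 - alpha - beta)          # unconditional variance
for t in range(T):
    r[t] = np.sqrt(h) * rng.normal()
    h = omega + alpha * r[t]**2 + beta * h

def neg_loglik(params, r):
    w, a, b = params
    h = np.empty_like(r)
    h[0] = r.var()                      # initialize at the sample variance
    for t in range(1, len(r)):
        h[t] = w + a * r[t - 1]**2 + b * h[t - 1]
    return 0.5 * np.sum(np.log(h) + r**2 / h)

res = minimize(neg_loglik, x0=[0.05, 0.05, 0.9], args=(r,),
               bounds=[(1e-6, None), (0, 1), (0, 1)], method="L-BFGS-B")
print("omega, alpha, beta:", res.x)     # close to (0.1, 0.1, 0.8)
```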
Teaching Statistics and Econometrics to Undergraduates by William E. Becker and William H. Greene
Traditionally, econometrics and economic statistics have been taught in the theory-and-proof, chalk-and-talk mode commonly found in the teaching of mathematics. We advance the use of computer technology in the teaching of quantitative methods to get students actively engaged in the learning process. We also assert that the essential tasks for those who teach these courses are to identify important issues that lend themselves to quantitative analyses and then to help students develop an understanding of the appropriate key concepts for those analyses.
Free Labor for Costly Journals? by Theodore C. Bergstrom
Retrospectives: Cost-Benefit Analysis and the Classical Creed by Joseph Persky
Features
Recommendations for Further Reading
Comments by Russell S. Sobel, Alberto Alesina, Kurt W. Rothschild, J.E. King, and Mark Blaug
Notes