An Appreciation of A.W. Phillips
A way to honor A. W. Phillips is to describe the continuing influence of one of his enduring contributions to economic dynamics, his remarkable 1959 Biometrika paper about how discrete-time observations can be used to restrict a continuous-time linear model. That paper precisely described what later came to be known as the problem of "aggregation over time," set forth a framework for studying it, and achieved useful characterizations of it. Phillips's 1959 paper partly shared the destiny of John F. Muth's 1960 and 1961 papers about rational expectations: it took years for other economists to recognize how much more could be done with their ideas. In 1960, both Phillips and Muth were far ahead of most other economists in their understanding of the technicalities of time series analysis and in their appreciation of its potential applications to economic dynamics.
@article{hansen:1995appreciation, title={An Appreciation of A. W. Phillips}, author={Hansen, Lars P and Sargent, Thomas J}, year={1995}, publisher={Citeseer} }
Robust Permanent Income and Pricing
“… I suppose there exists an extremely powerful, and, if I may so speak, malignant being, whose whole endeavours are directed toward deceiving me.” René Descartes, Meditations, II.
@article{hansen1999robust, title={Robust Permanent Income and Pricing}, author={Hansen, Lars Peter and Sargent, Thomas J and Tallarini, Thomas D and others}, journal={Review of Economic Studies}, volume={66}, number={4}, pages={873--907}, year={1999} }
Micro Data and General Equilibrium Models
An extensive literature in macroeconomics and public finance uses dynamic stochastic general equilibrium models to study consumption, savings, capital accumulation, and asset pricing and to analyze alternative policies. Except for a few special cases, the economies studied cannot be analyzed using “paper and pencil” style analysis. It is often difficult to produce general theorems that are true for all parameter values of dynamic general equilibrium models. This is a general feature of nonlinear dynamic models in economics as well as in the physical sciences. For such models, knowing which parameters govern behavior is essential for understanding their empirical content and for providing quantitative answers to policy questions. For the numerical output of a dynamic equilibrium model to be interesting, the inputs need to be justified as empirically relevant. There are two sources of information that are commonly used in rationalizing parameter values. One is the behavior of time series averages of levels or ratios of key variables. These time series averages are often matched to the steady state implications of versions of the models that abstract from uncertainty. The other input is from microeconomic evidence. In this essay we discuss the use of evidence from both sources, concentrating mostly on microeconomic evidence. See King and Rebelo (1998) and Taylor (1998) for extensive discussions of calibrating real business cycle and staggered contract models, respectively.
@article{bhh:1999, title={Micro Data and General Equilibrium Models}, author={Browning, Martin and Hansen, Lars Peter and Heckman, James J}, journal={Handbook of Macroeconomics}, volume={1}, pages={543--633}, year={1999}, publisher={Elsevier} }
Spectral Methods for Identifying Scalar Diffusions
This paper shows how to identify scalar stationary diffusions nonparametrically from discrete-time data. The local evolution of the diffusion is characterized by a drift coefficient and a diffusion coefficient, along with a specification of boundary behavior. We recover this local evolution from two objects that can be inferred directly from discrete-time data: the stationary density and a conveniently chosen eigenvalue–eigenfunction pair of the conditional expectation operator over a unit interval of time. This construction also lends itself to a spectral characterization of the over-identifying restrictions implied by a scalar diffusion model of a discrete-time Markov process.
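As a rough illustration (in our notation, not the paper's), the construction can be summarized in two formulas: write the generator of the diffusion in divergence form using the stationary density q, and solve the eigenvalue equation for the diffusion coefficient; the drift then follows from stationarity. The zero-flux condition at the left boundary l is an assumption of this sketch.

```latex
% Minimal sketch: recovering (mu, sigma^2) from the stationary density q
% and one eigenpair (phi, -delta) of the generator.  Notation is ours.
\[
  dX_t = \mu(X_t)\,dt + \sigma(X_t)\,dW_t \quad \text{on } (l,r),
  \qquad
  \mathcal{A}f = \mu f' + \tfrac{1}{2}\sigma^2 f''
               = \frac{1}{2q}\bigl(\sigma^2 q f'\bigr)'.
\]
\[
  \mathcal{A}\varphi = -\delta\varphi
  \;\Longrightarrow\;
  \sigma^2(x) = \frac{-2\delta \int_l^x \varphi(u)\,q(u)\,du}{q(x)\,\varphi'(x)},
  \qquad
  \mu(x) = \frac{\bigl(\sigma^2 q\bigr)'(x)}{2\,q(x)},
\]
% assuming zero probability flux at the left boundary l, so the boundary
% term vanishes when the eigenvalue equation is integrated from l to x.
```

Since both q and the eigenpair (φ, δ) can be estimated from discrete-time observations, these two equations pin down the local dynamics.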
@article{hansen1998spectral, title={Spectral Methods for Identifying Scalar Diffusions}, author={Hansen, Lars Peter and Scheinkman, Jos{\'e} Alexandre and Touzi, Nizar}, journal={Journal of Econometrics}, volume={86}, number={1}, pages={1--32}, year={1998}, publisher={Elsevier} }
Risk and Robustness in General Equilibrium
@article{anderson1998risk, title={Risk and Robustness in General Equilibrium}, author={Anderson, Evan W and Hansen, Lars Peter and Sargent, Thomas J}, journal={Preprint University of Chicago}, year={1998} }
Short-Term Interest Rates as Subordinated Diffusions
In this article we characterize and estimate the process for short-term interest rates using federal funds interest rate data. We presume that we are observing a discrete-time sample of a stationary scalar diffusion. We concentrate on a class of models in which the local volatility elasticity is constant and the drift has a flexible specification. To accommodate missing observations and to break the link between “economic time” and calendar time, we model the sampling scheme as an increasing process that is not directly observed. We propose and implement two new estimation methods. We find evidence for a volatility elasticity between one-and-a-half and two. When interest rates are high, local mean reversion is small, and the mechanism that induces stationarity is the increased volatility of the diffusion process.
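To make the class of models concrete, here is a minimal simulation sketch. The linear mean-reverting drift, the parameter values, and the exponentially distributed observation gaps are illustrative assumptions of ours, not the paper's flexible drift or its estimated sampling scheme.

```python
import numpy as np

# Simulate a constant-elasticity-of-volatility diffusion
#   dr_t = kappa * (theta - r_t) dt + sigma * r_t**gamma dW_t
# and sample it at random calendar times to mimic a subordinated
# (unobserved, increasing) sampling process.

rng = np.random.default_rng(0)
kappa, theta, sigma, gamma = 0.5, 0.05, 0.8, 1.5   # hypothetical parameters
dt, n_steps = 1.0 / 250, 2500                       # daily Euler grid, ~10 years

r = np.empty(n_steps + 1)
r[0] = theta
for t in range(n_steps):
    drift = kappa * (theta - r[t])
    vol = sigma * r[t] ** gamma
    r[t + 1] = max(r[t] + drift * dt + vol * np.sqrt(dt) * rng.standard_normal(), 1e-6)

# Subordinated sampling: observation gaps drawn from an exponential clock,
# breaking the link between "economic time" and calendar time.
gaps = rng.exponential(scale=1.0 / 250, size=300)   # average gap of one day
obs_times = np.cumsum(gaps)
obs_idx = np.minimum((obs_times / dt).astype(int), n_steps)
sample = r[obs_idx]
print(sample[:5])
```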
@article{conley1997short, title={Short-Term Interest Rates as Subordinated Diffusions}, author={Conley, Timothy G and Hansen, Lars Peter and Luttmer, Erzo GJ and Scheinkman, Jos{\'e} A}, journal={Review of Financial Studies}, volume={10}, number={3}, pages={525--577}, year={1997}, publisher={Society for Financial Studies} }
Bootstrapping the Long Run
We develop and apply bootstrap methods for diffusion models when fitted to the long run as characterized by the stationary distribution of the data. To obtain bootstrap refinements to statistical inference, we simulate candidate diffusion processes. We use these bootstrap methods to assess measurements of local mean reversion or pull to the center of the distribution for short-term interest rates. We also use them to evaluate the fit of the model to the empirical density.
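A stripped-down version of the idea, assuming a Vasicek candidate diffusion and a crude pull statistic of our own choosing rather than the paper's measures, looks like this: simulate long samples from the fitted candidate process and compare the empirical statistic with its bootstrap distribution.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_vasicek(kappa, theta, sigma, n, dt, r0, rng):
    """Euler simulation of dr = kappa*(theta - r)dt + sigma dW."""
    r = np.empty(n)
    r[0] = r0
    for t in range(1, n):
        r[t] = r[t-1] + kappa * (theta - r[t-1]) * dt \
               + sigma * np.sqrt(dt) * rng.standard_normal()
    return r

def pull_statistic(x):
    """Average movement toward the sample median (a crude pull measure)."""
    med = np.median(x)
    dx = np.diff(x)
    return -np.mean(dx * np.sign(x[:-1] - med))

# "Observed" data: a stand-in for an interest-rate series.
data = simulate_vasicek(0.5, 0.05, 0.02, 2000, 1/250, 0.05, rng)
stat_hat = pull_statistic(data)

# Parametric bootstrap: recompute the statistic on simulated candidate paths.
boot = np.array([pull_statistic(simulate_vasicek(0.5, 0.05, 0.02, len(data),
                                                 1/250, data[0], rng))
                 for _ in range(200)])
p_value = np.mean(np.abs(boot - boot.mean()) >= np.abs(stat_hat - boot.mean()))
print(stat_hat, p_value)
```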
@article{conley1997bootstrapping, title={Bootstrapping the Long Run}, author={Conley, Timothy G and Hansen, Lars Peter and Liu, Wen-Fang}, journal={Macroeconomic Dynamics}, volume={1}, number={2}, pages={279--311}, year={1997}, publisher={Cambridge University Press} }
Assessing Specification Errors in Stochastic Discount Factor Models
In this article we develop alternative ways to compare asset pricing models when it is understood that their implied stochastic discount factors do not price all portfolios correctly. Unlike comparisons based on χ² statistics associated with null hypotheses that models are correct, our measures of model performance do not reward variability of discount factor proxies. One of our measures is designed to exploit fully the implications of arbitrage-free pricing of derivative claims. We demonstrate empirically the usefulness of our methods by assessing some alternative stochastic discount factor models that have been proposed in the asset-pricing literature.
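In our notation (not the article's), the first measure is the least-squares distance between a discount factor proxy and the set of discount factors that price the assets correctly; when the admissible set is unconstrained it has the closed form sketched below. The second measure additionally imposes nonnegativity on the discount factor, which is what delivers arbitrage-free pricing of derivative claims.

```latex
% Sketch of the unconstrained specification-error measure for a proxy y,
% where R is a vector of asset payoffs with price vector p and the norm
% is the L2 norm \|x\| = (E[x^2])^{1/2}.
\[
  \delta \;=\; \min_{m:\;E[mR]=p}\;\|y - m\|
  \;=\; \sqrt{\,E[yR - p]'\,\bigl(E[RR']\bigr)^{-1}\,E[yR - p]\,}.
\]
% The second measure restricts the minimization to m >= 0.
```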
@article{hansen:1997assessing, title={Assessing Specification Errors in Stochastic Discount Factor Models}, author={Hansen, Lars Peter and Jagannathan, Ravi}, journal={The Journal of Finance}, volume={52}, number={2}, pages={557--590}, year={1997}, publisher={Wiley Online Library} }
The Empirical Foundations of Calibration
Interest in simulating recently developed dynamic stochastic general equilibrium models of the economy has stimulated a demand for parameter values. This demand has given rise to calibration as advocated by Finn E. Kydland and Edward C. Prescott (1982). This paper explores the implicit assumptions underlying their calibration method. The authors question whether there is a ready supply of micro estimates available to calibrate macroeconomic models and argue that measures of parameter uncertainty and specification sensitivity should be routinely reported. They propose a more symbiotic role for calibration: providing signals to microeconomists about important gaps in knowledge which, when filled, will solidify the empirical underpinnings of the models and improve the credibility of their quantitative output.
@article{hansen:1996empirical, title={The Empirical Foundations of Calibration}, author={Hansen, Lars Peter and Heckman, James J}, journal={The Journal of Economic Perspectives}, volume={10}, number={1}, pages={87--104}, year={1996}, publisher={JSTOR} }
Finite-Sample Properties of Some Alternative GMM Estimators
We investigate the small-sample properties of three alternative generalized method of moments (GMM) estimators of asset-pricing models. The estimators that we consider include ones in which the weighting matrix is iterated to convergence and ones in which the weighting matrix is changed with each choice of the parameters. Particular attention is devoted to assessing the performance of the asymptotic theory for making inferences based directly on the deterioration of GMM criterion functions.
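The contrast between the two weighting schemes can be illustrated with a toy overidentified problem. The moment conditions, data-generating process, and sample size below are illustrative assumptions of ours, not the asset-pricing designs studied in the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy GMM problem: estimate the scale theta of exponential data from the
# overidentifying moments E[x - theta] = 0 and E[x^2 - 2*theta^2] = 0.
rng = np.random.default_rng(2)
x = rng.exponential(scale=1.0, size=200)

def moments(theta):
    """n x 2 matrix of moment conditions evaluated at theta."""
    return np.column_stack([x - theta, x**2 - 2.0 * theta**2])

def long_run_cov(theta):
    g = moments(theta)
    g = g - g.mean(axis=0)
    return g.T @ g / len(g)

def gmm_objective(theta, W):
    gbar = moments(theta).mean(axis=0)
    return gbar @ W @ gbar

# (1) Iterated GMM: alternate between estimating theta and updating the
#     weighting matrix until it (approximately) converges.
W = np.eye(2)
for _ in range(20):
    theta_it = minimize_scalar(gmm_objective, args=(W,),
                               bounds=(0.01, 10), method="bounded").x
    W = np.linalg.inv(long_run_cov(theta_it))

# (2) Continuously-updated GMM: the weighting matrix changes with each
#     candidate value of theta inside the criterion itself.
def cue_objective(theta):
    gbar = moments(theta).mean(axis=0)
    return gbar @ np.linalg.inv(long_run_cov(theta)) @ gbar

theta_cue = minimize_scalar(cue_objective, bounds=(0.01, 10), method="bounded").x
print(theta_it, theta_cue)
```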
@article{hansen:1996finite, title={Finite-Sample Properties of Some Alternative GMM Estimators}, author={Hansen, Lars Peter and Heaton, John and Yaron, Amir}, journal={Journal of Business & Economic Statistics}, volume={14}, number={3}, pages={262--280}, year={1996}, publisher={Taylor & Francis Group} }