Discovering India: My Crash Course on the Indian Economy

In July my spouse, Grace Tsiang, and I visited Mumbai and Hyderabad in India.  Our trip was orchestrated in part by our friend and distinguished scholar Ravi Jagannathan, who is affiliated with one of our hosts, the Indian School of Business (ISB), as an external advisor.  My other host was the Reserve Bank of India (RBI).

Lars and Bhagwan Chowdhry share a panel discussion at ISB.

During my visit, I gave two lectures and a public interview, participated in a formal exchange with Bhagwan Chowdhry of UCLA about my research and perspectives, and had many other opportunities to share insights.  It was my first visit to this interesting and important country that faces many challenges as it nurtures growth and transitions to a market economy. To my amusement, the Indian press, like the media in other countries, presumed I would arrive as an expert on their economy.

My first formal talk was at the RBI as part of the celebration of India’s 10th Statistics Day. The initial speaker was Raghu Rajan, the outgoing governor of the Reserve Bank. Raghu is about to return to the University of Chicago, which is welcome news for the University but a loss to the RBI of a distinguished and perceptive economist.  My talk focused on decision-making, financial market behavior, and policy design in environments with a high degree of statistical complexity and limited understanding.  I proposed ways to nurture productive interaction between statistical constructs and practical decision-making, both “inside the economic models” we build and “outside the models” when providing inputs into the design of policy.  I suggested frameworks that acknowledge, in a tractable and revealing way, the uncertainties that we face, including the potential for model misspecification.  Tools from statistics are valuable inputs into such analyses.

My talk at ISB followed a similar theme but focused on gaining a better understanding of financial market behavior and included a more formal analysis. In economists’ fascination with psychology and behavior, discerning the role of statistical complexity in the underlying economic environment is too often shunted to the background.

Because it was my first time visiting India, I looked forward to discovering more about its economy. One of my superb students, Aaron Pancost, who is about to finish his Ph.D. degree, studies the aggregate impact of financing restrictions on Indian firms. While I have followed his work with great interest, my knowledge and understanding of the Indian economy were limited, despite the press’s impression. Typically, it is the advisee and not the advisor who is sent off to do the field work. For my benefit, I found the roles reversed.

Lars and Raghu Rajan at the Reserve Bank of India event.

I also benefitted from a variety of interactions.  At the RBI my talk followed that of Raghu Rajan.  He has been subject to some not very compelling criticisms, and his talk walked through the evidence most germane to the policies pursued at the bank under his leadership.  I was also provided in advance with some very nice briefings from faculty affiliated with the ISB and learned more from private conversations with them. The ISB and Adi Burjorji Godrej, a well-respected industrialist and businessman, hosted a luncheon where I gained some insights from private sector leaders.

In India, as elsewhere, there is public pressure for conventional monetary policy to solve a wide array of problems.  This occurs, I believe, because monetary policy is easier for politicians and the media to target in their commentary than, say, financial market oversight and long-term fiscal sustainability. The RBI has focused recently on stabilizing inflation with the idea that this will, in the longer term, help the Indian economy sustain growth. I am sympathetic to this perspective and found Raghu’s discussion of why inflation is not yet “a solved problem” for the Indian economy to be credible.

Many economists I talked with compared the performance of publicly-owned and privately-owned banks. The private banks are evidently more efficient and more willing and able to provide resources to new enterprises.  Market valuations reflect these differences. This led me and others to ask, “Why not privatize the publicly-owned banks?”

The answer is apparently that it is politically infeasible to do so.  As a partial alternative, the RBI has recently insisted that public sector banks produce more transparent balance sheets.  This is a productive step that will help with external assessments of bank value and productivity.   Over time, I hope that private banks are given the opportunity to play an even more dominant role in the provision of financing for enterprises.

India has also recently witnessed notable and impressive increases in foreign direct investment (FDI). The contribution of FDI to overall investment remains modest, however. Continuing to make India attractive to external investors is promising, but labor market restrictions remain a serious impediment to firms’ ability to expand their productive activities and to the nurturing of new enterprises.

Finally, I gained a new perspective on the importance of language in enhancing economic development.  India’s population has a rich cultural diversity, which extends to many languages, but secondary schools use Hindi and English in addition to the local languages.  The common fluency in English facilitates communication among regions in India, and, notably, it is a substantial advantage for India’s participation in the world economy compared to other Asian nations.

While of great value, this crash course on the Indian macroeconomy still does not transform me into an expert. I have more to learn about this fascinating and important economy. In this case, however, I return to Chicago with a better perspective on Aaron’s research and many new insights about India.

— Lars Peter Hansen

Photo of Lars Peter Hansen and Bhagwan Chowdhry (top) courtesy of Indian School of Business.

Photo of Lars Peter Hansen and Raghu Rajan (bottom) courtesy of the Reserve Bank of India.

Opportunity at the Intersection of Computation and Economics

Following the tradition of interdisciplinary collaboration will yield exciting research

In the 1940s and ’50s, the Cowles Commission, then at the University of Chicago, brought economic scholars together with eminent statisticians and applied mathematicians; together they pioneered exciting new lines of research in mathematically oriented economic theory and econometrics. Their intellectual probes, nurtured by cross-disciplinary interactions, had a profound impact on economic research over the following decades.

Today, as our computational power continues to expand, it opens the door to new and exciting approaches to research. Following in the tradition of the Cowles Commission, in the next few months the Becker Friedman Institute will be exploring how computation can nurture new approaches to economic research by bringing together computational experts and economists to engage in productive exchanges of ideas along two different fronts.

One area we are exploring is how computing power enhances the development of economic theory. For example, economists often use rationality hypotheses when building models. It is understood that this approach is at best an approximation of individuals’ behavior and decision-making. This has led many researchers to explore alternative notions of bounded rationality in complex economic environments in which the approximation of full rationality is harder to defend. Among other things, models with bounded rationality impose limitations on the computational effort required for full optimization.

Meanwhile, advances in information technology have led to the emergence of new markets with new forms of exchange. Computational advances offer approaches that can approximate behavioral interactions in these new types of markets. Our 2015–16 Research Fellows, Ben Brooks and Mohammad Akbarpour, have organized a conference in August on the Frontiers of Economic Theory and Computer Science that will bring together economists and computer scientists to explore promising new research directions in this exciting area of endeavor.

On a related front, data science has brought together computational and statistical expertise to study so-called “machine learning” approaches to the analysis of large-scale data accumulating from everyday transactions in every area of our lives. The institute is probing the question of how to use such approaches in conjunction with economic models that allow us to study important policy questions. Comparing alternative policy options often requires that we engage in the analysis of counterfactuals. This requires that we extrapolate what we have learned from rich data to realms where data is more sparse, using what we call structural economic models.  In this vein, the analysis of new and rich data will lead to new economic models designed to address important policy questions. A conference I am organizing with my colleagues Stéphane Bonhomme, John Lafferty, and Thibaut Lamadon will bring together econometricians and statisticians to probe new opportunities for advancement in this exciting synergistic area of research.

While two conferences alone cannot hope to match the impact of almost two decades of influential work that emerged from the Cowles Commission, they will help to encourage some exciting directions for synergistic research and innovation at the intersection of computation, statistics, and economic analysis.

—Lars Peter Hansen

MFM Launches First Summer Session for Young Scholars

Research Reflection

Many years ago I participated in a project supported by the National Research Council that was designed to nurture linkages between mathematics and the sciences. Through this experience I was introduced to an exciting cross-disciplinary teaching opportunity called the Geophysical Fluid Dynamics Program at Woods Hole.

This summer program, which was founded in 1959, brought leading faculty from diverse backgrounds together with promising young scholars who shared a common interest in nonlinear dynamics. The goal was to introduce them to a then relatively new topic in mathematical physics, geophysical fluid dynamics.

In 2000, the National Research Council published a book on the program that suggested it “could serve as a model for introducing senior scientists, graduate students, and postdoctoral fellows to cross-disciplinary research and for sustaining their interest in and commitment to such research.” The possibilities for replicating this approach stayed with me. I remained curious to see if a similar program could be productive in economics.

The culmination of that ambition came in June 2016, as Andrew Lo of MIT and I planned and led a new offering: the MFM Summer Session for Young Scholars. The sessions immersed students in an exploration of the linkages between macroeconomics and finance, with the goal of advancing quantitative and empirical work on macro models aimed at supporting prudent policy-making. Interestingly enough, the event was held in Harwich, MA, on Cape Cod, not too far from Woods Hole.

Through this inaugural conference, almost 50 young scholars from around the world gained the opportunity to engage in many informal discussions with senior experts from academia, the private sector, and research groups from governmental agencies, much in the way envisioned in the 2000 NRC report. They also listened to panel discussions on key topics and made important new contacts with other young scholars with closely related research interests.

The panel discussions highlighted various challenges and opportunities faced by the experts; the lectures focused on evidence and methods, new data sources, research challenges in the private and governmental sectors, and methods for addressing complexity and big data.

Among many lectures, Deborah Lucas reminded the young scholars that the government itself is a “systemically important financial institution” that warrants scrutiny. Antoinette Schoar reported that we still have much to learn about the systemic impact of housing finance, and Christopher Sims argued that there remain open empirical questions about the linkages between credit cycles and business cycles. Simon Gilchrist looked at inflation dynamics and firm behavior during the financial crisis and discussed the implications for monetary policy.

Students also learned about modeling advances featuring nonlinear transition mechanisms induced by financing restrictions from Yuliy Sannikov, Jarda Borovička, and myself.

In contrast to some of the other summer camps in economics, many of the young scholars were sufficiently advanced to present their own research for informal scrutiny and feedback.  We had 17 posters in multiple sessions and 9 student presentations that contributed notably to the conversation and exchange.

This first conference was a great start for a program that I sincerely hope will nurture important research long into the future.

–Lars Peter Hansen

AFA 2013 Laureate Panel Discussion: Q & A Revisited

On January 4, 2014, the American Finance Association (AFA) hosted a panel discussion with the 2013 Nobel laureates. AFA President Luigi Zingales, the panel moderator, circulated these questions in advance. My answers below elaborate a bit on what I said at the panel.

Question: While the quality of the contributions is beyond question, this particular combination of awardees has left many puzzled. Could you explain not only why you got the prize but why you got it together with these two other researchers?

I prefer not to defend why I received the prize, since I was not on the award committee. Nevertheless, let me not pass up this opportunity to say something about my work.

My initial background was in econometrics and macroeconomics. Some of my important contributions have been to econometrics more generally. My entrance into empirical asset pricing thus came from an outsider’s perspective, which I believe proved to be advantageous. I was lucky enough to have some important collaborations stemming from my time at Carnegie-Mellon University with Bob Hodrick, Scott Richard, Ravi Jagannathan, Marty Eichenbaum, and especially Ken Singleton. These coauthors helped me explore asset pricing through the lens of the econometric methods that I was simultaneously thinking about. I felt that there was an important role for formal statistics in assessing asset pricing models. I still have the conviction that the discussion of empirical evidence should be pushed beyond impressionistic approaches to financial time series.

Why did we get the prize together?

I have heard somewhat flippant comments about the award that draw comparisons to the 1974 Nobel Prize to Hayek and Myrdal. The Hayek-Myrdal award looks very much like the outcome of a balancing act by the award committee. That particular pairing looks peculiar to me; I see little common ground in their contributions. More to the point, I see no Hansen counterpart in that award. Needless to say, I have little sympathy with the comparison to the Hayek-Myrdal award.

I also had some conversations with people who have thought more seriously about this award; they suggested that there is indeed important common ground among my fellow laureates Gene Fama and Robert Shiller and me. I agree with this. The three of us were engaged in assessing empirical puzzles in asset pricing. Puzzles are only well defined in terms of models. Of course models by their very nature are wrong, but the question is whether they are wrong in important ways. We all took an empirical outlook on this question, challenging existing models with evidence from financial market data with the aim of building better ones.

Along this line, I am currently intrigued by the following modeling challenges motivated by empirical evidence. Risk prices as encoded in stochastic discount factors show substantial fluctuations. They appear to be larger in magnitude in bad macroeconomic times than in good ones. Are these risk prices or, more generally, uncertainty prices? To what extent can we distinguish the role of risk from investor beliefs and from investor struggles in assessing the future?
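
To fix ideas for readers less steeped in this literature, here is a minimal sketch of the standard objects involved, written in generic textbook notation rather than the formulation of any particular model. A stochastic discount factor $m_{t+1}$ prices any gross return $R_{t+1}$, and with a riskless return $R^f_{t+1} = 1/E_t[m_{t+1}]$ one obtains

\[
E_t\!\left[m_{t+1} R_{t+1}\right] = 1,
\qquad
E_t\!\left[R_{t+1}\right] - R^f_{t+1} = -R^f_{t+1}\,\operatorname{cov}_t\!\left(m_{t+1}, R_{t+1}\right),
\]

\[
\frac{\bigl|E_t\!\left[R_{t+1}\right] - R^f_{t+1}\bigr|}{\sigma_t\!\left(R_{t+1}\right)}
\;\le\;
\frac{\sigma_t\!\left(m_{t+1}\right)}{E_t\!\left[m_{t+1}\right]}.
\]

The right-hand side of the second display is one common gauge of the price of risk; the fluctuations I refer to are movements over time in this quantity, which the evidence suggests are larger in bad macroeconomic times.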

Question: Even if the Nobel prize was awarded for your fundamental contribution to empirical asset pricing, your prize has stirred a renewed interest in the issue of market efficiency. How do you think about market efficiency, and what do you think are its practical implications (if any)?

I recall a New York Times op-ed published shortly after the prize announcement suggesting that I am “well known for having rejected one form of the efficient markets model in a famous paper with Kenneth Singleton.”1 There is a lot loaded in the qualifier “one form.” Empirical tests of this nature are tests of composite hypotheses, and our paper featured a simple specification of investor preferences that was commonly used at that time in dynamic economic models. We also imposed a financial market structure in which there were no frictions. I see the conclusion of our work differently from what was suggested in the New York Times. Our empirical results and other related work served as a catalyst for a variety of interesting and revealing explorations of the role of investor preferences and market structures better suited to confront empirical challenges. This subsequent literature in macroeconomics and finance has enriched both fields and their interplay.
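
To be concrete about what “one form” entails, here is a stylized version of the kind of restriction tested in that literature, written with textbook power utility as an illustration rather than the exact specification of any particular paper:

\[
E\!\left[\beta \left(\frac{C_{t+1}}{C_t}\right)^{-\gamma} R_{j,t+1} \,\middle|\, \mathcal{I}_t \right] = 1,
\qquad j = 1, \dots, n,
\]

where $\beta$ is a subjective discount factor, $\gamma$ a curvature (risk aversion) parameter, $C_t$ consumption, $R_{j,t+1}$ a gross asset return, and $\mathcal{I}_t$ the investors’ information set. A statistical rejection of this restriction is a joint rejection of the preference specification, the frictionless market structure, and the conditioning information assumed available to investors; that joint character is what makes the hypothesis composite.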

Market efficiency had its theoretical origins in the work of Samuelson, motivated in part by earlier work of Bachelier. Gene’s fundamental contribution was to turn this into empirically testable hypotheses, certainly in ways that were much more direct than research like Singleton’s and mine. An important point to remember is that it takes a model to beat a model. Thus we make little progress by simply rejecting models. All models by their very nature are abstractions and hence have shortcomings. Any discussion of rejecting the efficient markets model should be coupled with a formal alternative model.

On a different point, I find the term “market efficiency” to be a bit peculiar. As economists we care about allocative efficiency, which may indeed be supported by markets. When we discuss financial market imperfections, presumably what we care about are the consequences for resource allocation. Are these consequences substantial or trivial? In assessing the role of financial markets, it seems critical that we broaden the discussion to incorporate the allocative consequences. This is of course much more challenging, but it also elevates substantially the overall discussion.

Question: In his Nobel lecture, Gene Fama has called “bubble” a nefarious term. Regardless of whether you believe bubbles exist or not, there is the fundamental problem of how to detect them. George Soros once claimed he spotted five of the last three bubbles. In your view, what is the best way to identify a bubble in real time? What is the best way to identify it ex post?

There are interesting episodes in financial time series with sustained increases in value followed by steep declines. We often use ‘bubbles’ to describe such episodes. But our approach to the study of bubbles has to go beyond naming unexplained residuals from our models or relying on impressionistic interpretations of time series. I am reminded of Lord Kelvin’s dictum, stated here in abbreviated form:2

Lord Kelvin’s dictum: …when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind: it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science, whatever the matter may be.

There are a variety of theoretical models of bubbles. It is important to work with formal models for two reasons: models are needed to support measurement, and models can serve as guides to prudent policy. Some models suggest that bubbles might enhance social welfare. For instance, the collateral value of an asset satisfies some notions of a bubble, as it adds value to the asset in excess of a measure of the fundamental value. The use of collateral in market transactions can sometimes enhance market opportunities and improve welfare. While there are interesting models of bubbles, in my view none of them is sufficiently grounded in empirical evidence and well enough designed to support reliable measurement and detection ex ante. Without such measurement, discussions of bubbles are of very limited value in the design and conduct of policy. Moreover, it is potentially dangerous to design government policy based on an overstatement of our knowledge and understanding.3
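
One stylized way to express the collateral example, meant only as an illustration and not as the formulation of any specific paper, is to decompose an asset’s price into a discounted dividend component and a pledgeability component:

\[
p_t = E_t\!\left[\sum_{j=1}^{\infty} m_{t,t+j}\, d_{t+j}\right] + b_t,
\qquad b_t \ge 0,
\]

where $m_{t,t+j}$ is the $j$-period stochastic discount factor, $d_{t+j}$ the dividend, and $b_t$ the additional value the asset commands because it can be pledged as collateral. The wedge $b_t$ sits above one measure of fundamental value and so satisfies some definitions of a bubble, yet, as noted above, its presence can expand market opportunities and improve welfare.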

1 See “Sharing Nobel Honor, and Agreeing to Disagree” by Robert Shiller, published in the New York Times on October 26, 2013.
2 For an interesting discussion, see “The Kelvin Dictum and Social Science,” an article by Merton, Sills, and Stigler published in the Journal of the History of the Behavioral Sciences, Volume 20, 1984.
3 For an excellent discussion of the theoretical literature on bubbles and their ramifications, see José Scheinkman’s monograph “Speculation, Trading, and Bubbles (The Third Arrow Lecture),” to be published by Columbia University Press.