9:00 am. Fatih Guvenen (University of Minnesota and NBER), Gueorgui Kambourov (University of Toronto), Burhanettin Kuruscu (University of Toronto), Sergio Ocampo-Diaz (University of Minnesota), and Daphne Chen (Florida State University), "Use It Or Lose It: Efficiency Gains from Wealth Taxation." Discussant: Roger H. Gordon (University of California at San Diego and NBER)

10:15 am. Matteo Maggiori (Harvard University and NBER), Brent Neiman (University of Chicago and NBER), and Jesse Schreger (Columbia University and NBER), "International Currencies and Capital Allocation." Discussant: Harald Uhlig (University of Chicago and NBER)

1:30 pm. Marcus Hagedorn (University of Oslo), Iourii Manovskii (University of Pennsylvania and NBER), and Kurt Mitman (Institute for International Economic Studies), "The Fiscal Multiplier." Discussant: Adrien Auclert (Stanford University and NBER)

Wealth inequality in the U.S. rose steeply between 2007 and 2010, largely as a result of the sharp decline in house prices during that period, Edward N. Wolff reports in Household Wealth Trends in the United States, 1962 to 2016: Has Middle Class Wealth Recovered? (NBER Working Paper No. 24085). Households with a greater concentration of wealth in their homes — including younger households, African Americans, and Hispanics — fared worse than other groups. The decline in home prices had a far greater percentage impact on the net worth of the middle class than the stock market plunge had on the net worth of the top 1 percent.

The study draws on data from the Survey of Consumer Finances (SCF), which was conducted eleven times between 1983 and 2016. It defines wealth as net worth: the current value of all marketable assets minus any outstanding debts. This measure excludes the future value of Social Security benefits and defined benefit pension payments. Median net worth declined from $118,600 in 2007 to $66,500 in 2010, a drop of 44 percent. Mean net worth, which is more sensitive to the holdings of high net worth households, declined from $620,500 to $521,000, a drop of 16 percent. By 2016, median net worth had rebounded to $78,100, still well below its 2007 level, while mean net worth had reached $667,600, surpassing its 2007 value.
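As a quick check on this arithmetic, the percentage declines implied by the SCF figures above can be computed directly. This is only an illustrative sketch: the dollar figures are those quoted from the paper, and the helper function is mine.

```python
# Illustrative check of the SCF net worth changes quoted above.
def pct_change(old, new):
    """Percentage change from old to new."""
    return 100.0 * (new - old) / old

median_2007, median_2010, median_2016 = 118_600, 66_500, 78_100
mean_2007, mean_2010, mean_2016 = 620_500, 521_000, 667_600

print(round(pct_change(median_2007, median_2010)))  # -44: median fell ~44%
print(round(pct_change(mean_2007, mean_2010)))      # -16: mean fell ~16%
print(mean_2016 > mean_2007)      # True: mean surpassed its 2007 value
print(median_2016 > median_2007)  # False: median remained below 2007
```

The comparison makes Wolff's point visible in two lines: the median (the middle class) fell nearly three times as hard as the mean (which is pulled up by wealthy households), and only the mean had fully recovered by 2016.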

The rich tend to hold a more diverse range of investments than the middle class, making them less vulnerable to declines in particular asset categories. The middle class tends to be heavily leveraged, with homes as its primary asset. As a result, middle-class households were disproportionately affected by the housing crash: because of this leverage, median wealth fell proportionally more than house prices from 2007 to 2010.

The study also reports the average return on all investments for households in different strata of the wealth distribution. For the period 1983-2016, "the average annual return on gross assets for the top 1 percent was 0.57 percentage points greater than that of the next 19 percent and 1.44 percentage points greater than that of the middle quintiles." This return differential, which contributes to greater wealth accumulation by those in higher wealth categories, is largely due to the greater weight of owner-occupied housing in the asset holdings of the middle class and the higher weight of corporate stocks — historically a high-return asset class — in the portfolios of the wealthiest households.
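To see why a return differential of well under two percentage points matters, it helps to compound it over the 1983-2016 window. The sketch below is illustrative: the 6 percent baseline return for the middle quintiles is an assumption of mine, while the 1.44 and 0.57 point gaps are the study's figures.

```python
# Compound the study's return gaps over 1983-2016 (34 years).
YEARS = 34
middle = 0.06              # assumed baseline return for the middle quintiles
top1 = middle + 0.0144     # top 1 percent: 1.44 pp above the middle quintiles
next19 = top1 - 0.0057     # next 19 percent: 0.57 pp below the top 1 percent

def relative_growth(r_high, r_low, years=YEARS):
    """Wealth of the high-return group per dollar of the low-return group."""
    return ((1 + r_high) / (1 + r_low)) ** years

print(f"{relative_growth(top1, middle):.2f}")   # ~1.58: 58% more from returns alone
print(f"{relative_growth(top1, next19):.2f}")   # ~1.20 relative to the next 19 percent
```

Under this assumed baseline, the 1.44 point gap alone leaves the top 1 percent with roughly 58 percent more wealth per initial dollar than the middle quintiles after 34 years, which is why even modest return differentials feed wealth concentration.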

The racial divide in wealth-holding widened with the housing crisis. In 2007, the ratio of debt to net worth in African-American households averaged 0.553, as opposed to 0.154 for white households. The ratio of mortgage debt to home value was also greater for African-American households: 0.49 compared with 0.32. The greater leverage made the relative loss in home equity after the housing crash far greater for African-American households. Hispanic households were even harder hit, as many bought homes at high prices between 2001 and 2007 in states that saw particularly steep drops in home prices. Both African-Americans and Hispanics recovered fairly well after the Great Recession, though not quite to their 2007 levels.
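The leverage mechanism here can be made concrete. Since home equity is home value minus mortgage debt, a price decline of d (as a fraction of home value) wipes out d / (1 - LTV) of the owner's equity. The loan-to-value ratios below are those reported in the study; the 25 percent price decline is an illustrative assumption of mine.

```python
# Leverage turns a house-price decline into a larger home-equity loss:
# equity = value - mortgage, so a price drop of d (fraction of value)
# destroys d / (1 - ltv) of the owner's equity.
def equity_loss(price_drop, ltv):
    """Fraction of home equity destroyed by a given price decline."""
    return price_drop / (1 - ltv)

drop = 0.25  # assumed house-price decline
print(f"{equity_loss(drop, 0.49):.0%}")  # 49%: African-American average LTV
print(f"{equity_loss(drop, 0.32):.0%}")  # 37%: white average LTV
```

The same price decline destroys markedly more equity for the more leveraged group, which is the arithmetic behind the widening racial wealth gap after the crash.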

The study also notes a significant reduction in the relative wealth of the young versus the old during the Great Recession. "The average wealth of the youngest age group [households headed by someone under the age of 35] collapsed almost in half, from $105,500 in 2007 to $57,000 in 2010 (measured in 2016 USD), its second lowest point over the 30-year period ...while the relative wealth of age group 35-44 shrank from $357,400 to $217,600, its lowest point over the whole 1983 to 2010 period." This may be the result of younger households having bought homes at peak housing prices. The wealth of older age groups declined by less during this period.

Wednesday, November 15, 2017

Should We Reject the Natural Rate Hypothesis?, by Olivier Blanchard, PIIE: Fifty years ago, Milton Friedman articulated the natural rate hypothesis. It was composed of two sub-hypotheses: First, the natural rate of unemployment is independent of monetary policy. Second, there is no long-run tradeoff between the deviation of unemployment from the natural rate and inflation. Both propositions have been challenged. Blanchard reviews the arguments and the macro and micro evidence against each and concludes that, in each case, the evidence is suggestive but not conclusive. Policymakers should keep the natural rate hypothesis as their null hypothesis but keep an open mind and put some weight on the alternatives. [paper]

Thursday, October 12, 2017

There is a conference on Rethinking Macroeconomic Policy "coordinated by Olivier Blanchard ...and Lawrence H. Summers..." taking place today and tomorrow (wanted to go, but couldn't).

"Academic experts and policymakers will address the challenges to macroeconomic thinking and policymaking that today's economic environment presents: low inflation despite low unemployment, the apparent interactions of rising inequality and stagnating productivity, and the unresponsiveness of long-term interest rates to rising public debt, among others." [Conference program, papers, presentations, and conference webcast.]

Here are links to the first two papers presented at the conference. First, Olivier Blanchard and Lawrence Summers:

Rethinking Stabilization Policy. Back to the Future (Preliminary): Nearly ten years after the onset of the Great Financial Crisis, both researchers and policy makers are still assessing the policy implications of the crisis and its aftermath. Previous major crises, from the Great Depression to the stagflation of the 1970s, profoundly changed both macroeconomics and macroeconomic policy. The question is whether this crisis should and will have similar effects.

We believe it should, although we are less sure it will. Rather obviously, the crisis has forced macroeconomists to (re)discover the role and the complexity of the financial sector, and the danger of financial crises. But the lessons should go largely beyond this, and force us to question a number of cherished beliefs. Among other things, the events of the last ten years have put into question the presumption that economies are self-stabilizing, have raised again the issue of whether temporary shocks can have permanent effects, and have shown the importance of nonlinearities.

These call for a major reappraisal of macroeconomic thinking and macroeconomic policy. As the paper is a curtain raiser for a conference that will look in more detail at the implications for specific policies, we make no attempt at being encyclopedic and feel free to pick and choose the issues which we see as most salient. ...

Temporary price-level targeting: An alternative framework for monetary policy: Low nominal interest rates, low inflation, and slow economic growth pose challenges to central bankers. In particular, with estimates of the long-run equilibrium level of the real interest rate quite low, the next recession may occur at a time when the Fed has little room to cut short-term rates. As I have written previously and recent research has explored, problems associated with the zero lower bound (ZLB) on interest rates could be severe and enduring. While the Fed has other useful policies in its toolkit such as quantitative easing and forward guidance, I am not confident that the current monetary toolbox would prove sufficient to address a sharp downturn. I am therefore sympathetic to the view of San Francisco Fed President John Williams and others that we should be thinking now about adjusting the framework in which monetary policy is conducted, to provide more policy “space” in the future. In a paper presented at the Peterson Institute for International Economics, I propose an option for an alternative monetary framework that I call a temporary price-level target—temporary, because it would apply only at times when short-term interest rates are at or very near zero.

To explain my proposal, I’ll begin by briefly discussing two other ideas for changing the monetary framework: raising the Fed’s inflation target above the current 2 percent level, and instituting a price-level target that would operate at all times. (See my paper for more details.) ...
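The mechanics of a temporary price-level target can be sketched with a toy simulation. Everything below is an illustrative assumption of mine (the inflation paths, the 3 percent makeup rate), not numbers from Bernanke's paper; the point is only that the rule keeps rates at zero until cumulative inflation catches back up to the 2 percent trend path that began at ZLB entry.

```python
# Minimal sketch of a temporary price-level target: the target path grows
# at 2% per year from the price level at ZLB entry, and rates stay at
# zero until the actual price level catches up, which requires above-2%
# "makeup" inflation after a low-inflation episode.
TARGET_PI = 0.02

def years_at_zero(inflation_path, makeup_pi=0.03):
    """Years of zero rates: the shortfall years plus the makeup years."""
    level, target, years = 1.0, 1.0, 0
    # Phase 1: the assumed low-inflation episode opens a price-level gap.
    for pi in inflation_path:
        level *= 1 + pi
        target *= 1 + TARGET_PI
        years += 1
    # Phase 2: inflation runs at makeup_pi until the gap closes.
    while level < target:
        level *= 1 + makeup_pi
        target *= 1 + TARGET_PI
        years += 1
    return years

# Three years of zero inflation at the ZLB, then 3% makeup inflation:
print(years_at_zero([0.0, 0.0, 0.0]))  # 10: 3 shortfall years + 7 makeup years
```

The sketch shows the framework's key commitment device: the longer and deeper the inflation shortfall, the longer the promised period of zero rates, which is what provides the extra policy "space" at the ZLB.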

Today I will talk about the work of two of my graduate students and co-authors, Giovanni Nicolò and Konstantin Platonov. Both of them gave presentations at the conference. ...

Giovanni’s research is on the empirics of models with multiple equilibria and sunspots. ...

The final conference paper that I will discuss in this series, “Animal Spirits in a Monetary Economy”, is one I co-authored with Konstantin Platonov. Konstantin presented our paper at the conference, and we wrote about our work for VOX here.

I have been critical of the IS-LM model in several of my posts. My paper with Konstantin fixes some of the more salient problems of IS-LM by reintroducing two key ideas from Keynes. 1. The confidence fairy is real. 2. If confidence remains depressed, high unemployment can exist forever. Our Vox piece presents the key findings of the paper in simple language. ...

Michael Woodford was one of our two keynote speakers... Michael is one of the founders, and a long-time proponent, of New-Keynesian economics. ...

Michael addresses the question of forward guidance and specifically how central bank announcements will affect the economy when people are forward-looking but not infinitely forward looking. His goal, like Xavier’s, is to fix New Keynesian economics. ...

Today’s memo features two economists working on models of multiple equilibria from different perspectives. George Evans is a pioneer in models of adaptive learning, a topic he has worked on for more than thirty years. George presented his joint work with Seppo Honkapohja, Deputy Governor of the Bank of Finland, and Kaushik Mitra, Professor of Economics at the University of Birmingham. Patrick Pintus, a Researcher at the Banque de France, presented a paper co-authored with Yi Wen, an Assistant Vice-President at the Federal Reserve Bank of St. Louis, and Xiaochuan Xing of Yale University. ...

George Evans began his work on adaptive learning in his Ph.D. dissertation at Berkeley in the early 1980s. When the rest of the profession was swept up by the rational expectations revolution, George persevered with the important idea that perfectly correct beliefs about the future cannot be plucked from the air; they must be learned. For an introduction to George’s work, I highly recommend the book he co-authored with his long-time collaborator, Seppo Honkapohja.

The paper by Evans, Honkapohja, and Mitra (EHM) begins with a theme we met in post two, where I discussed the fact that the standard New Keynesian model, in which the central bank follows a Taylor Rule, has two steady-state equilibria. ...

Evans, Honkapohja and Mitra (EHM) build on this idea by adding a theory of adaptive learning. I am often asked how my own work on the belief function is related to George’s work on adaptive learning. They are very closely linked. ...

Previous work has shown that, in the basic New-Keynesian model, the upper steady state is stable under adaptive learning but the lower steady state is not. They modify the basic model by adding the assumption that the rate at which prices and output can fall has a lower bound. They show that this assumption implies that there exists a third steady state in which recessions can be persistent and deep. ...

George and his co-authors use their analysis to argue that a large fiscal intervention, a short-sharp shock, can knock the economy out of region C and back into region A. Readers of this blog will know that I have expressed scepticism of that idea in the past, largely because I am not a big fan of the basic NK model. However, this is the most convincing rationale in favour of a large fiscal stimulus that I have yet seen. ... If you are a young researcher who is thinking of working in macroeconomic theory and policy, consider working on models of expectations formation.

PWX (Pintus, Wen, and Xing) take up a puzzle that has long been known to plague the equilibrium real business cycle (RBC) model that has dominated macroeconomic theory for more than thirty years. That model predicts that when interest rates are high, the economy will soon enter an expansion. The reality is different: high interest rates are an omen that a recession is coming down the road. What features of the real world are missed by the classical RBC paradigm? ...

Last week I featured two US central bankers, Jim Bullard, President of the St. Louis Fed, and Kevin Lansing, a Research Advisor at the Federal Reserve Bank of San Francisco. This week’s post features two papers on the housing market, presented by two researchers at the Bank of England: Arzu Uluc and Philippe Bracke. One of these papers uses an Agent-Based Modelling approach to study the effects of macroprudential policy in the housing market. The other is a large-scale empirical study which finds significant evidence against the hypothesis that we are all perfectly rational. Homo Economicus and Femina Economica display more subtle behaviour than simple neo-classical models admit.

“Why has it taken economists so long to learn that demography influences growth?” Jeff Williamson (1998)[13]

1. Introduction. In this note we propose a model which combines the classical Solow (1956)[10] and Swan (1956)[11] model with ideas about population growth borrowed from Malthus (1798)[9]. We will refer to our model as a Malthus-Swan-Solow (MSS) model. Our model has no technical progress, no institutional change, no human capital, and no land.

We assume that the rate of growth of population depends on the real wage in a continuous way. This function is a generalization of one used by Hansen and Prescott (2002)[7].

We find that, as in the classical Solow-Swan model, there exists a steady-state value of the capital-labor ratio; see Proposition 1. However, this steady state is not necessarily unique: Proposition 2 and Example 3 show that there may be an odd number of steady-state capital-labor ratios, and only the smallest and largest of these are locally stable; see Proposition 4. This implies that there may be two very different levels of per capita income in the steady state: one small and one large. Finally, we find that an increase in total factor productivity may increase or decrease the capital-labor ratio in a stable steady state (Proposition 5) but always increases per capita income (Proposition 6).

Summing up, the consideration of endogenous population in the Solow-Swan model brings new insights with respect to the standard model regarding the number, stability and comparative static properties of steady states. ...
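A numerical sketch can illustrate the multiplicity result. The functional forms below are my own assumptions, not the paper's: Cobb-Douglas production f(k) = k^alpha, a competitive wage w = (1 - alpha) k^alpha, and a hump-shaped population growth function n(w) that peaks at an intermediate wage, as in a demographic transition. Steady-state capital-labor ratios are the roots of s f(k)/k - delta - n(w(k)).

```python
import math

ALPHA, S, DELTA = 0.3, 0.2, 0.05          # technology, saving rate, depreciation
N_MAX, W_PEAK, WIDTH = 0.08, 1.0, 0.1     # fertility hump (assumed, illustrative)

def n(w):
    """Population growth: low at low and high wages, peaking at W_PEAK."""
    return N_MAX * math.exp(-((w - W_PEAK) / WIDTH) ** 2)

def g(k):
    """Drift of the capital-labor ratio: s*f(k)/k - delta - n(w(k))."""
    w = (1 - ALPHA) * k ** ALPHA
    return S * k ** (ALPHA - 1) - DELTA - n(w)

def bisect(f, a, b, tol=1e-10):
    """Find a root of f on [a, b], assuming f changes sign there."""
    fa = f(a)
    while b - a > tol:
        m = (a + b) / 2
        if fa * f(m) <= 0:
            b = m
        else:
            a, fa = m, f(m)
    return (a + b) / 2

# Scan a fine grid for sign changes and bisect each bracketed root.
grid = [0.5 + 14.5 * i / 2000 for i in range(2001)]
roots = [bisect(g, a, b) for a, b in zip(grid, grid[1:]) if g(a) * g(b) < 0]
print(len(roots))  # 3: an odd number of steady states, more than one
```

With these parameters the drift is positive at low k, turns negative where the fertility hump bites, turns positive again once population growth falls off, and is eventually negative, giving three steady states, in line with the odd-number result of Proposition 2.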

I had been planning, for some time, to run a conference on the topic of multiple equilibria sponsored by Warwick University. Andy Haldane and Sujit Kapadia had been talking with Alan Taylor of U.C. Davis about organizing a conference on the topic of behavioural economics. After talking with Andy, Sujit and Alan, we decided it would be ideal to combine our plans into a single conference that would highlight the promise of studying the marriage of psychology with multiple equilibria in economics. The video ... explains why this is a fruitful idea.

Roger goes on to discuss how "psychology enters the picture," and why the Robert Lucas idea that "the expectations of market participants are determined by economic fundamentals ... makes little or no sense in models ... where there are multiple equilibria." Also:

In addition to the introductory video, linked above, we also recorded videos from many of the conference presenters and discussants. I will be releasing these videos in a series of posts in the coming weeks and I will discuss the research associated with the accompanying topic. You can find links to the original papers on the conference website linked here. Stay tuned.

Sunday, August 06, 2017

On the Formation of Capital and Wealth (Draft): Abstract We show that modern information technology (IT, for short) is the cause of rising income and wealth inequality since the 1970s and has contributed to slow growth of wages and to the decline in the natural rate. We first study all US firms whose securities trade on public exchanges. The surplus wealth of a firm is the difference between the wealth it created (equity and debt) and its capital. We show that (i) aggregate surplus wealth rose from -$0.59 trillion in 1974 to $24 trillion, which is 79% of total market value, in 2015, reflecting rising monopoly power. The added wealth was created mostly in sectors transformed by IT. Declining or slow-growing firms with broadly distributed ownership have been replaced by IT-based firms with highly concentrated ownership. A rising fraction of capital has been financed by debt, reaching 78% in 2015. We explain why IT innovations enable and accelerate the erection of barriers to entry and why, once erected, IT facilitates the maintenance of restraints on competition. These innovations also explain the rising size of firms. We next develop a model in which firms have monopoly power. Monopoly surplus is unobservable, and we deduce it with three methods, based on surplus wealth, the share of labor, or the share of profits. The share of monopoly surplus rose from zero in the early 1980s to 23% in 2015. This last result is, remarkably, deduced by all three methods. The share of monopoly surplus was also positive during the first, hardware, phase of the IT revolution: it was zero in 1950-1962, reached 7.3% in 1965, and fell back to zero in 1970. Standard TFP computation is shown to be biased when firms have monopoly power.

Do Older Americans Have More Income Than We Think?, July 25, 2017 Working Paper Number: SEHSD-WP2017-39: Introduction The Current Population Survey Annual Social and Economic Supplement (CPS ASEC) is the source of the nation’s official household income and poverty statistics. In 2012, the CPS ASEC showed that median household income was $33,800 for householders aged 65 and over and the poverty rate was 9.1 percent for persons aged 65 and over. When we instead use an extensive array of administrative income records linked to the same CPS ASEC sample, we find that median household income was $44,400 (30 percent higher) and the poverty rate was just 6.9 percent. We demonstrate that large differences between survey and administrative record estimates are present within most demographic subgroups and are not easily explained by survey design features or processes such as imputation. Further, we show that the discrepancy is mainly attributable to underreporting of retirement income from defined benefit pensions and retirement account withdrawals. Using archived survey and administrative record data, we extend our analysis back to 1990 and provide evidence of underreporting from an earlier period. We also document a growing divergence over time between the two measures of median income which is in turn driven by the growth in retirement income underreporting. Turning to synthetic cohort analysis, we show that in recent years, most households do not experience substantial declines in total incomes upon retirement or any increases in poverty rates. Our results have important implications for assessing the relative value of different sources of income available to older Americans, including income from the nation’s largest retirement program, Social Security. We caution, however, that our findings apply to the population aged 65 and over in 2012 and cannot easily be extrapolated to future retirees.

Wednesday, July 26, 2017

Summer 2017 Journal of Economic Perspectives Available Online I was hired back in 1986 to be the Managing Editor for a new academic economics journal, at the time unnamed, but which soon was launched as the Journal of Economic Perspectives. The JEP is published by the American Economic Association, which back in 2011 decided--to my delight--that it would be freely available on-line, from the current issue back to the first issue. Here, I'll start with Table of Contents for the just-released Summer 2017 issue, which in the Taylor household is sometimes known as issue #121. Below that are abstracts and direct links for all of the papers. I will almost certainly blog about some of the individual papers in the next week or two, as well.

Thursday, March 23, 2017

Compensation Benchmarking, Leapfrogs, and the Surge in Executive Pay, by Thomas A. DiPrete; Gregory M. Eirich; Matthew Pittinsky, American Journal of Sociology: Abstract: Scholars frequently argue whether the sharp rise in chief executive officer (CEO) pay in recent years is "efficient" or is a consequence of "rent extraction" because of the failure of corporate governance in individual firms. This article argues that governance failure must be conceptualized at the market rather than the firm level because excessive pay increases for even relatively few CEOs a year spread to other firms through the cognitively and rhetorically constructed compensation networks of "peer groups," which are used in the benchmarking process to negotiate the compensation of CEOs. Counterfactual simulation based on Standard and Poor's ExecuComp data demonstrates that the effects of CEO "leapfrogging" potentially explain a considerable fraction of the overall upward movement of executive compensation since the early 1990s. [download]

1. Study reports results which reinforce the dominant, politically correct, narrative. 2. Study is widely cited in other academic work, lionized in the popular press, and used to advance real world agendas. 3. Study fails to replicate, but no one (except a few careful and independent thinkers) notices.

#1 is spot-on for economics. Woe be to she who bucks the dominant narrative. In economics, something else happens. Following the study, there are 20 piggy-back papers which test for the same results on other data. The original authors typically get to referee these papers, so if you're a young researcher looking for a publication, look no further. You've just guaranteed yourself the rarest of gifts -- a friendly referee who will likely go to bat for you. Just make sure your results are similar to theirs. If not, you might want to shelve your project, or else try 100 other specifications until you get something that "works". One trick I learned: You can bury a robustness check which overturns the main results deep in the paper, and your referee who is emotionally invested in the benchmark result for sure won't read that far. ...

Most researchers in Economics go their entire careers without criticizing anyone else in their field, except as an anonymous referee, where they tend to let out their pent-up aggression. Journals shy away from publishing comment papers, as I found out first-hand. In fact, much if not a majority of the papers published in top economics journals are probably wrong, and yet the field soldiers on like a drunken sailor. Often, many people "in the know" realize that many big papers have fatal flaws, but have every incentive not to point this out and create enemies, or to waste their time writing up something which journals don't really want to publish (the editor doesn't want to piss a colleague off either). As a result, many of these false results end up getting taught to generations of students. Indeed, I was taught a number of these flawed papers as both an undergraduate and a grad student.

10:15 am. Daniel Garcia-Macia (International Monetary Fund), Chang-Tai Hsieh (University of Chicago and NBER), and Peter Klenow (Stanford University and NBER), "How Destructive is Innovation?" Discussant: Andrew Atkeson (University of California at Los Angeles and NBER)

4:00 pm. Michael Gelman (University of Michigan), Yuriy Gorodnichenko (University of California at Berkeley and NBER), Shachar Kariv (University of California at Berkeley), Dmitri Koustas (University of California at Berkeley), Matthew Shapiro (University of Michigan and NBER), Dan Silverman (Arizona State University and NBER), and Steven Tadelis (University of California at Berkeley and NBER), "The Response of Consumer Spending to Changes in Gasoline Prices." Discussant: Arlene Wong (Federal Reserve Bank of Minneapolis)

Monday, January 30, 2017

"How to Write an Effective Referee Report and Improve the Scientific Review Process," by Jonathan B. Berk, Campbell R. Harvey and David Hirshleifer [Full-Text Access | Supplementary Materials]: The review process for academic journals in economics has grown vastly more extensive over time. Journals demand more revisions, and papers have become bloated with numerous robustness checks and extensions. Even if the extra resulting revisions do on average lead to improved papers--a claim that is debatable--the cost is enormous. We argue that much of the time involved in these revisions is a waste of research effort. Another cause for concern is the level of disagreement amongst referees, a pattern that suggests a high level of arbitrariness in the review process. To identify and highlight what is going right and what is going wrong in the reviewing process, we wrote to a sample of former editors of the American Economic Review, the Journal of Political Economy, the Quarterly Journal of Economics, Econometrica, the Review of Economic Studies, and the Journal of Financial Economics, and asked them for their thoughts about what might improve the process. We found a rough consensus that referees for top journals in economics tend to make similar, correctable mistakes. The italicized quotations throughout this paper are drawn from our correspondence with these editors and our own experience. Their insights are consistent with our own experiences as editors at the Journal of Finance and the Review of Financial Studies. Our objective is to highlight these mistakes and provide a roadmap for how to avoid them.

Tuesday, December 20, 2016

Hysteresis and Fiscal Policy, by Philipp Engler and Juha Tervala, December 19, 2016: Abstract Empirical studies support the hysteresis hypothesis that recessions have a permanent effect on the level of output. We analyze the implications of hysteresis for fiscal policy in a DSGE model. We assume a simple learning-by-doing mechanism where demand-driven changes in employment can affect the level of productivity permanently, leading to hysteresis in output. We show that the fiscal output multiplier is much larger in the presence of hysteresis and that the welfare multiplier of fiscal policy -- the consumption equivalent change in welfare for one dollar change in public spending -- is positive (negative) in the presence (absence) of hysteresis. The main benefit of accommodative fiscal policy in the presence of hysteresis is to diminish the damage of a recession to the long-term level of productivity and, thus, output.

Monday, November 28, 2016

Immigrants and Firms' Outcomes: Evidence from France, by Cristina Mitaritonna, Gianluca Orefice, and Giovanni Peri, NBER Working Paper No. 22852, Issued in November 2016: In this paper we analyze the impact of an increase in the local supply of immigrants on firms’ outcomes, allowing for heterogeneous effects across firms according to their initial productivity. Using micro-level data on French manufacturing firms spanning the period 1995-2005, we show that a supply-driven increase in the share of foreign-born workers in a French department (a small geographic area) increased the total factor productivity of firms in that department. Immigrants were predominantly highly educated, and this effect is consistent with positive complementarity and spillover effects from their skills. We also find this effect to be significantly stronger for firms with low initial productivity and small size. The positive productivity effect of immigrants was also associated with faster growth of capital, larger exports, and higher wages for natives. Highly skilled natives were pushed towards firms that did not hire many immigrants, spreading positive productivity effects to those firms too. Because of the stronger effects on smaller and initially less productive firms, the aggregate effects of immigrants at the department level on average productivity and employment were small.

Sunday, November 20, 2016

Game Theory in Economics and Beyond, by Larry Samuelson, Journal of Economic Perspectives vol. 30, no. 4, Fall 2016 (pp. 107-30): Abstract Within economics, game theory occupied a rather isolated niche in the 1960s and 1970s. It was pursued by people who were known specifically as game theorists and who did almost nothing but game theory, while other economists had little idea what game theory was. Game theory is now a standard tool in economics. Contributions to game theory are made by economists across the spectrum of fields and interests, and economists regularly combine work in game theory with work in other areas. Students learn the basic techniques of game theory in the first-year graduate theory core. Excitement over game theory in economics has given way to an easy familiarity. This essay first examines this transition, arguing that the initial excitement surrounding game theory has dissipated not because game theory has retreated from its initial bridgehead, but because it has extended its reach throughout economics. Next, it discusses some key challenges for game theory, including the continuing problem of dealing with multiple equilibria, the need to make game theory useful in applications, and the need to better integrate noncooperative and cooperative game theory. Finally it considers the current status and future prospects of game theory.

Advocates of legalizing the purchase of goods sold in black markets argue that allowing legal trade will displace illegal buying and selling, reduce criminal activity, and permit greater control of the previously illegal goods. New research indicates that this is not always the case.

In Does Legalization Reduce Black Market Activity? Evidence from a Global Ivory Experiment and Elephant Poaching Data (NBER Working Paper No. 22314), Solomon Hsiang and Nitin Sekar show that the production of black market elephant ivory expanded by an estimated 66 percent following a one-time legal sale in 2008. Seizures of contraband ivory leaving African countries also increased, from 4.8 to 8.4 seizures per country per year. The weight of ivory in the seizures increased by an average of 335 kilograms per year.

In 1989, the Convention on the International Trade of Endangered Species (CITES) banned international trade in ivory in order to protect the wild African elephant. Individual countries continued to regulate their domestic ivory trade. Poaching slowed, and elephant populations began to recover. African governments kept stockpiles of ivory harvested from animals that died naturally.

Poaching began increasing again in the mid-1990s. Following a single legal sale from stockpiles to Japan in 1999, China and Japan requested the right to make an additional purchase. After years of debate, the governments of those countries were able to purchase 62 and 45 tons of legal ivory, respectively, at auction in 2008. The governments continue to resell that ivory in their domestic markets.

After the legal sale in 1999, CITES established the Monitoring the Illegal Killing of Elephants (MIKE) program at 79 sites in 40 countries in Africa and Asia. Preliminary data collection began in mid-2002. The Proportion of Illegally Killed Elephants (PIKE) Index is the fraction of "detected elephant carcasses that were illegally killed," a measure designed to correct for fluctuating elephant populations and field worker effort.
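The PIKE index itself is a simple ratio; a minimal sketch of the computation (illustrative function and argument names, not the MIKE program's actual data format):

```python
def pike(illegally_killed: int, total_carcasses: int) -> float:
    """Proportion of Illegally Killed Elephants: the fraction of detected
    carcasses at a site-year that were attributed to illegal killing."""
    if total_carcasses == 0:
        raise ValueError("PIKE is undefined when no carcasses are detected")
    return illegally_killed / total_carcasses

# e.g. 30 of 40 detected carcasses illegally killed -> PIKE = 0.75
print(pike(30, 40))
```

Because it conditions on detected carcasses, the index is designed to net out changes in elephant populations and in field-worker search effort, which is what makes the pre/post-2008 comparison meaningful.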

The researchers examine how poachers responded to the 2008 sale by studying annual PIKE data from 2003 to 2013. They find a clear discontinuous increase in the index after the 2008 sale. They cannot explain this increase with changes in natural elephant mortality rates, or with economic variables such as China's or Japan's per capita GDP, Chinese or Japanese trade with elephant range countries, measures of China's physical presence in range countries, or per capita GDP in PIKE-reporting countries.

The researchers conclude that the legal sale of ivory "triggered an increase in black market ivory production by increasing consumer demand and/or reducing the cost of supplying black market ivory." Supplier costs may be reduced if legalization of a product makes it more difficult to detect and monitor illegal provision of that product. Consumer demand may rise because legalization may reduce the stigma around a previously banned product.

Gold has historically played a prominent role in transactions among financial institutions even in modern systems that rely on paper money. What’s more, many observers think that gold provides a hedge against major macroeconomic declines. But after assessing long-term US data on gold returns, the new research finds that gold has not served consistently as a hedge against large declines in real GDP or real stock prices. ...

Tuesday, June 07, 2016

This paper by Gauti Eggertsson, Neil Mehrotra, Sanjay Singh, and Larry Summers was released yesterday as an NBER Working Paper. The paper looks at secular stagnation in an open-economy setting and examines how it is transmitted across countries (in an OLG framework with many countries and imperfect capital integration). One interesting implication is that if the Fed pursues an interest rate hike and the rest of the world does not follow, we should expect strong capital flows into the US, possibly generating a mismatch between desired savings and investment. This, in turn, lowers the US natural rate of interest, forcing the Fed to cut rates again to avoid a recession:

A Contagious Malady? Open Economy Dimensions of Secular Stagnation, by Gauti B. Eggertsson, Neil R. Mehrotra, Sanjay R. Singh, Lawrence H. Summers, NBER Working Paper No. 22299: Conditions of secular stagnation - low interest rates, below target inflation, and sluggish output growth - characterize much of the global economy. We consider an overlapping generations, open economy model of secular stagnation, and examine the effect of capital flows on the transmission of stagnation. In a world with a low natural rate of interest, greater capital integration transmits recessions across countries as opposed to lower interest rates. In a global secular stagnation, expansionary fiscal policy carries positive spillovers implying gains from coordination, and fiscal policy is self-financing. Expansionary monetary policy, by contrast, is beggar-thy-neighbor with output gains in one country coming at the expense of the other. Similarly, we find that competitiveness policies including structural labor market reforms or neomercantilist trade policies are also beggar-thy-neighbor in a global secular stagnation.

A related variation that strips down the argument in the paper (which uses an elaborate DSGE model) into a simple textbook IS-MP framework is in this year's AER Papers and Proceedings volume. The results are much the same. See here. The more elaborate model should give people comfort in knowing that the key insights hold once you add all the bells and whistles of a modern DSGE (or perhaps it's the other way around).

Conclusions: Following the Great Recession, many countries have experienced repeated periods with realized and expected inflation below target levels set by policymakers. Should policy respond to this by keeping interest rates near zero for a longer period or, in line with neo-Fisherian reasoning, by increasing the interest rate to the steady-state level corresponding to the target inflation rate? We have shown that neo-Fisherian policies, in which interest rates are set according to a peg, impart unavoidable instability. In contrast, a temporary peg at low interest rates, followed by later imposition of the Taylor rule around the target inflation rate, provides a natural return to normalcy, restoring inflation to its target and the economy to its steady state.

Monday, May 02, 2016

Growth of income and welfare in the U.S., 1979-2011, by John Komlos, NBER Working Paper No. 22211 Issued in April 2016: We estimate growth rates of real incomes in the U.S. by quintiles using the Congressional Budget Office’s (CBO) post-tax, post-transfer data as the basis for the period 1979-2011. We improve upon them by including only the present value of earnings that will accrue in retirement and excluding items included in the CBO income estimates such as “corporate taxes borne by labor” that do not increase either current purchasing power or utility. We estimate a high and a low growth rate using two price indexes, the CPI and the Personal Consumption Expenditure index. The major consistent findings include what in the colloquial is referred to as the “hollowing out” of the middle class. According to these estimates, the income of the middle class (2nd and 3rd quintiles) increased at a rate of between 0.1% and 0.7% per annum, i.e., barely distinguishable from zero. Even that meager rate was achieved only through substantial transfer payments. In contrast, the income of the top 1% grew at an astronomical rate of between 3.4% and 3.9% per annum during the 32-year period, reaching an average annual value of $918,000, up from $281,000 in 1979 (in 2011 dollars). Hence, the post-tax, post-transfer income of the 1% relative to the 1st quintile increased from a factor of 21 in 1979 to a factor of 51 in 2011. However, income of no other group increased substantially relative to that of the lowest quintile. Oddly, the income of even those in the 96-99 percentiles increased only from a multiple of 8.1 to a multiple of 11.3. We next estimate growth in welfare assuming diminishing marginal utility of income. A logarithmic utility function yields a growth in welfare for the middle class of roughly 0.01% to 0.07% per annum, which is indistinguishable from zero.
With interdependent utility functions only the welfare of the 5th quintile experienced meaningful growth while those of the first four quintiles tend to be either negligible or even negative.
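As a back-of-the-envelope check on the abstract's figures, the compound annual growth rate implied by the move from $281,000 in 1979 to $918,000 in 2011 (in 2011 dollars) does land inside the reported 3.4%-3.9% band:

```python
# Implied compound annual growth rate of top-1% income, using the
# abstract's own endpoints (1979 and 2011, i.e. a 32-year span).
start, end, years = 281_000, 918_000, 2011 - 1979
cagr = (end / start) ** (1 / years) - 1
print(f"implied growth rate: {cagr:.2%} per annum")  # roughly 3.8%
```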

Discussants: Mark Gertler, New York University and NBER Atif Mian, Princeton University and NBER

6:30 pm Dinner Speaker: Lawrence Summers, Harvard University and NBER

Saturday, April 16:

9:00 am Pierre-Olivier Gourinchas, University of California at Berkeley and NBER Thomas Philippon, New York University and NBER Dimitri Vayanos, London School of Economics and NBER The Analytics of the Greek Crisis

Discussants: Olivier Blanchard, Peterson Institute for International Economics and NBER Markus Brunnermeier, Princeton University and NBER

Discussants: Harald Uhlig, University of Chicago and NBER Ricardo Reis, Columbia University and NBER

Abstracts

Forward Guidance and Macroeconomic Outcomes Since the Financial Crisis, by Jeffrey R. Campbell, Jonas D. M. Fisher, Alejandro Justiniano, and Leonardo Melosi: April 13, 2016 Abstract This paper studies the effects of FOMC forward guidance. We begin by using high frequency identification and direct measures of FOMC private information to show that puzzling responses of private sector forecasts to movements in federal funds futures rates on FOMC announcement days can be attributed almost entirely to Delphic forward guidance. However, a large fraction of futures rates’ variability on announcement days remains unexplained, leaving open the possibility that the FOMC has successfully communicated Odyssean guidance. We then examine whether the FOMC used Odyssean guidance to improve macroeconomic outcomes since the financial crisis. To this end we use an estimated medium-scale New Keynesian model to perform a counterfactual experiment for the period 2009q1–2014q4 in which we assume the FOMC did not employ any Odyssean guidance and instead followed its reaction function inherited from before the crisis as closely as possible while respecting the effective lower bound. We find that a purely rule-based policy would have delivered better outcomes in the years immediately following the crisis – forward guidance was counterproductive. However, starting toward the end of 2011, after the Fed’s introduction of “calendar-based” communications, Odyssean guidance appears to have boosted real activity and moved inflation closer to target. We show that our results do not reflect Del Negro, Giannoni, and Patterson (2015)’s forward guidance puzzle.

Are State and Time dependent models really different?, by Fernando Alvarez, Francesco Lippi, and Juan Passadore: April 13, 2016 FIRST DRAFT Abstract Yes, but only for large monetary shocks. In particular, we show that for a large class of models where shocks have continuous paths, the propagation of a monetary impulse is independent of the nature of the sticky price friction when shocks are small. The propagation of large shocks instead depends on the nature of the friction: the impulse response of inflation to monetary shocks is non-linear in state-dependent models, while it is independent of the shock size in time-dependent models. We use data on exchange rate devaluations and inflation for a panel of countries over 1974-2014 to test for the presence of state dependent decision rules. We find evidence of a non-linear effect of exchange rate changes on prices in the sample of flexible-exchange rate countries with low inflation. In particular, we find that large exchange rate changes have larger short term pass through, as implied by state dependent models.

Is the Macroeconomy Locally Unstable and Why Should We Care?, by Paul Beaudry, Dana Galizia, and Franck Portier: March 2016 Abstract In most modern macroeconomic models, the steady state (or balanced growth path) of the system is a local attractor, in the sense that, in the absence of shocks, the economy would converge to the steady state. In this paper, we examine whether the time series behavior of macroeconomic aggregates (especially labor market aggregates) is in fact supportive of this local-stability view of macroeconomic dynamics, or if it instead favors an alternative interpretation in which the macroeconomy may be better characterized as being locally unstable, with nonlinear deterministic forces capable of producing endogenous cyclical behavior. To do this, we extend a standard AR representation of the data to allow for smooth nonlinearities. Our main finding is that, even using a procedure that may have low power to detect local instability, the data provide intriguing support for the view that the macroeconomy may be locally unstable and involve limit-cycle forces. An interesting finding is that the degree of nonlinearity we detect in the data is small, but nevertheless enough to alter the description of macroeconomic behavior. We complete the paper with a discussion of the extent to which these two different views about the inherent dynamics of the macroeconomy may matter for policy.

Macrofinancial History and the New Business Cycle Facts, by Oscar Jordà, Moritz Schularick, and Alan M. Taylor: Abstract In the era of modern finance, a century-long near-stable ratio of credit to GDP gave way to increasing financialization and surging leverage in advanced economies in the last forty years. This “financial hockey stick” coincides with shifts in foundational macroeconomic relationships beyond the widely-noted return of macroeconomic fragility and crisis risk. Leverage is correlated with central business cycle moments. We document an extensive set of such moments based on a decade-long international and historical data collection effort. More financialized economies exhibit somewhat less real volatility but lower growth, more tail risk, and tighter real-real and real-financial correlations. International real and financial cycles also cohere more strongly. The new stylized facts we document should prove fertile ground for the development of a newer generation of macroeconomic models with a prominent role for financial factors.

The Analytics of the Greek Crisis, by Pierre-Olivier Gourinchas, Thomas Philippon, and Dimitri Vayanos: April 13, 2016 Abstract This paper presents an interim and analytical report on the Greek Crisis of 2010. The Greek crisis presents a number of important features that set it apart from the typical sudden stop, sovereign default, or lending boom/bust episodes of the last quarter century. We provide an analytical account of the Greek crisis using a rich model designed to capture the main financial and macro linkages of a small open economy. Using the model to parse through the wreckage, we uncover the following main findings: (a) Greece experienced a more prolonged and severe decline in output per capita than almost any crisis on record since 1980; (b) the crisis was significantly backloaded, thanks to important financial assistance mechanisms; (c) a sizable share of the crisis was the consequence of the sudden stop that started in late 2009; (d) the severity of the crisis was compounded by elevated initial levels of exposure (external debt, public debt, domestic credit), vastly in excess of levels observed in typical emerging economies. In summary: Greece experienced a typical Emerging Market Sudden Stop crisis, with the initial exposure levels of an Advanced Economy.

Jump-Starting the Euro Area Recovery: Would a Rise in Core Fiscal Spending Help the Periphery?, by Olivier Blanchard, Christopher J. Erceg, Jesper Linde: March 24, 2016 Abstract We show that a fiscal expansion by the core economies of the euro area would have a large and positive impact on periphery GDP assuming that policy rates remain low for a prolonged period. Under our preferred model specification, an expansion of core government spending equal to one percent of euro area GDP would boost periphery GDP around 1 percent in a liquidity trap lasting three years, nearly half as large as the effect on core GDP. Accordingly, under a standard ad hoc loss function involving output and inflation gaps, increasing core spending would generate substantial welfare improvements, especially in the periphery. The benefits are considerably smaller under a utility-based welfare measure, reflecting in part that higher net exports play a material role in raising periphery GDP.

Thursday, February 11, 2016

Does inequality cause financial distress? Evidence from lottery winners and neighboring bankruptcies, by Sumit Agarwal, Vyacheslav Mikhed, and Barry Scholnick: Abstract We test the hypothesis that income inequality causes financial distress. To identify the effect of income inequality, we examine lottery prizes of random dollar magnitudes in the context of very small neighborhoods (13 households on average). We find that a C$1,000 increase in the lottery prize causes a 2.4% rise in subsequent bankruptcies among the winners’ close neighbors. We also provide evidence of conspicuous consumption as a mechanism for this causal relationship. The size of lottery prizes increases the value of visible assets (houses, cars, motorcycles), but not invisible assets (cash and pensions), appearing on the balance sheets of neighboring bankruptcy filers.

Since the passage of the Clean Air Act of 1990, the federal government has pursued a variety of policies designed to reduce the level of sulfur dioxide emissions from coal-fired power plants and the associated acid rain. In The Market for Sulfur Dioxide Allowances: What Have We Learned from the Grand Policy Experiment? (NBER Working Paper No. 21383), H. Ron Chan, B. Andrew Chupp, Maureen L. Cropper, and Nicholas Z. Muller evaluate the cost savings and the health consequences of relying on a cap-and-trade sulfur dioxide allowance market to implement emissions reductions.

The key argument advanced by proponents of cap-and-trade programs for pollution reduction is that they are less costly than regulatory programs that impose the same abatement requirements on all polluters. By allowing emission sources with high abatement costs to offset higher on-site emissions by purchasing additional reductions from other, lower-cost polluters, proponents assert, trade in pollution allowances reduces the total cost of achieving a given reduction in aggregate emissions.

To study the cost savings associated with the Acid Rain Program, which allowed such trade, the authors model the cost of abatement for individual coal-fired power plants. They estimate how firms choose between the two leading technologies for sulfur dioxide abatement, burning low-sulfur coal and installing flue-gas desulfurization units. They use these estimates to compare abatement decisions under the Acid Rain Program with decisions under standards that achieve the same aggregate reduction in emissions by imposing uniform requirements on coal-fired plants, with no trading allowed. They find cost savings in 2002, with the Acid Rain Program in full swing, of approximately $250 million from trade in emission allowances. This is less than half of the previously estimated savings from tradable permits. The data suggest that many generating units were not complying with the Clean Air Act in the most economical manner.

One potential drawback of a cap-and-trade system is that in some areas the level of local pollutants — those which pose the greatest health threat near their place of emission — can be higher than under uniform emission standards. This could occur if, for example, utilities in the densely populated eastern United States, where emission reduction can be comparatively costly, pay utilities in less-populous western regions, where abatement is cheaper, to cut emissions there. The aggregate national reduction may still be achieved, but many more people in the densely populated east could be exposed to pollutants.

The researchers find a greater level of particulate air pollution and associated premature mortality under the Acid Rain Program than under a hypothetical no-trade scenario in which units emitted SO2 at a rate equal to 2002 allowance allocations plus observed drawdowns of their allowance banks. They estimate the cost of health damages associated with observed SO2 emissions in 2002 under the Acid Rain Program to be $2.4 billion higher than would have been the case under the no-trade scenario. They conclude that the health impact of a cap-and-trade program depends on how the program is structured and on the correlation between marginal abatement costs and marginal damages across pollution sources.

Wednesday, February 03, 2016

How Successful Was the New Deal? The Microeconomic Impact of New Deal Spending and Lending Policies in the 1930s, by Price V. Fishback, NBER Working Paper No. 21925 Issued in January 2016: Abstract The New Deal during the 1930s was arguably the largest peace-time expansion in federal government activity in American history. Until recently there had been very little quantitative testing of the microeconomic impact of the wide variety of New Deal programs. Over the past decade scholars have developed new panel databases for counties, cities, and states and then used panel data methods on them to examine the impact of New Deal spending and lending policies for the major New Deal programs. In most cases the identification of the effect comes from changes across time within the same geographic location after controlling for national shocks to the economy. Many of the studies also use instrumental variable methods to control for endogeneity. The studies find that public works and relief spending had state income multipliers of around one, increased consumption activity, attracted internal migration, reduced crime rates, and lowered several types of mortality. The farm programs typically aided large farm owners but eliminated opportunities for sharecroppers, tenants, and farm workers. The Home Owners’ Loan Corporation’s purchases and refinancing of troubled mortgages staved off drops in housing prices and home ownership rates at relatively low ex post cost to taxpayers. The Reconstruction Finance Corporation’s loans to banks and railroads appear to have had little positive impact, although the banks were aided when the RFC took ownership stakes.

Clearly, if confirmed, either the presence of hysteresis or the deterioration of the relation between inflation and activity would have major implications for monetary policy and for stabilization policy more generally. ...

First, we revisit the hysteresis hypothesis, defined as the hypothesis that recessions may have permanent effects on the level of output relative to trend. ... We find that a high proportion of recessions, about two-thirds, are followed by lower output relative to the pre-recession trend even after the economy has recovered. Perhaps more surprisingly, in about one-half of those cases, the recession is followed not just by lower output, but by lower output growth relative to the pre-recession output trend. That is, as time passes following recessions, the gap between output and projected output on the basis of the pre-recession trend increases. ...

Turning to the Phillips curve relation, we ... find clear evidence that the effect of the unemployment gap on inflation has substantially decreased since the 1970s. Most of the decrease, however, took place before the early 1990s. Since then, the coefficient appears to have been stable, and, in most cases, significant...

Finally, in the last section, we explore the implications of our findings for monetary policy. The findings of the second section have opposite implications for monetary policy... To the extent that recessions are due to the perception or anticipation of lower underlying growth, this implies that estimates of potential output, based on the assumption of an unchanged underlying trend, may be too optimistic, and lead to too strong a policy response to movements in output. However, to the extent that recessions have hysteresis or super-hysteresis effects, the cost of allowing downward movements in output in response to shifts in demand increases, implying that a stronger response to output gaps is desirable.

The findings of the third section yield less dramatic conclusions. To the extent that the coefficient on the unemployment gap, while small, remains significant, the implication is that, within an inflation targeting framework, the interest rate rule should put more weight on the output gap relative to inflation. ...

Monday, October 26, 2015

Economic Cycles in Ancient China, by Yaguang Zhang, Guo Fan, and John Whalley, NBER Working Paper No. 21672 Issued in October 2015: We discuss business cycles in ancient China. Data on Ancient China business cycles are sparse and incomplete and so our discussion is qualitative rather than quantitative. Essentially, ancient debates focused on two types of cycles: long run political or dynastic cycles of many decades, and short run nature induced cycles. Discussion of the latter shows strong parallels to Jevons’ conception of sun spot cycles. The former have no clear contemporary analogue; they were often deep in impact and of long duration. The discussion of both focused on agricultural economies. Ancient discussion on intervention focused on counter cyclical measures, including stockpiling, and predated Keynes and the discussion in the 1930s by centuries. Also, a strongly held belief emerged that cycles create their own cycles to follow, and that cycles are part of the inevitable economic order, a view consistent with Mitchell’s view of the business cycle in the 1940s. Current debates on how best to respond to the ongoing global financial crisis draw in part on historical precedents, but these are largely limited to the last 150 years for OECD countries and with major focus on the 1990s. Here we also probe material on Ancient China to see what is relevant.

Monday, October 19, 2015

How Does Declining Unionism Affect the American Middle Class and Intergenerational Mobility?, by Richard Freeman, Eunice Han, David Madland, Brendan V. Duke, NBER Working Paper No. 21638 [Open Link to Earlier Version]: This paper examines unionism’s relationship to the size of the middle class and its relationship to intergenerational mobility. We use the PSID 1985 and 2011 files to examine the change in the share of workers in a middle-income group (defined by persons having incomes within 50% of the median) and use a shift-share decomposition to explore how the decline of unionism contributes to the shrinking middle class. We also use the files to investigate the correlation between parents’ union status and the incomes of their children. Additionally, we use federal income tax data to examine the geographical correlation between union density and intergenerational mobility. We find: 1) union workers are disproportionately in the middle-income group or above, and some reach middle-income status due to the union wage premium; 2) the offspring of union parents have higher incomes than the offspring of otherwise comparable non-union parents, especially when the parents are low-skilled; 3) offspring from communities with higher union density have higher average incomes relative to their parents compared to offspring from communities with lower union density. These findings show a strong, though not necessarily causal, link between unions, the middle class, and intergenerational mobility.
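The paper's middle-income definition (persons with incomes within 50% of the median) is easy to operationalize; a minimal sketch, under one natural reading of that definition as the interval [0.5 × median, 1.5 × median], with made-up incomes:

```python
def middle_class_share(incomes):
    """Share of observations whose income lies within 50% of the median,
    i.e. in the band [0.5 * median, 1.5 * median]."""
    s = sorted(incomes)
    n = len(s)
    median = s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2
    lo, hi = 0.5 * median, 1.5 * median
    return sum(lo <= x <= hi for x in incomes) / n

# Median is 50, so the band is [25, 75]; 4 of 7 incomes fall inside -> ~0.571
print(middle_class_share([20, 35, 40, 50, 60, 80, 200]))
```

The shift-share decomposition in the paper then asks how much of the change in this share between 1985 and 2011 is attributable to the decline in union membership rates versus changes in the union wage premium.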

Friday, October 16, 2015

This surprised me. I was under the impression that things are moving in the opposite direction:

Economics and the Modern Economic Historian, by Ran Abramitzky, NBER Working Paper No. 21636, October 2015: Abstract I reflect on the role of modern economic history in economics. I document a substantial increase in the percentage of papers devoted to economic history in the top-5 economic journals over the last few decades. I discuss how the study of the past has contributed to economics by providing ground to test economic theory, improve economic policy, understand economic mechanisms, and answer big economic questions. Recent graduates in economic history appear to have roughly similar prospects to those of other economists in the economics job market. I speculate how the increase in availability of high quality micro level historical data, the decline in costs of digitizing data, and the use of computationally intensive methods to convert large-scale qualitative information into quantitative data might transform economic history in the future.

From the introduction to the paper:

... This sense that economists “believe history to be of small and diminishing interest” was made clear ... in 1976, when McCloskey wrote in defense of economic history a paper entitled “Does the past have useful economics?”. McCloskey concluded that the average American economist answers “no”. McCloskey showed a sharp decline in the publication of economic history papers in the top economic journals (AER, QJE, JPE). It was clear that “…this older generation of American economists did not persuade many of the younger that history is essential to economics.” ...

Today, thirty years later, economic history is far from being marginalized and overlooked by economists. To be sure, economic history remains a small field within economics, but the average economist today would answer “yes” to the question of whether the past has useful economics. Economists increasingly recognize that historical events shape current economic development, and that current modern economies were once upon a time developing and their experience might be relevant for current developing countries. Recent debates in the US and Europe about immigration policies renewed interest in historical migration episodes. Most notably, the Great Recession of 2007-08 reminded economists of the Great Depression and other historic financial crises. Macroeconomic historian Christina Romer, a Great Depression expert, became the chief advisor of President Obama. Indeed, Barry Eichengreen, himself an expert on financial crises in history, started his 2011 presidential address by saying that “this has been a good crisis for economic history.”

That economic history today is more respected and appreciated by the average economist is also reflected by an increase in economic history publications in the top-5 economic journals. The decline in economic history in the top-3 journals that McCloskey documented has been reversed...

Friday, October 09, 2015

Resurrecting the Role of the Product Market Wedge in Recessions, by Mark Bils, Peter J. Klenow, and Benjamin A. Malin: Abstract Employment and hours appear far more cyclical than dictated by the behavior of productivity and consumption. This puzzle has been called “the labor wedge” — a cyclical intratemporal wedge between the marginal product of labor and the marginal rate of substitution of consumption for leisure. The intratemporal wedge can be broken into a product market wedge (price markup) and a labor market wedge (wage markup). Based on the wages of employees, the literature has attributed the intratemporal wedge almost entirely to labor market distortions. Because employee wages may be smoothed versions of the true cyclical price of labor, we instead examine the self-employed and intermediate inputs, respectively. Looking at the past quarter century in the United States, we find that price markup movements are at least as important as wage markup movements — including during the Great Recession and its aftermath. Thus, sticky prices and other forms of countercyclical markups deserve a central place in business cycle research, alongside sticky wages and matching frictions.

Monday, September 28, 2015

Cheap Talk, Round Numbers, and Signaling Behavior: In the marketplace for ordinary goods, buyers and sellers have many characteristics that are hidden from each other. From the seller's perspective, it may be beneficial to reveal some of these characteristics. For example, a patient seller may want to signal unending willingness to wait in order to secure a good deal. At the same time, an impatient seller may want to signal a desire to sell a good quickly, albeit at a lower price.

This insight is at the heart of Cheap Talk, Round Numbers, and the Economics of Negotiation (NBER Working Paper No. 21285) by Matthew Backus, Thomas Blake, and Steven Tadelis. The authors show that sellers on eBay behave in a fashion that is consistent with using round numbers as signals of impatience.

The authors analyze data from eBay's bargaining platform using its collectibles category—coins, antiques, toys, memorabilia, and the like. The process is one of sequential offers not unlike haggling in an open-air market. A seller lists an initial price, to which buyers may make counteroffers, to which sellers may make counteroffers, and so on. If a price is agreed upon, the good sells. The authors analyze 10.5 million listed items, out of which 2.8 million received offers and 2.1 million ultimately sold. Their key finding is that items listed at multiples of $100 receive lower offers on average than items listed at nearby prices, ultimately selling for 5 to 8 percent less.
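Those listing figures imply a simple conversion funnel (the rates below are arithmetic derived from the counts in the summary, not numbers reported by the paper):

```python
# eBay collectibles sample: listings, listings receiving offers, and sales.
listed, offered, sold = 10_500_000, 2_800_000, 2_100_000

print(f"received offers:    {offered / listed:.0%}")   # about 27% of listings
print(f"ultimately sold:    {sold / listed:.0%}")      # about 20% of listings
print(f"sold given offers:  {sold / offered:.0%}")     # about 75%
```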

It is tempting to label such behavior a mistake. However, items listed at these round numbers receive offers 6 to 11 days sooner and are 3 to 5 percent more likely to sell than items listed at "precise" numbers. Furthermore, even experienced sellers frequently list items at round numbers, suggesting it is an equilibrium behavior best modeled by rationality rather than seller error. It appears that impatient sellers are able to signal their impatience and are happy to do it, even though it nets them a lower price.

One concern with the analysis is that round-number pricing might provide a signal about the good being sold, rather than the person or firm selling it. To address this issue, the authors use data on goods originally posted with prices in British pounds. These prices are automatically translated to U.S. dollars for the American market. Hence, the authors can test what happens when goods intended to be sold at round numbers are, in fact, listed at non-round numbers. This removes the round-number signal while holding the good's features constant. In this setting, they find that goods priced at non-round dollar amounts systematically sell for higher prices, though the effect is not as strong as in their primary sample. This evidence indicates the round numbers themselves have a significant effect on bargaining outcomes.

The authors find additional evidence on the round-number phenomenon in the real estate market in Illinois from 1992 to 2002. This is a wholly different market than that for eBay collectibles, with much higher prices and with sellers typically receiving advice from professional listing agents. But here, too, there is evidence that round-number listings lead to lower sales prices. On average, homes listed at multiples of $50,000 sold for $600 less.

No sense hiding from evidence that works against my support of immigration. This is from George Borjas (if you are unfamiliar with the Mariel boatlift, see here):

The Wage Impact of the Marielitos: A Reappraisal, by George J. Borjas, NBER Working Paper No. 21588 [open link]: This paper brings a new perspective to the analysis of the Mariel supply shock, revisiting the question and the data armed with the accumulated insights from the vast literature on the economic impact of immigration. A crucial lesson from this literature is that any credible attempt to measure the wage impact of immigration must carefully match the skills of the immigrants with those of the pre-existing workforce. The Marielitos were disproportionately low-skill; at least 60 percent were high school dropouts. A reappraisal of the Mariel evidence, specifically examining the evolution of wages in the low-skill group most likely to be affected, quickly overturns the finding that Mariel did not affect Miami’s wage structure. The absolute wage of high school dropouts in Miami dropped dramatically, as did the wage of high school dropouts relative to that of either high school graduates or college graduates. The drop in the relative wage of the least educated Miamians was substantial (10 to 30 percent), implying an elasticity of wages with respect to the number of workers between -0.5 and -1.5. In fact, comparing the magnitude of the steep post-Mariel drop in the low-skill wage in Miami with that observed in all other metropolitan areas over an equivalent time span between 1977 and 2001 reveals that the change in the Miami wage structure was a very unusual event. The analysis also documents the sensitivity of the estimated wage impact to the choice of a placebo. The measured impact is much smaller when the placebo consists of cities where pre-Mariel employment growth was weak relative to Miami.

One recurrent criticism focuses on ‘field-dependent factors’... In a recent paper (Anauati et al. 2015), we analyze whether the ‘field-dependent factors’ critique also holds across fields of research within economics. Our approach began by assigning every paper published in the top five economics journals – The American Economic Review, Econometrica, the Journal of Political Economy, The Quarterly Journal of Economics, and The Review of Economic Studies – to one of four fields of economic research (applied, applied theory, econometric methods, and theory).

The sample consisted of 9,672 articles published in the top five journals between 1970 and 2000. It did not include notes, comments, announcements or American Economic Review Papers and Proceedings issues. ...

What did they find?:

Conclusions Even though citation counts are an extremely valuable tool for measuring the importance of academic articles, the patterns observed for the lifecycles of papers across fields of economic research support the ‘field-dependent factors’ critique within this discipline. The evidence provides a basis for a caveat regarding the use of citation counts as a ‘one-size-fits-all’ yardstick for measuring research outcomes across fields, as the incentives generated by their use can be detrimental to fields of research which effectively generate valuable (but perhaps more specialized) knowledge, not only in economics but in other disciplines as well.

According to our findings, pure theoretical economic research is the clear loser in terms of citation counts. Therefore, if specialized journals' impact factors are calculated solely on the basis of citations during the first years after an article’s publication, then theoretical research will clearly not be attractive to departments, universities or journals that are trying to improve their rankings or to researchers who use their citation records when applying for better university positions or for grants. The opposite is true for applied papers and applied theory papers – these fields of research are the outright winners when citation counts are used as a measurement of articles' importance, and their citation patterns over time are highly attractive for all concerned. Econometric method papers are a special case; their citation patterns vary a great deal across different levels of success.

Saturday, September 19, 2015

Some preliminary results from a working paper by Alisdair Mckay and Ricardo Reis:

Optimal Automatic Stabilizers, by Alisdair McKay and Ricardo Reis: 1 Introduction How generous should the unemployment insurance system be? How progressive should the tax system be? These questions have been studied extensively and there are well-known trade-offs between social insurance and incentives. Typically these issues are explored in the context of a stationary economy. These policies, however, also serve as automatic stabilizers that alter the dynamics of the business cycle. The purpose of this paper is to ask how and when aggregate stabilization objectives call for, say, more generous unemployment benefits or a more progressive tax system than would be desirable in a stationary economy. ...

We consider two classic automatic stabilizers: unemployment benefits and progressive taxation. Both of these policies have roles in redistributing income and in providing social insurance. Redistribution affects aggregate demand in our model because households differ in their marginal propensities to consume. Social insurance affects aggregate demand through precautionary savings decisions because markets are incomplete. In addition to unemployment insurance and progressive taxation, we also consider a fiscal rule that makes government spending respond automatically to the state of the economy.

Our focus is on the manner in which the optimal fiscal structure of the economy is altered by aggregate stabilization concerns. Increasing the scope of the automatic stabilizers can lead to welfare gains if they raise equilibrium output when it would otherwise be inefficiently low and vice versa. Therefore, it is not stabilization per se that is the objective but rather eliminating inefficient fluctuations. An important aspect of the model specification is therefore the extent of inefficient business cycle fluctuations. Our model generates inefficient fluctuations because prices are sticky and monetary policy cannot fully eliminate the distortions. We show that in a reasonable calibration, more generous unemployment benefits and more progressive taxation are helpful in reducing these inefficiencies. Simply put, if unemployment is high when there is a negative output gap, a larger unemployment benefit will stimulate aggregate demand when it is inefficiently low, thereby raising welfare. Similarly, if idiosyncratic risk is high when there is a negative output gap, providing social insurance through more progressive taxation will also increase welfare....

We find that a hospital’s ownership of an admitting physician dramatically increases the probability that the physician’s patients will choose the owning hospital. We also find that ownership of an admitting physician has large effects on how the hospital’s cost and quality affect patients’ hospital choice. Patients whose admitting physician is not owned by a hospital are more likely to choose facilities that are low cost and high quality. ... We conclude that hospital/physician integration affects patients’ hospital choices in a way that is inconsistent with their best interests.

Monday, September 07, 2015

Support for Redistribution in an Age of Rising Inequality: New Stylized Facts and Some Tentative Explanations, by Vivekinan Ashok, Ilyana Kuziemko, and Ebonya Washington, NBER Working Paper No. 21529 Issued in September 2015 [open link to earlier version]: Despite the large increases in economic inequality since 1970, American survey respondents exhibit no increase in support for redistribution, in contrast to the predictions from standard theories of redistributive preferences. We replicate these results but further demonstrate substantial heterogeneity by demographic groups. In particular, the two groups who have most moved against income redistribution are the elderly and African-Americans. We find little evidence that these subgroup trends are explained by relative economic gains or growing cultural conservatism, two common explanations. We further show that the elderly trend is uniquely American, at least relative to other developed countries with comparable survey data. While we are unable to provide definitive evidence on the cause of these two groups' declining redistributive support, we offer additional correlations which may offer fruitful directions for future research on the topic. One story consistent with the data on elderly trends is that older Americans worry that redistribution will come at their expense, in particular via cuts to Medicare. We find that the elderly have grown increasingly opposed to government provision of health insurance and that controlling for this tendency explains about 40% of their declining support for redistribution. For blacks, controlling for their declining support of race-targeted aid explains nearly 45% of their differential decline in redistributive preferences (raising the question of why support for race-targeted aid has fallen during a period when black economic catch-up to whites has stalled).

Monday, August 31, 2015

This is a summary of new research from two of our former graduate students here at the University of Oregon, Harold Cuffe and Chris Gibbs (link to full paper):

The effect of payday lending restrictions on liquor sales – Synopsis, by Harold Cuffe and Chris Gibbs: The practice of short-term consumer financing known as payday lending remains controversial because the theoretical gains in welfare from greater credit access stand in opposition to anecdotal evidence that many borrowers are made worse off. Advocates for the industry assert that the loans fill a gap in credit access for underserved individuals facing temporary financial hardship. Opponents, who include many state legislatures and the Obama administration, argue that lenders target financially vulnerable individuals with little ability to pay down their principal, who may end up paying many times the borrowed amount in interest and fees.

Regulations restricting both payday loan and liquor access seek to minimize the potential for overuse. To justify intervention in the two markets, policy makers note a host of negative externalities associated with each product, and cite behavioral motivations underlying individuals' consumption decisions. In particular, researchers have shown that the same models of impulsivity and dynamically inconsistent decision making (hyperbolic preferences and the cue theory of consumption) used to describe the demand for alcohol also describe patterns of payday loan usage. In these models, individuals can objectively benefit from a restricted choice set that limits their access to loans and liquor. The overlap in the behavioral characteristics of over-users of both products suggests that the liquor market is a reasonable and interesting place to test the effectiveness of payday lending regulations.

To identify the causal effect of lending restrictions on liquor sales, we exploit a change in payday lending laws in the State of Washington. Leveraging lender- and liquor store-level data, we estimate a difference-in-differences model comparing Washington to the neighboring State of Oregon, which did not experience a change in payday lending laws during this time. We find that the law change leads to a significant reduction in liquor sales, with the largest decreases occurring at liquor stores located very near to payday lenders at the time the law took effect. Our results provide compelling evidence on how credit constraints affect consumer spending, suggest a behavioral mechanism that may underlie some payday loan usage, and provide evidence that Washington's payday lending regulations reduced one form of loan misuse.

Background

Washington State enacted HB 1709 on January 1, 2010, which introduced three major new restrictions on the payday loan industry. First, the law limited the size of a payday loan to 30% of a person's monthly income or $700, whichever is less. Second, the law created a state-wide database to track the issuance of payday loans, set a hard cap of eight on the number of loans an individual could obtain in a twelve-month period, and eliminated multiple concurrent loans. This effectively prohibited the repayment of an existing loan with a new one. In the year prior to the law, the State of Washington estimated that roughly one third of all payday loan borrowers took out more than eight loans. Finally, the law mandated that borrowers were entitled to a 90-day installment plan to pay back loans of $400 or less, or 180 days for loans over $400.

The effect of the law on the industry was severe. According to the Washington Division of Financial Institutions, there were 603 payday loan locations active in Washington in 2009, responsible for 3.24 million loans worth $1.366 billion. In the year following the law change, the number of payday lenders dropped to 424, and loan volume fell to 1.09 million loans worth only $434 million. The following year the number of locations fell again, to 256, with a loan volume of roughly 900,000 loans worth $330 million. Today there are fewer than 200 lenders in Washington, and total loan volume and value have stabilized close to the 2011 values.

A crucial feature of our estimation strategy involves accounting for potentially endogenous supply-side factors that challenge efforts to separately identify changes in demand from the store response to the change. To do so, we focus on liquor control states, in which the state determines the number and location of liquor stores and the products offered, and harmonizes prices across stores to regulate and restrict liquor access. Oregon and Washington were both liquor control states throughout our sample period; Washington privatized liquor sales in June 2012.
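The difference-in-differences logic described above can be sketched with synthetic data. Everything below (store counts, sales levels, the planted treatment effect) is illustrative and invented for the sketch, not taken from the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic log monthly sales for stores in each state, before and
# after Washington's January 2010 lending restrictions. The -0.036
# treatment effect is planted to mirror the paper's headline estimate.
n = 500
base_or, base_wa = 11.9, 12.0   # log sales levels (illustrative)
trend = 0.02                    # common post-period trend in both states
effect = -0.036                 # planted effect of the law (WA only, post only)

or_pre  = base_or + rng.normal(0, 0.05, n)
or_post = base_or + trend + rng.normal(0, 0.05, n)
wa_pre  = base_wa + rng.normal(0, 0.05, n)
wa_post = base_wa + trend + effect + rng.normal(0, 0.05, n)

# Difference-in-differences: the change in Washington minus the change
# in Oregon nets out the common trend and recovers the planted effect.
dd = (wa_post.mean() - wa_pre.mean()) - (or_post.mean() - or_pre.mean())
print(f"DD estimate: {dd:.3f}")  # close to the planted -0.036
```

The Oregon change subtracts out whatever both states share (seasonality, the business cycle), which is exactly why the common-trends assumption matters: the design fails if Washington would have trended differently even without the law.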

Main Results

For this study, we use monthly store-level sales data provided by Oregon's and Washington's respective liquor control agencies from July 2008 through March 2012. Figure 4 plots estimated residuals from a regression of log liquor store sales on a set of store-by-month fixed effects, averaged over state and quarter. The graph possesses three notable features. First, prior to Washington's lending restrictions (indicated by the vertical dashed line), the states' log sales trend in parallel, confirming the plausibility of the "common trends" assumption of the DD model. Second, a persistent gap in the states' sales appears in the same quarter as the law change. This gap is the result of a relatively large downward movement in Washington's sales compared to Oregon's, consistent with a negative effect of the law on sales. Finally, the effect appears to be primarily a level shift, as sales in both states maintain a common upward trend.

Our regression estimates indicate that the introduction of payday lending restrictions reduced liquor store sales by approximately 3.6% (statistically significant at the 1% level). As average Washington liquor sales were approximately $163,000 in the months prior to the law change, this represents a $5,900 decline per store each month. At the state level, the point estimate implies a $23.5 million annual decrease in liquor sales. As Washington State reported that the law decreased payday loans by $932 million from 2009 to 2010, this decline represents approximately 2.5% of the change in total value of loans issued.
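A quick check that the reported magnitudes are internally consistent; all numbers below are taken from the paragraph above:

```python
avg_monthly_sales = 163_000   # pre-law average sales per Washington store
effect = 0.036                # estimated 3.6% reduction in sales

per_store_monthly = avg_monthly_sales * effect
print(f"per-store monthly decline: ${per_store_monthly:,.0f}")  # ~ $5,900

state_annual_decline = 23.5e6  # reported statewide annual sales decrease
loan_value_drop = 932e6        # reported fall in payday loan value, 2009-10
share = state_annual_decline / loan_value_drop
print(f"share of loan decline: {share:.1%}")  # 2.5%, as reported
```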

We see two primary explanations (not mutually exclusive) for the decline in Washington liquor sales in response to the law change. First, the effect may represent a wider permanent reduction in consumption as households lose their ability to cope with unforeseen negative income shocks. Alternatively, the drop in spending may indicate a more direct financing of liquor purchases by individuals with present-biased preferences. The first explanation implies that restrictions on payday lending negatively affect consumer welfare, while the second allows for a positive impact, since individuals with present-biased preferences may be made objectively better off with a restricted choice set.

Zinman (2013) highlights Laibson's (2001) theory of Pavlovian cues as a particularly intriguing explanation for payday loan usage. In these models, consumer "impulsivity" makes instant gratification a special case during dynamic utility maximization, where exposure to a cue can explain dynamically inconsistent behavior. Indeed, Laibson uses liquor as a prime example of a consumption good thought to be influenced by cues, and subsequent experimental research on liquor uncovers evidence consistent with this hypothesis (MacKillop et al. (2010)). In situations where payday lenders locate very near to liquor stores, individuals may be exposed to a cue for alcohol, and then see the lender as a means to satisfy the urge to make an immediate purchase. A lender and liquor store separated by even a brief walk may be far enough apart to allow an individual to resist the urge to obtain both the loan and the liquor. Of course, the cue theory of consumption makes lender-liquor store distance relevant even in circumstances where individuals experience a cue only after borrowing. Lenders locating near liquor stores increase the likelihood that an individual exposed to a cue is financially liquid and able to act on an impulse.

To investigate liquor store and lender proximity, we geocode the stores' and lenders' street addresses, and calculate walking distances for all liquor store-lender pairs within two kilometers of one another. We then repeatedly estimate our preferred specification with a full set of controls on an ever-expanding window of liquor stores, beginning with the stores located within a ten-meter walking distance of a lender in the month prior to the law change, then within 100 meters, within 200 meters, and so on, out to two kilometers. These estimates are presented in Figure 5. The graph demonstrates a negative effect of 9.2% on those liquor stores that had a payday lender located within ten meters in the month before the law change (significant at the 1% level), an effect almost three times as large as the overall estimate. The effect declines rapidly with distance, suggesting that even a small degree of separation is significant. The degree of nonlinearity in the relationship between distance and liquor sales supports the behavioral explanation of demand.
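The expanding-window exercise can be sketched schematically. The distances, decay pattern, and noise below are invented for illustration, with the decay scale chosen so the near-lender and full-sample averages echo the reported -9.2% and -3.6%:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stores: evenly spaced distances (meters) to the nearest
# payday lender, and a planted sales effect that decays with distance.
# The 900 m decay scale is chosen so the full-sample mean lands near
# the -3.6% overall estimate while the nearest stores sit near -9.2%.
n = 1_000
distance = np.linspace(1, 2_000, n)
effect = -0.092 * np.exp(-distance / 900) + rng.normal(0, 0.01, n)

# Re-estimate on an expanding window of stores, as in Figure 5:
# start with stores within 10 m of a lender, then widen the window.
for cutoff in (10, 100, 500, 2_000):
    window = effect[distance <= cutoff]
    print(f"stores within {cutoff:>5} m: mean effect {window.mean():+.3f}")
```

Because the largest effects sit closest to lenders, each widening of the window dilutes the estimate toward the overall average, reproducing the rapid decline with distance that the authors interpret as behavioral.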

Conclusion

Our analysis provides the first empirical evidence of the connection between payday lending and spending on liquor. We uncover a clear reduction in liquor sales resulting from payday lending restrictions. In addition, we find that those liquor stores located very near to lenders at the time of the law change experience declines in sales almost three times as large as the overall average.

This finding is significant because it highlights that a segment of borrowers may be willing to assume substantial financial risk by borrowing in order to engage in alcohol consumption, an activity that carries significant personal risk of its own. The connection between payday lending restrictions and reduced liquor purchases therefore suggests that the benefits of payday lending restrictions extend beyond personal finance and may be large.

Effective payday loan regulation should recognize the potential for greater credit access to help or harm consumers. As Carrell and Zinman (2014) highlight, heterogeneity likely exists within the pool of payday loan users, and external factors will influence the ratio of "productive and counter-productive borrowers." Lending restrictions can seek to reduce the proportion of counterproductive borrowers by prohibiting practices known to harm consumers, including those that rely upon leveraging behavioral responses such as addiction and impulsivity. The behavioral overlap identified in the literature between counterproductive payday loan borrowers and heavy alcohol users suggests a link between the two markets. The decline in liquor sales documented here provides evidence that these regulations may be effective in promoting productive borrowing.

Tuesday, August 25, 2015

Nothing particularly surprising here -- the Great Recession was unusually severe and unusually long, and hence had unusual impacts, but it's good to have numbers characterizing what happened:

Great Recession Job Losses Severe, Enduring: Of those who lost full-time jobs between 2007 and 2009, only about 50 percent were employed in January 2010 and only about 75 percent of those were re-employed in full-time jobs.

The economic downturn that began in December 2007 was associated with a rapid rise in unemployment and with an especially pronounced increase in the number of long-term unemployed. In "Job Loss in the Great Recession and its Aftermath: U.S. Evidence from the Displaced Workers Survey" (NBER Working Paper No. 21216), Henry S. Farber uses data from the Displaced Workers Survey (DWS) from 1984-2014 to study labor market dynamics. From these data he calculates both the short-term and medium-term effects of the Great Recession's sharply elevated rate of job losses. He concludes that these effects have been particularly severe.

Of the workers who lost full-time jobs between 2007 and 2009, Farber reports, only about 50 percent were employed in January 2010 and only about 75 percent of those were re-employed in full-time jobs. This means only about 35 to 40 percent of those in the DWS who reported losing a job in 2007-09 were employed full-time in January 2010. This was by far the worst post-displacement employment experience of the 1981-2014 period.
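The "35 to 40 percent" figure is just the product of the two conditional rates reported above:

```python
employed = 0.50    # share of 2007-09 full-time job losers employed in Jan 2010
full_time = 0.75   # share of those re-employed who held full-time jobs

share_full_time = employed * full_time
print(f"{share_full_time:.1%} employed full-time")  # 37.5%, within 35-40%
```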

The adverse employment experience of job losers has also been persistent. While both overall employment rates and full-time employment rates began to improve in 2009, even those who lost jobs between 2011 and 2013 had very low re-employment rates and, by historical standards, very low full-time employment rates.

In addition, the data show substantial weekly earnings declines even for those who did find work, although these earnings losses were not especially large by historical standards. Farber suggests that the earnings decline measure from the DWS is appropriate for understanding how job loss affects the earnings that a full-time-employed former job-loser is able to command.

The author notes that the measures on which he focuses may understate the true economic cost of job loss, since they do not consider the value of time spent unemployed or the value of lost health insurance and pension benefits.

Farber concludes that the costs of job losses in the Great Recession were unusually severe and remain substantial years later. Most importantly, workers laid off in the Great Recession and its aftermath have been much less successful at finding new jobs, particularly full-time jobs, than those laid off in earlier periods. The findings suggest that job loss since the Great Recession has had severe adverse consequences for employment and earnings.

Disparities in youth outcomes in the United States are striking. For example, among 15-to-24 year olds, the male homicide rate in 2013 was 18 times higher for blacks than for whites. Black males lose more years of potential life before age 65 to homicide than to heart disease, America's leading overall killer. A large body of research emphasizes that, beyond institutional factors, choices and behavior contribute to these outcomes. Those choices include decisions around dropping out of high school, involvement with drugs or gangs, and how to respond to confrontations that could escalate to serious violence.

In "Thinking, Fast and Slow? Some Field Experiments to Reduce Crime and Dropout in Chicago" (NBER Working Paper No. 21178), authors Sara B. Heller, Anuj K. Shah, Jonathan Guryan, Jens Ludwig, Sendhil Mullainathan, and Harold A. Pollack explain these behavioral differences using the psychology of automaticity. Because it is mentally costly to think through every situation in detail, all of us have automatic responses to some of the situations we encounter. These responses—automaticity—are tuned to situations we commonly face.

The authors present results from three large-scale, randomized experimental studies carried out in Chicago with economically disadvantaged male youth. All three experiments show sizable behavioral responses to fairly short-duration, automaticity-reducing interventions that get youths to slow down and behave less automatically in high-stakes situations.

The first intervention (called Becoming a Man, or BAM, developed by Chicago-area nonprofit Youth Guidance) involved 2,740 males in the 7th through 10th grades in 18 public schools on the south and west sides of the city. Some youths were offered an automaticity-reducing program once a week during school or an after-school sports intervention developed by Chicago nonprofit World Sport Chicago. The authors find that participation in the programming reduced arrests over the program year for violent crimes by 44 percent, and non-violent, non-property, non-drug crimes by 36 percent. Participation also increased engagement with school, which the authors estimate could translate into gains in graduation rates of between 7 and 22 percent.

A second study of BAM randomly assigned 2,064 male 9th and 10th graders within nine Chicago public high schools to the treatment or to a control condition. The authors found that arrests of youth in the treatment group were 31 percent lower than arrests in the control group.

The third intervention was delivered by trained detention staff to high-risk juveniles housed in the Cook County Juvenile Temporary Detention Center. The curriculum in this program, while different from the first two interventions, also focused on reducing automaticity. Some 5,728 males were randomly assigned to units inside the facility that did or did not implement the program. The authors found that those who received programming were about 16 percent less likely to be returned to the detention center than those who did not.

The sizable impacts the authors observe from all three interventions stand in stark contrast to the poor record of many efforts to improve the long-term life outcomes of disadvantaged youths. As with all randomized experiments, there is the question of whether these impacts generalize to other samples and settings. The interventions considered in this study would not be costly to expand. The authors estimate that the cost of the intervention for each participant in the first two studies was between $1,178 and $2,000. In the third case, the per-participant cost was about $60 per juvenile detainee. The results suggest that expanding these programs may be more cost-effective than other crime-prevention strategies that target younger individuals.

The authors also present results from various survey measures suggesting the results do not appear to be due to changes in mechanisms like emotional intelligence or self-control. On the other hand, results from some decision-making exercises the authors carried out seem to support reduced automaticity as a key mechanism. Overall, the results suggest that automaticity can be an important explanation for disparities in outcomes.

Monday, August 17, 2015

This is the abstract, introduction, and final section of a recent paper by Joe Stiglitz on theoretical models of deep depressions (as he notes, it's "an extension of the Presidential Address to the International Economic Association"):

Towards a General Theory of Deep Downturns, by Joseph E. Stiglitz, NBER Working Paper No. 21444, August 2015: Abstract This paper, an extension of the Presidential Address to the International Economic Association, evaluates alternative strands of macro-economics in terms of the three basic questions posed by deep downturns: What is the source of large perturbations? How can we explain the magnitude of volatility? How do we explain persistence? The paper argues that while real business cycle and New Keynesian theories with nominal rigidities may help explain certain historical episodes, alternative strands of New Keynesian economics focusing on financial market imperfections, credit, and real rigidities provide a more convincing interpretation of deep downturns, such as the Great Depression and the Great Recession, giving a more plausible explanation of the origins of downturns, their depth and duration. Since excessive credit expansions have preceded many deep downturns, particularly important is an understanding of finance, the credit creation process and banking, which in a modern economy are markedly different from the way envisioned in more traditional models.

Introduction The world has been plagued by episodic deep downturns. The crisis that began in 2008 in the United States was the most recent, the deepest and longest in three quarters of a century. It came in spite of alleged “better” knowledge of how our economic system works, and belief among many that we had put economic fluctuations behind us. Our economic leaders touted the achievement of the Great Moderation.[2] As it turned out, belief in those models actually contributed to the crisis. It was the assumption that markets were efficient and self-regulating and that economic actors had the ability and incentives to manage their own risks that had led to the belief that self-regulation was all that was required to ensure that the financial system worked well, and that there was no need to worry about a bubble. The idea that the economy could, through diversification, effectively eliminate risk contributed to complacency — even after it was evident that there had been a bubble. Indeed, even after the bubble broke, Bernanke could boast that the risks were contained.[3] These beliefs were supported by (pre-crisis) DSGE models — models which may have done well in more normal times, but had little to say about crises. Of course, almost any “decent” model would do reasonably well in normal times. And it mattered little if, in normal times, one model did a slightly better job in predicting next quarter’s growth. What matters is predicting — and preventing — crises, episodes in which there is an enormous loss in well-being. These models did not see the crisis coming, and they had given confidence to our policy makers that, so long as inflation was contained — and monetary authorities boasted that they had done this — the economy would perform well. At best, they can be thought of as (borrowing the term from Guzman (2014)) “models of the Great Moderation,” predicting “well” so long as nothing unusual happens.
More generally, the DSGE models have done a poor job explaining the actual frequency of crises.[4]

Of course, deep downturns have marked capitalist economies since the beginning. It took enormous hubris to believe that the economic forces which had given rise to crises in the past were either not present, or had been tamed, through sound monetary and fiscal policy.[5] It took even greater hubris given that in many countries conservatives had succeeded in dismantling the regulatory regimes and automatic stabilizers that had helped prevent crises since the Great Depression. It is noteworthy that my teacher, Charles Kindleberger, in his great study of the booms and panics that afflicted market economies over the past several hundred years had noted similar hubris exhibited in earlier crises. (Kindleberger, 1978)

Those who attempted to defend the failed economic models and the policies which were derived from them suggested that no model could (or should) predict well a “once in a hundred year flood.” But it was not just a hundred year flood — crises have become common. It was not just something that had happened to the economy. The crisis was man-made — created by the economic system. Clearly, something is wrong with the models.

Studying crises is important, not just to prevent these calamities and to understand how to respond to them — though I do believe that the same inadequate models that failed to predict the crisis also failed in providing adequate responses. (Although those in the US Administration boast about having prevented another Great Depression, I believe the downturn was certainly far longer, and probably far deeper, than it needed to be.) I also believe understanding the dynamics of crises can provide us insight into the behavior of our economic system in less extreme times.

This lecture consists of three parts. In the first, I will outline the three basic questions posed by deep downturns. In the second, I will sketch the three alternative approaches that have competed with each other over the past three decades, suggesting that one is a far better basis for future research than the other two. The final section will center on one aspect of that third approach that I believe is crucial — credit. I focus on the capitalist economy as a credit economy, and how viewing it in this way changes our understanding of the financial system and monetary policy. ...

He concludes with:

IV. The crisis in economics The 2008 crisis was not only a crisis in the economy, but it was also a crisis for economics — or at least that should have been the case. As we have noted, the standard models didn’t do very well. The criticism is not just that the models did not anticipate or predict the crisis (even shortly before it occurred); they did not contemplate the possibility of a crisis, or at least a crisis of this sort. Because markets were supposed to be efficient, there weren’t supposed to be bubbles. The shocks to the economy were supposed to be exogenous: this one was created by the market itself. Thus, the standard model said the crisis couldn’t or wouldn’t happen; and the standard model had no insights into what generated it.

Not surprisingly, as we again have noted, the standard models provided inadequate guidance on how to respond. Even after the bubble broke, it was argued that diversification of risk meant that the macroeconomic consequences would be limited. The standard theory also has had little to say about why the downturn has been so prolonged: Years after the onset of the crisis, large parts of the world are operating well below their potential. In some countries and in some dimensions, the downturn is as bad or worse than the Great Depression. Moreover, there is a risk of significant hysteresis effects from protracted unemployment, especially of youth.

The Real Business Cycle and New Keynesian Theories got off to a bad start. They originated out of work undertaken in the 1970s attempting to reconcile the two seemingly distant branches of economics, macro-economics, centering on explaining the major market failure of unemployment, and microeconomics, the centerpiece of which was the Fundamental Theorems of Welfare Economics, demonstrating the efficiency of markets.[66] Real Business Cycle Theory (and its predecessor, New Classical Economics) took one route: using the assumptions of standard micro-economics to construct an analysis of the aggregative behavior of the economy. In doing so, they left Hamlet out of the play: almost by assumption unemployment and other market failures didn’t exist. The timing of their work couldn’t have been worse: for it was just around the same time that economists developed alternative micro-theories, based on asymmetric information, game theory, and behavioral economics, which provided better explanations of a wide range of micro-behavior than did the traditional theory on which the “new macro-economics” was being constructed. At the same time, Sonnenschein (1972) and Mantel (1974) showed that the standard theory provided essentially no structure for macro-economics — essentially any demand or supply function could have been generated by a set of diverse rational consumers. It was the unrealistic assumption of the representative agent that gave theoretical structure to the macro-economic models that were being developed. (As we noted, New Keynesian DSGE models were but a simple variant of these Real Business Cycles, assuming nominal wage and price rigidities — with explanations, we have suggested, that were hardly persuasive.)

There are alternative models to both Real Business Cycles and the New Keynesian DSGE models that provide better insights into the functioning of the macroeconomy, and are more consistent with micro-behavior, with new developments of micro-economics, and with what has happened in this and other deep downturns. While these new models differ from the older ones in a multitude of ways, at the center of these models is a wide variety of financial market imperfections and a deep analysis of the process of credit creation. These models provide alternative (and I believe better) insights into what kinds of macroeconomic policies would restore the economy to prosperity and maintain macro-stability.

This lecture has attempted to sketch some elements of these alternative approaches. There is a rich research agenda ahead.

Sunday, August 16, 2015

The U.S. Foreclosure Crisis Was Not Just a Subprime Event, by Les Picker, NBER: Many studies of the housing market collapse of the last decade, and the associated sharp rise in defaults and foreclosures, focus on the role of the subprime mortgage sector. Yet subprime loans comprise a relatively small share of the U.S. housing market, usually about 15 percent and never more than 21 percent. Many studies also focus on the period leading up to 2008, even though most foreclosures occurred subsequently. In "A New Look at the U.S. Foreclosure Crisis: Panel Data Evidence of Prime and Subprime Borrowers from 1997 to 2012" (NBER Working Paper No. 21261), Fernando Ferreira and Joseph Gyourko provide new facts about the foreclosure crisis and investigate various explanations of why homeowners lost their homes during the housing bust. They employ microdata that track outcomes well past the beginning of the crisis and cover all types of house purchase financing—prime and subprime mortgages, Federal Housing Administration (FHA)/Veterans Administration (VA)-insured loans, loans from small or infrequent lenders, and all-cash buyers. Their data contain information on over 33 million unique ownership sequences in just over 19 million distinct owner-occupied housing units from 1997-2012.

The researchers find that the crisis was not solely, or even primarily, a subprime sector event. It began that way, but quickly expanded into a much broader phenomenon dominated by prime borrowers' loss of homes. There were only seven quarters, all concentrated at the beginning of the housing market bust, when more homes were lost by subprime than by prime borrowers. In this period 39,094 more subprime than prime borrowers lost their homes. This small difference was reversed by the beginning of 2009. Between 2009 and 2012, 656,003 more prime than subprime borrowers lost their homes. Twice as many prime borrowers as subprime borrowers lost their homes over the full sample period.

The authors suggest that one reason for this pattern is that the number of prime borrowers dwarfs that of subprime borrowers and the other borrower/owner categories they consider. The prime borrower share averages around 60 percent and did not decline during the housing boom. Although the subprime borrower share nearly doubled during the boom, it peaked at just over 20 percent of the market. Subprime's increasing share came at the expense of the FHA/VA-insured sector, not the prime sector.

The authors' key empirical finding is that negative equity conditions can explain virtually all of the difference in foreclosure and short sale outcomes of prime borrowers compared to all cash owners. Negative equity also accounts for approximately two-thirds of the variation in subprime borrower distress. Both are true on average, over time, and across metropolitan areas.

None of the other 'usual suspects' raised by previous research or public commentators—housing quality, race and gender demographics, buyer income, and speculator status—were found to have had a major impact. Certain loan-related attributes such as initial loan-to-value (LTV), whether a refinancing occurred or a second mortgage was taken on, and loan cohort origination quarter did have some independent influence, but much weaker than that of current LTV.

The authors' findings imply that large numbers of prime borrowers who did not start out with extremely high LTVs still lost their homes to foreclosure. They conclude that the economic cycle was more important than initial buyer, housing and mortgage conditions in explaining the foreclosure crisis. These findings suggest that effective regulation is not just a matter of restricting certain exotic subprime contracts associated with extremely high default rates.

Monday, August 10, 2015

What Works? A Meta Analysis of Recent Active Labor Market Program Evaluations, by David Card, Jochen Kluve, and Andrea Weber, NBER Working Paper No. 21431 Issued in July 2015: We present a meta-analysis of impact estimates from over 200 recent econometric evaluations of active labor market programs from around the world. We classify estimates by program type and participant group, and distinguish between three different post-program time horizons. Using meta-analytic models for the effect size of a given estimate (for studies that model the probability of employment) and for the sign and significance of the estimate (for all the studies in our sample) we conclude that: (1) average impacts are close to zero in the short run, but become more positive 2-3 years after completion of the program; (2) the time profile of impacts varies by type of program, with larger gains for programs that emphasize human capital accumulation; (3) there is systematic heterogeneity across participant groups, with larger impacts for females and participants who enter from long term unemployment; (4) active labor market programs are more likely to show positive impacts in a recession.
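The core mechanics behind pooling effect sizes across studies can be illustrated with a minimal inverse-variance-weighted estimator. This is a generic sketch of fixed-effect meta-analytic pooling, not the authors' actual specification (their models also condition on program type, participant group, and time horizon), and the effect sizes below are invented for illustration:

```python
# Minimal sketch of inverse-variance (fixed-effect) meta-analytic pooling.
# Each study contributes an effect size and its standard error; studies
# with more precise estimates receive proportionally more weight.

def pool_effects(effects, ses):
    """Return the inverse-variance-weighted pooled estimate and its SE."""
    weights = [1.0 / se**2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se

# Hypothetical short-run impacts on employment probability from three
# program evaluations (made-up numbers, not from the paper)
effects = [0.01, -0.02, 0.03]
ses     = [0.01,  0.02, 0.015]

est, se = pool_effects(effects, ses)
print(round(est, 4), round(se, 4))  # → 0.0108 0.0077
```

Random-effects variants (e.g. DerSimonian-Laird) add a between-study variance term to each weight, which matters when, as finding (3) suggests, impacts genuinely differ across participant groups.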

And:

Clearing Up the Fiscal Multiplier Morass: Prior and Posterior Analysis, by Eric M. Leeper, Nora Traum, and Todd B. Walker, NBER Working Paper No. 21433 Issued in July 2015: We use Bayesian prior and posterior analysis of a monetary DSGE model, extended to include fiscal details and two distinct monetary-fiscal policy regimes, to quantify government spending multipliers in U.S. data. The combination of model specification, observable data, and relatively diffuse priors for some parameters lands posterior estimates in regions of the parameter space that yield fresh perspectives on the transmission mechanisms that underlie government spending multipliers. Posterior mean estimates of short-run output multipliers are comparable across regimes—about 1.4 on impact—but much larger after 10 years under passive money/active fiscal than under active money/passive fiscal—means of 1.9 versus 0.7 in present value.
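The "present value" multipliers quoted in the abstract are, in this literature, conventionally computed by discounting the cumulative output response relative to the cumulative spending response. The paper's exact discounting may differ; a standard form is:

```latex
% Present-value spending multiplier at horizon k (in quarters),
% with discount factor \beta; the impact multiplier is the k = 0 case.
\[
  \mathcal{M}_{PV}(k) \;=\;
  \frac{\mathbb{E}_t \sum_{j=0}^{k} \beta^{\,j}\, \Delta Y_{t+j}}
       {\mathbb{E}_t \sum_{j=0}^{k} \beta^{\,j}\, \Delta G_{t+j}}
\]
```

On this reading, "about 1.4 on impact" corresponds to the k = 0 case, while the 1.9 versus 0.7 figures compare the ten-year (k = 40 quarters) present-value multipliers across the two monetary-fiscal regimes.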

Thursday, August 06, 2015

“Buying Locally,” G. J. Mailath, A. Postlewaite & L. Samuelson (2015): Arrangements where agents commit to buy only from selected vendors, even when there are more preferred products at better prices from other vendors, are common. Consider local currencies like “Ithaca Hours”, which can only be used at other participating stores and which are not generally convertible, or trading circles among co-ethnics even when trust or unobserved product quality is not important. The intuition people have for “buying locally” is to, in some sense, “keep the profits in the community”; that is, even if you don’t care at all about friendly local service or some other utility-enhancing aspect of the local store, you should still patronize it. The fruit vendor should buy from the local bookstore even when her selection is subpar, and the book vendor should in turn patronize you even when fruits are cheaper at the supermarket.

At first blush, this seems odd to an economist. Why would people voluntarily buy something they don’t prefer? What Mailath and his coauthors show is that, actually, the noneconomist intuition is at least partially correct when individuals are both sellers and buyers. Here’s the idea. ....

One thing that isn’t explicit in the paper, perhaps because it is too trivial despite its importance, is how buy local arrangements affect welfare..., an intriguing possibility is that “buy local” arrangements may not harm social welfare at all, even if they are beneficial to in-group members. ...