Sunday, December 28, 2014

If you thought that the Greek debt crisis was over, think again. Tomorrow, the Greek parliament will try, for the third time, to agree on who will be the next president. If parliamentarians cannot agree (and that now seems likely), we are headed for the first potential rock in the road to recovery for 2015. There is a real danger that the Greek debt crisis will re-emerge with a vengeance and, once again, throw world financial markets into turmoil.

Under the rules of the Greek constitution, if no candidate receives an absolute majority, parliament will be dissolved and there will be a general election, most likely in early February. If that happens, all signs point to a victory by Syriza, a left-of-center party that proposes to renegotiate the Greek debt.

A Syriza victory would force the core Euro countries to decide either to give up on the project of European integration, or to move to the next stage of full-scale fiscal union, in which German taxpayers assume responsibility for Greek debt.

If the Euro breaks apart, the fallout will be global. The world economy has been hit by falling demand for raw materials, and oil is trading at less than US$60 a barrel. Some of this is caused by newly discovered proven reserves, and that is a good thing. But Jim Hamilton has argued that falling world demand is a big part of the reason for lower oil prices, and that does not bode well for a truly global recovery.

The US economy has been the single flickering light in a dark sky. If the Euro collapses, the knock-on effect will derail the US recovery and send the entire world economy back into recession.

Is a Greek default and a breakup of the Euro the most likely outcome? Probably not. But it is the first of many building storms that the global economy will need to weather in 2015. All eyes on Greece tomorrow!

Saturday, December 13, 2014

I have lost count of the number of times I have heard students and faculty repeat, in seminars, the idea that “all models are wrong”. This aphorism, attributed to George Box, is the battle cry of the Minnesota calibrator, a breed of macroeconomist inspired by Ed Prescott, one of the most important and influential economists of the last century.

All models are wrong... all models are wrong...

Of course all models are wrong. That is trivially true: it is the definition of a model. But the cry has been used for three decades to poke fun at attempts to use serious econometric methods to analyze time series data. Time series methods were inconvenient to the nascent Real Business Cycle program that Ed pioneered, because the models that he favored were, and still are, overwhelmingly rejected by the facts. That is inconvenient.

Ed’s response was pure genius. If the model and the data are in conflict, the data must be wrong. Time series econometrics, according to Ed, was crushing the acorn before it had time to grow into a tree. His response was not only to reformulate the theory, but also to reformulate the way in which that theory was to be judged. In a puff of calibrator’s smoke, the history of time series econometrics was relegated to the dustbin of history, to take its place alongside alchemy, the ether, and the theory of phlogiston.

Thursday, December 11, 2014

I've followed, with a great deal of interest, the debate between John Cochrane and Paul Krugman. I have a lot in common with both of them. I agree with Paul that, for the most part, the IS-LM model provides the right answer to policy questions. I agree with John that we have learned a lot since 1955, when Paul Samuelson invented the neoclassical synthesis.

But there were a couple of ideas in the General Theory that have been buried by MIT macro. The first, and most important, is that high unemployment is an equilibrium. Repeat after me. E-Q-U-I-L-I-B-R-I-U-M. The second is that animal spirits are an independent causal factor that determines which equilibrium the private economy will select.

Let me ask a simple question that you should feel free to answer. And do please also try to guess the PK and JC answers. (To answer this question, you will need to arm yourself with a knowledge of the textbook IS-LM model. A good introduction would be Greg Mankiw's textbook or, the book I learned from, the intermediate text by Dornbusch, Fischer and Startz.)

Saturday, December 6, 2014

This is the second post to advertise the work of a UCLA graduate student who is looking for a job this year. My first post introduced Sangyup Choi who is working on uncertainty shocks in emerging markets. This post introduces Chan Mang who is working on the implications of term structure models for the foreign exchange market.

Chan Mang

Chan Mang graduated from UCLA two years ago. In 2012 he was awarded a postdoctoral position at the prestigious National University of Singapore, and last year he worked in the private sector. Chan's research builds on the widely cited bond pricing model developed by John Cochrane and Monika Piazzesi.

Friday, November 14, 2014

I've been teaching a class on intermediate macroeconomics this quarter. Increasingly, over the past twenty years or more, intermediate macro classes at UCLA (and in many other top schools) have focused almost exclusively on economic growth. That reflected a bias in the profession, initiated by Finn Kydland and Ed Prescott, who persuaded macroeconomists to use the Ramsey growth model as a paradigm for business cycle theory. According to this Real Business Cycle view of the world, we should think about consumption, investment and employment 'as if' they were the optimal choices of a single representative agent with superhuman perception of the probabilities of future events. Although there were benefits to thinking more rigorously about inter-temporal choice, the RBC program as a whole led several generations of the brightest minds in the profession to stop thinking about the problem of economic fluctuations and to focus instead on economic growth. Kydland and Prescott assumed that labor is a commodity like any other and that any worker can quickly find a job at the market wage. In my view, the introduction of the shared belief that the labor market clears in every period was a huge misstep for the science of macroeconomics that will take a long time to correct.

In my intermediate macroeconomics class, I am teaching business cycle theory from the perspective of Keynesian macroeconomics, but I am grounding old Keynesian concepts in the theory of labor market search, based on my recent books (2010a, 2010b) and articles (2011, 2012, 2013a, 2013b). I am going to use this blog to explain some insights, adapted from my understanding of Keynes' General Theory, that undergraduates can easily absorb. Today's post is about measuring employment. In later posts, I will take up the challenge of constructing a theory to explain unemployment.

Ever since Robert Lucas introduced the idea of continuous labor market clearing, the idea that it may be useful to talk of something called 'involuntary unemployment' has been scoffed at by the academic chattering classes. It's time to fight back. The concept of 'involuntary unemployment' does not describe a loose notion that characterizes the sloppy work of heterodox economists from the dark side. It is a useful category that describes a group of workers who have difficulty finding jobs at existing market prices.

Sunday, November 9, 2014

Early in the New Year, economists from all over the world will congregate in Boston for the 2015 annual meetings of the American Economics Association. The main purpose of these meetings is to interview new Ph.D. candidates for potential jobs as academics and in the public and private sectors as research and/or policy economists.

Sangyup Choi

As an academic economist at UCLA, my job includes teaching undergraduates, carrying out economic research for publication in books and journals and, (my favorite part), training new Ph.D. economists. Teaching graduate students is a rewarding experience for an academic as we get to watch our students progress from undergraduates to colleagues. What begins as a teaching experience in year 1 ends up as a learning experience in year 5.

Today's blog features my student, Sangyup (Sam) Choi, who is working on the impact of financial market volatility on emerging market economies. My colleague Aaron Tornell and I are Sam's principal advisors.

Sam is studying the VIX and its impact on economic activity. This has been a hot topic amongst macroeconomists ever since Nick Bloom showed, in a paper published in Econometrica, that shocks to uncertainty are a causal factor in U.S. recessions. What, you ask, is the VIX?

The VIX is an index of volatility that goes up when traders are less certain about the future. In his Econometrica paper, Nick showed that shocks to the VIX are an independent causal factor that helps to predict future U.S. output. Here is a graph of the VIX for the period 2000 to 2014.

Figure 1: The VIX from 2000 to 2014

In a paper published last year in Economics Letters, Sam showed that Nick’s results are sensitive to the period of study. The VIX does predict future output in data from 1950 through 1982, but that result goes away after 1983. During the largest recession in postwar history, the VIX jumped by a factor of four (see Figure 1), yet it did not have a significant independent impact on the U.S. economy once other explanatory variables have been accounted for. That in itself is surprising. But it gets better.

Wednesday, October 15, 2014

Volatility has returned to the stock market and most of the gains of 2014 were wiped out in the last week. Is it time to panic? Not yet!

There is a close relationship between changes in the value of the stock market and changes in the unemployment rate one quarter later. My research (here and here) shows that a persistent 10% drop in the real value of the stock market is followed by a persistent 3-percentage-point increase in the unemployment rate. The important word here is persistent. If the market drops 10% on Tuesday and recovers again a week later (not an unusual movement in a volatile market), there will be no impact on the real economy. For a market panic to have real effects on Main Street, it must be sustained for at least three months. And there is no sign that that is happening. Yet.
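To make the kind of relationship described above concrete, here is a minimal sketch (not my actual estimation, and the numbers are made up for illustration): an ordinary least squares regression of the one-quarter-ahead change in the unemployment rate on the percentage change in the real value of the stock market.

```python
# Illustrative sketch: OLS slope of the change in unemployment one
# quarter later on the % change in real stock prices. The data below
# are hypothetical and chosen only to mimic a slope near -0.3
# (a persistent 10% drop ~ 3-point rise in unemployment).

def ols_slope(x, y):
    """Ordinary least squares slope of y on x (intercept included)."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov_xy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    var_x = sum((xi - mean_x) ** 2 for xi in x)
    return cov_xy / var_x

# Hypothetical quarterly observations.
d_stock = [-10.0, 5.0, 2.0, -8.0, 12.0, -3.0]   # % change, real stock prices
d_unemp = [3.1, -1.4, -0.5, 2.3, -3.6, 0.8]     # change in unemployment rate, t+1

print(ols_slope(d_stock, d_unemp))  # roughly -0.3 on these made-up data
```

With real data one would of course use quarterly averages, control for other variables, and test for persistence; this only illustrates the sign and rough magnitude of the claimed relationship.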

It is, of course, possible that movements in the stock market are only apparently causal. Perhaps the clever people who trade in the markets are simply prescient in their ability to foresee the very bad fundamentals that are driving the real economy. It is also possible that sometimes market participants panic, and that panic has real consequences when the rest of us find that our houses and pension plans are suddenly worthless. My own theoretical work supports the latter hypothesis, but reasonable people can disagree.

So: should you be worried that we are about to enter a double-dip recession? In my view, not yet, because, as of right now, the market shows no signs of a persistent drop when measured in real terms. When (and if) the Yellen Fed follows through with its withdrawal of QE, we may be looking at a very different situation. Hang on to your hats!

Thursday, October 9, 2014

I have been slow to chime in on Thomas Piketty’s book, Capital in the 21st Century, but it is hard to ignore the chatter that the book has generated from those on all sides of the political spectrum. The book sheds welcome light on the topic of income and wealth inequality and it has rekindled a debate in the United States and Europe on an age-old question: Should we care if some individuals earn much more than others?

As individuals in a modern democracy we make social decisions about how much of each good to produce and consume through free trade in a market economy. The rules by which we trade with others are determined through democratic elections in which we give power to our representatives to transfer resources from one human being to another. And we interact with each other through conversations, free association and social media or through more organized forms of persuasion such as newspapers and television stations.

As economists, we are sometimes justly accused by other social scientists of taking a narrow view of human nature. A human being, to the neoclassical economist, is a preference ordering over all possible actions that he or she may take over the course of a lifetime. That preference ordering is fixed at birth and swings into action at the age of consent, at which time each of us exercises our endowed ability to choose among competing alternatives to maximize our happiness.

That, of course, is poppycock. The view of homo economicus as a utility-seeking machine is not to be found in Smith, who had a much richer view of human nature, as evidenced by his “other book”, The Theory of Moral Sentiments. Nor is it to be found in John Stuart Mill’s eloquent defense of free speech in his essay On Liberty. Both of those eminent social scientists would, I am certain, have been open to the idea that our opinions are formed through rational argument with other human beings. Our preference orderings do determine our actions; but they are not preordained. Nature and nurture are equally important determinants of human action.

Sunday, September 21, 2014

John Cochrane supports the case (forcefully made by Anat Admati) for higher capital requirements, citing excellent pieces by Pat Regnier at Time and Peter Coy at Business Week who explain exactly what this does, and does not, mean. I agree: we need banks to hold more capital. But is that enough?

The following passages are extracts from my recent paper in the Manchester School on the role of the Financial Policy Committee as a guardian of financial stability. I make the case that financial markets are inefficient because we cannot trade in markets that open before we are born. That fact is an important source of market incompleteness that I call the "absence of prenatal financial markets".

We all agree that financial crises occur. We disagree as to their cause. Some economists argue that markets are not only informationally efficient but also Pareto efficient. On that view, the boom and the bust are a consequence of the natural flow of knowledge acquisition in a capitalist economy. They are the price of progress. I disagree.

Dynamic Stochastic General Equilibrium Models (DSGE) often have many equilibria. I have long argued that we should exploit that idea to explain real world phenomena. For example, multiple equilibrium models can help to explain why "animal spirits" drive real world markets (see my survey here).

In 2004, Thomas Lubik and Frank Schorfheide published an influential paper which applied that idea to US monetary policy. A number of authors have taken up their method, but the technique they used is not very easy to apply in practice. Our paper shows how to solve and estimate models with indeterminate equilibria using readily available software packages such as Chris Sims's code Gensys, or the widely used Matlab-based package Dynare.

Sunday, August 17, 2014

In my last post on QE, I quoted a paper by James Hamilton and Cynthia Wu that provides some empirical evidence for the importance of the asset composition of the Fed's balance sheet and its effect on the term structure of interest rates. They have posted their data online and it makes for interesting bedtime reading.

Hamilton and Wu combined their data with evidence from the yield curve. They found that qualitative easing can be effective at the lower bound.

To construct these estimates, they used a theoretical model developed by Vayanos and Vila which assumes that there are investors who have a 'preferred habitat'.

The Hamilton-Wu results are important. I ran some regressions of term premiums on bond supply by maturity, using their data, and I found the same orders of magnitude in the response of interest rates that they found. But there is an interesting subtext to their analysis, discussed in Section 8 of their paper. The Fed and the Treasury have been following conflicting policies. David Beckworth made the same point on his blog in 2012.

Quantitative Easing took place in three phases: QE1 from 11/08 to 03/10, QE2 from 11/10 to 06/11, and QE3, which is ongoing. Along with monetary expansion, the Fed attempted to refinance its portfolio by selling at the short end and buying at the long end of the yield curve. But at the same time, the Treasury was refinancing its own portfolio. The end result was that the Treasury restructuring completely swamped any effect of Fed operations at the long end of the yield curve.

Figure 1

In Figure 1 I have broken down the System Open Market Account (SOMA) of Fed holdings of Treasuries by maturity as a percentage of all outstanding Treasuries, using the Hamilton Wu data set. The two vertical red lines are the beginning and end of the last recession and the vertical black line marks the collapse of Lehman Brothers.

My worry is that, while CAPE has historically been a good predictor of future returns, the level at which the FPC should be ready to intervene would have to be set so low that it might be fairly useless. Otherwise the safety net would just encourage increased irrational exuberance.

My response ...

I am not arguing just for a Greenspan Put, but also for a Yellen Call. It is just as dangerous to allow market bubbles to inflate as it is to allow markets to crash.

Saturday, August 9, 2014

Noah Smith raises the question: can the Fed influence the interest rate? Although the answer may seem obvious, the question itself reflects a conundrum for neoclassical theory. It is representative of a related but more comprehensive question: does the asset composition of the central bank balance sheet matter?

Let me set aside, for now, the deep question: what is money? I will take for granted the fact that the liabilities of the central bank are special. Perhaps this is due to legal restrictions, as Neil Wallace has suggested, or perhaps it is a matter of social convention. My focus here is not on central bank liabilities, but on central bank assets.

Chris and David chose to speak about monetary policy and the role of the Monetary Policy Committee. I chose, instead, to focus on the task that faces the newly formed Bank of England's Financial Policy Committee. This post will focus on one of the points I made in my talk, the distinction between what I call institutional and systemic explanations of the 2008 financial crisis. My complete argument is published in a forthcoming paper "Financial Stability and the Role of the Financial Policy Committee", that will appear in The Manchester School.

Recent events have generated widespread consensus that the financial markets are not working as they should. But there is little agreement as to why. One explanation is that financial frictions can sometimes become more disruptive than usual and that these frictions can be corrected by regulating financial institutions. An alternative explanation, which I have promoted in my own work, is that financial markets do not allocate capital efficiently. The failure of financial markets occurs because people who will be born in the future cannot trade in current markets. I call this the absence of prenatal financial markets.

Monday, May 19, 2014

Christian Zimmerman draws attention to a new paper by Paolo Gelain and Marco Guerrazi, "A demand-driven search model with self-fulfilling expectations: The new ‘Farmerian’ framework under scrutiny"

Here is the abstract from the paper

In this paper, we implement Bayesian econometric techniques to analyze a theoretical framework built along the lines of Farmer’s micro-foundation of the General Theory. Specifically, we test the ability of a demand-driven search model with self-fulfilling expectations to match the behaviour of the US economy over the last thirty years. The main findings of our empirical investigation are the following. First, all over the period, our model fits data very well. Second, demand shocks are the most relevant in explaining the variability of concerned variables. In addition, our estimates reveal that a large negative demand shock caused the Great Recession via a sudden drop of confidence. Overall, those results are consistent with the main features of the New ‘Farmerian’ Economics as well as to latest demand-side explanations of the finance-induced recession.

In Christian's words...

Roger Farmer’s recent work has been causing quite a stir, especially as it seems to validate some the things that happened during the recent crisis. This paper provides an empirical test of Farmer’s theory and shows that he is indeed onto something.

Christian's website was set up to promote discussion of research on DSGE models and he invites visitors to leave comments on the papers he highlights. Thanks Christian, for drawing attention to this very interesting piece.

Sunday, May 4, 2014

Simon Wren-Lewis seeks a serious debate with our heterodox colleagues, and judging by the excellent comment thread that appears on his post, there are plenty of heterodox economists who are ready and willing to take up the challenge. This is a welcome debate.

Simon defends his view of orthodoxy, by which he means New Keynesian economics. In its simplest form, New Keynesian economics is a three-equation model that explains the behavior of the nominal interest rate, the "output gap" and the inflation rate.
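For readers who have not seen it written down, the three-equation model typically looks something like this (a sketch in one common textbook parameterization; notation varies across authors, and this is not necessarily Simon's exact formulation):

```latex
\begin{aligned}
x_t &= \mathbb{E}_t x_{t+1} - \sigma\left(i_t - \mathbb{E}_t \pi_{t+1} - r^n\right)
  && \text{(IS curve)} \\
\pi_t &= \beta\,\mathbb{E}_t \pi_{t+1} + \kappa\, x_t
  && \text{(Phillips curve)} \\
i_t &= \phi_\pi \pi_t + \phi_x x_t
  && \text{(policy rule)}
\end{aligned}
```

Here $x_t$ is the output gap, $\pi_t$ inflation, $i_t$ the nominal interest rate, and $r^n$ the natural real rate; $\sigma$, $\beta$, $\kappa$, $\phi_\pi$, $\phi_x$ are parameters.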

I agree firmly with Simon that, from a policy perspective, we should not care one iota whether NK economics has anything to do with what Keynes might or might not have thought. But from the perspective of the history of thought, we should not mislead our students with false labels. The New Keynesian model is neither new nor Keynesian. It is a beautiful formalization of David Hume's verbal argument in his 1742 essay "Of Money"; an early piece on the Quantity Theory of Money that every macroeconomics student should read at least once.

Sunday, April 27, 2014

The first paperback English language edition of my book How the Economy Works has just been published by Oxford University Press. I hope this edition finds a new audience that will take the time to consider the ideas I present. The book provides not only a history of contemporary economic thought, but also some fresh ideas for dealing with financial crises and for the design of a new financial architecture to prevent them from recurring.

Here are a few excerpts from the new Preface.

How the Economy Works (HTEW) first appeared in 2010. By the time of its publication, the world was in the throes of the worst recession since the 1930s. Thirty-seven months after the NBER declared the recession over, in June of 2009, the U.S. economy is still a long way from regaining all of the jobs lost during the crisis. I wrote this book to help you understand why this happened and to offer some new ideas to prevent similar financial crises from recurring in the future.

Wednesday, April 23, 2014

Students at the University of Manchester in England are unhappy with the way they are being taught, and they are not alone. In a widely publicized and highly articulate report, the Post-Crash Economics Society, a group of Manchester University students, is highly critical of "business as usual" in the economics curriculum in the wake of the crisis.

Wednesday, April 16, 2014

I am teaching two graduate classes this quarter, and that gives me the opportunity to publicize some ideas that I'm teaching in my classes, and that I have been working on for some time. I plan to put up a series of posts explaining the ideas in my 2010 book, Expectations, Employment and Prices. I will also talk about extensions of the book that I have subsequently published in peer reviewed journals.

Here is how I characterized the project in the preface to EEP.

I have long believed that modern interpreters of Keynes missed the main point of The General Theory: high unemployment is an equilibrium phenomenon that can persist for a very long time if nothing is done by a government to correct the problem. This was the point of my 1984 paper, which argued that the natural rate hypothesis is false. In the intervening years, I have had time to refine this idea. Expectations, Employment and Prices is the result.

Sunday, March 30, 2014

I just returned from a conference at the San Francisco Fed on Monetary Policy and Financial Markets, where I discussed a paper by Fumio Hayashi and Junko Koeda. They use a novel way of identifying the effects of policy during periods of Quantitative Easing, one which recognizes that policy is different when interest rates are at the lower bound. An interesting takeaway from their paper is that QE is effective at reducing the output gap.

The Hayashi-Koeda paper suggests the following research topic for Ph.D. students. H-K use an SVAR, i.e., a vector autoregression that is identified by making assumptions about the covariances of the variables. See Stock and Watson here for a summary of what that means.

The novelty in Hayashi-Koeda is to allow for different coefficients of the VAR when the interest rate is at the lower bound. The pitfall here is that, although SVAR stands for "structural vector autoregression", there really isn't anything structural about it. An SVAR is just the reduced form of a DSGE model. And that means that the coefficients of the equations cannot be relied upon to remain constant if the policy rule changes.
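The point about reduced forms can be sketched in standard notation (this is a generic illustration, not Hayashi and Koeda's exact specification):

```latex
A_0\, y_t = A_1\, y_{t-1} + \varepsilon_t
\quad\Longrightarrow\quad
y_t = \underbrace{A_0^{-1} A_1}_{B}\, y_{t-1} + A_0^{-1}\varepsilon_t .
```

The matrix $B$ that the VAR estimates mixes policy and non-policy parameters. If the policy rule embedded in $A_0$ changes, for example when the interest rate hits the lower bound, then $B$ changes too; a regime-switching version simply indexes the matrices by a state, $A_0(s_t)$ and $A_1(s_t)$.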

Nothing new there -- we've known that for a long time. I was asked to discuss the paper because I've worked here (with Dan Waggoner and Tao Zha) on DSGE models where the parameters switch occasionally from one regime to another. Here is the interesting research topic: how are regime-switching SVARs of the kind estimated by Hayashi and Koeda related to the Markov-switching DSGE models that I studied with Dan and Tao?

Thursday, March 20, 2014

Beginning with the work of Robert Lucas and Leonard Rapping in 1969, macroeconomists have modeled the labor market as if the wage always adjusts to equate the demand and supply of labor.

I don't think that's a very good approach. It's time to drop the assumption that the demand for labor equals the supply of labor.

Why would you want to delete the labor market clearing equation from an otherwise standard model? Because setting the demand equal to the supply of labor is a terrible way of understanding business cycles.

Tuesday, March 11, 2014

I have just completed a new paper on asset prices, "Asset Prices in a Lifecycle Economy". The paper is available here from the NBER or here from my website. This is a good time to comment on asset price volatility and the apparently contradictory findings of two of the 2013 Nobel Laureates because my paper sheds light on this issue.

Fama won the Nobel Prize for showing that financial markets are efficient. He meant that it is not possible to systematically make money by trading financial assets, because market prices already incorporate all available information.

Shiller won the Nobel Prize for showing that financial markets are inefficient. He meant that the ratio of the price of a stock to the dividends it earns returns to a long-run average value; hence, an investor can profit by holding undervalued stocks for very long periods.

These apparently contradictory results are consistent because Fama and Shiller are referring to different concepts of efficiency.

When Fama says that financial markets are efficient, he means informational efficiency. There is a second concept that economists call Pareto efficiency. This means that there is no possible intervention by government that can improve the welfare of one person without making someone else worse off. The fact that markets are informationally efficient does not necessarily mean that they are Pareto efficient and that fact helps to explain why financial markets appear to do such crazy things over short periods of time.

Saturday, March 8, 2014

A common mistake amongst Ph.D. students is to place too much weight on the ability of mathematics to solve an economic problem. They take a model off the shelf and add a new twist. A model that began as an elegant piece of machinery, designed to illustrate a particular economic issue, goes through five or six amendments from one paper to the next. By the time it reaches the nth iteration, it looks like a dog designed by committee.

Mathematics doesn't solve economic problems. Economists solve economic problems. My advice: never formalize a problem with mathematics until you have already figured out the probable answer. Then write a model that formalizes your intuition and beat the mathematics into submission. That last part is where the fun begins because the language of mathematics forces you to make your intuition clear. Sometimes it turns out to be right. Sometimes you will realize your initial guess was mistaken. Always, it is a learning process.

Saturday, March 1, 2014

1. The production function: Y=F(L). Output (Y) is a function of employment (L).

2. A "classical" labour demand curve: W/P=MPL(L). The real wage (W/P) equals the Marginal Product of Labour, which is a decreasing function of employment. This is Keynes' "first classical postulate", which he agreed with.

3. A "classical" labour supply curve: W/P=MRS(L,Y). The real wage equals the Marginal Rate of Substitution between labour (or leisure) and output (or consumption). This is Keynes' "second classical postulate", which he disagreed with (except at "full employment").

From 1 and 2, plus some tedious math, we can derive what Keynes calls "the aggregate supply function": PY/W = S(L). It shows the value of output, measured in wage units, as a function of employment. It is substantively identical to the Short Run Aggregate Supply Curve in intermediate macro textbooks that assume sticky nominal wages: Y=H(P/W), which uses the exact same equations 1 and 2, but presents the same solution differently.

From 1 and 3, plus some tedious math, we can derive a second "aggregate supply function", that is not in the General Theory: PY/W = Z(L). It is substantively identical to the short run aggregate supply curve implicit in New Keynesian models, which assume sticky P and perfectly flexible W, so the economy is always on the labour supply curve and always on the production function.

From 1 and 2 and 3, plus some tedious math, we can solve for Y, L, and W/P, and derive a third aggregate supply function: Y=Y*. This is the textbook Long Run Aggregate Supply curve. It is identical to the solution we could get if we solved for the levels of Y, L, and W/P that satisfied both the first and second "aggregate supply functions".
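The "tedious math" in the first case is actually short. Writing the marginal product of labour as $F'(L)$, equations 1 and 2 give:

```latex
\frac{PY}{W} \;=\; \frac{Y}{W/P} \;=\; \frac{F(L)}{F'(L)} \;\equiv\; S(L),
```

which is increasing in $L$ whenever $F$ is increasing and concave: the numerator rises with employment while the marginal product in the denominator falls.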

Nick's first supply curve is the only supply curve in The General Theory. All else is due to misinterpretations by later economists who tried, ineffectively in my view, to make sense of what Keynes really meant. We don't need sticky prices (supply curve number 2) and we don't need to reintroduce the second classical postulate through the back door (supply curve number 3). That is 1950s MIT talking, and it led us down the wrong path.

Monday, February 24, 2014

Simon Wren-Lewis has a great post today on what makes a Keynesian. Here is my answer together with a quiz for wannabe Keynesians.

First, let me delve into a little highbrow theory.

Figure 1: The Keynesian Cross

Figure 1 is a picture that goes by the name of the Keynesian cross. On the horizontal axis is income: the value of all wages, rents and profits earned from producing goods and services in a given year. On the vertical axis is planned expenditure: the value of all spending on goods and services produced in the economy in a given year. Since this is a closed economy, all expenditure is allocated to one of three categories: expenditure on consumption goods, expenditure on investment goods, and government purchases. Since every dollar spent must generate income for someone, in a Keynesian equilibrium income must equal planned expenditure.
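The equilibrium condition can be computed directly. Here is a minimal sketch with a linear consumption function, $C = a + bY$; the parameter values are illustrative assumptions, not numbers from the post:

```python
# Keynesian-cross equilibrium in a closed economy: Y = C + I + G,
# with a linear consumption function C = a + b*Y.
# a = autonomous consumption, b = marginal propensity to consume.
# All parameter values below are hypothetical.

def equilibrium_income(a, b, investment, government):
    """Solve Y = a + b*Y + I + G for Y. The multiplier is 1/(1-b)."""
    if not 0.0 <= b < 1.0:
        raise ValueError("marginal propensity to consume must lie in [0, 1)")
    return (a + investment + government) / (1.0 - b)

# With a = 100, b = 0.8, I = 200, G = 300, the multiplier is 5,
# so equilibrium income is roughly 5 * 600 = 3000.
print(equilibrium_income(100, 0.8, 200, 300))
```

Note that a one-dollar rise in government purchases raises equilibrium income by the multiplier $1/(1-b)$, which is the mechanism the Keynesian cross is built to illustrate.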

Saturday, February 15, 2014

DSGE models have been the subject of much attention recently on the blogs. Simon Wren-Lewis suggests that DSGE modelers made a Faustian bargain and offers a partial defense. David Glasner is distinctly uneasy with the DSGE approach, and although Paul Krugman remains eclectic, he wants to retain the IS-LM model as part of his portfolio.

In this book I take a point of view that is becoming less controversial but is by no means universally accepted. I will argue that the future of macroeconomics is as a branch of applied general equilibrium theory.

Believe it or not, twenty-one years ago that was a controversial statement. I argued then that the problem with DSGE models is not the assumption that the economy is in equilibrium. The problem is the implication of some of these models that the equilibrium is optimal. Since then, I have consistently argued that the way forward is to reformulate Keynesian ideas with modern mathematics; that is what the DSGE agenda is all about.

Sunday, February 9, 2014

Several excellent posts on Keynesian economics and sticky wages and prices have appeared recently. David Glasner points out that

...the sticky-wages explanation for unemployment was exactly the “classical” explanation that Keynes was railing against in the General Theory.

and quoting David again

it’s really quite astonishing — and amusing — to observe that, in the current upside-down world of modern macroeconomics, what differentiates New Classical from New Keynesian macroeconomists is that macroeconomists of the New Classical variety, dismissing wage stickiness as non-existent or empirically unimportant, assume that cyclical fluctuations in employment result from high rates of intertemporal substitution by labor in response to fluctuations in labor productivity, while macroeconomists of the New Keynesian variety argue that it is nominal-wage stickiness that prevents the steep cuts in nominal wages required to maintain employment in the face of exogenous shocks in aggregate demand or supply.

Monday, February 3, 2014

Along with the rest of modern macroeconomics, the rational expectations (RE) assumption has gotten quite a bit of flak lately. I don’t think all of it is deserved. It is not the RE assumption itself that is at fault: it is the RE assumption in conjunction with the assumption of a unique equilibrium.

In standard dynamic stochastic general equilibrium (DSGE) models there is a single rational expectations equilibrium. In the models I work with there are many rational expectations equilibria. Not just one, two, or three, but an infinite-dimensional continuum of them. That is not a problem. It is an opportunity that I exploit to model the idea that beliefs matter. In my work, I close my models by adding an equation that I call a 'belief function'. The belief function is an effective way of operationalizing the Old Keynesian assumption of ‘animal spirits’. It is a forecasting rule that explains how people use current information to predict the future. That rule replaces the classical assumption that the quantity of labor demanded is always equal to the quantity of labor supplied.

You might think that adding a belief function to operationalize animal spirits allows me to dispense with the rational expectations assumption since the belief function could be arbitrary. Not so. Even though we do not live in a stationary environment, our beliefs should be consistent with the outcomes that we would observe in a stationary world. In such a world, beliefs should obey Abraham Lincoln’s dictum that “you can fool all of the people some of the time or some of the people all of the time but you can’t fool all of the people all of the time.” In my view, that is the rational expectations assumption.
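A toy example may make the consistency requirement concrete. Suppose the world is stationary and the fundamental follows a simple AR(1) process; a belief function is then a rule mapping current information into a forecast of next period's outcome. The specific process and rule below are my illustrative assumptions, not the model in my academic papers:

```python
import random

random.seed(0)

# A toy stationary world: the fundamental follows an AR(1),
# x_{t+1} = rho * x_t + eps_{t+1}, with eps ~ N(0, 1).
rho, T = 0.8, 20000
x = [0.0]
for _ in range(T):
    x.append(rho * x[-1] + random.gauss(0.0, 1.0))

# A belief function: a forecasting rule that maps current information
# into a prediction of next period's outcome.
def belief(x_t):
    return rho * x_t

errors = [x[t + 1] - belief(x[t]) for t in range(T)]

# Lincoln's dictum as a consistency check: in this stationary world the
# rule cannot be systematically fooled -- forecast errors average out.
mean_error = sum(errors) / T
assert abs(mean_error) < 0.05
```

An arbitrary belief function — say, one that always predicted a boom — would generate systematically biased forecast errors and fail this check; that is the sense in which the belief function is disciplined by rational expectations rather than a substitute for it.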

The recent drop in the stock market, if it persists, will present serious challenges for the Yellen Fed.

In a couple of recent academic papers, The Stock Market Crash of 2008 Caused the Great Recession: Theory and Evidence (here) and The Stock Market Crash Really Did Cause the Great Recession (here), I showed that changes in the value of the stock market cause changes in the unemployment rate three months later. Here is a link to a Freakonomics post that features my work.

I continue to receive requests for the data that I used in those studies. That data is available here. These are important empirical findings that establish a strong and stable relationship between changes in the value of the S&P and changes in the U.S. unemployment rate.
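For readers who want the flavour of the exercise before downloading the data, here is a sketch of the regression on synthetic data. The data-generating process and the coefficient of -0.1 below are invented for illustration; they are not estimates from my papers:

```python
import random

random.seed(1)

# Synthetic monthly data: ds = percentage change in the S&P,
# du = change in the unemployment rate.  Assumed "truth" for this
# illustration only: du_{t+3} = -0.1 * ds_t + noise.
n = 303
ds = [random.gauss(0.0, 3.0) for _ in range(n)]
du = [0.0] * n
for t in range(n - 3):
    du[t + 3] = -0.1 * ds[t] + random.gauss(0.0, 0.1)

# Pair each stock market change with the unemployment change three
# months later, then estimate the slope by ordinary least squares.
x, y = ds[: n - 3], du[3:]
xbar, ybar = sum(x) / len(x), sum(y) / len(y)
beta = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
        / sum((xi - xbar) ** 2 for xi in x))

# The regression recovers the (assumed) negative three-month-ahead effect.
assert -0.12 < beta < -0.08
```

With the real data the exercise is the same in spirit: lag the stock market series three months and ask whether it helps predict subsequent changes in unemployment.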

Sunday, January 26, 2014

Noah Smith refers to a vintage piece by Robert Barro that pours scorn on the New Keynesian agenda. I am grateful to Noah for drawing our attention to it. I find much to agree with in Barro’s critique of the New Keynesians, and those who would attack his position would be wise to heed the proverb: those who live in glass houses should not throw stones.

Monday, January 20, 2014

In a comment on my most recent blog post, Andy Harless "[wishes he] had a better intuition for what is going on in [my] model." I took a stab at responding to Andy in the comment section, but my response became so long that I turned it into a post. Here is my answer to Andy. You can find additional comments over at Economist's View where Mark Thoma was kind enough to post an excerpt.

Sunday, January 19, 2014

Bob Shiller wrote an interesting piece in today's NY Times on the irrationality of human action. Shiller argues that the economist's conception of human beings as rational is hard to square with the behavior of asset markets.

Although I agree with Shiller that human action is inadequately captured by the assumptions that most economists make about behavior, I am not convinced that we need to go much beyond the rationality assumption to understand what causes financial crises or why they are so devastatingly painful for large numbers of people. The assumption that agents maximize utility can get us a very, very long way.

Thursday, January 16, 2014

My colleague Harold Demsetz was honored this year, along with Stanley Fischer, Jerry Hausman and Paul Joskow, as a Distinguished Fellow of the American Economic Association. Congratulations to all! Here is what the AEA said about Harold.

Harold Demsetz

Harold Demsetz is one of the most creative and deep microeconomists of the 20th century. Several of his contributions anticipated subsequent research by years or even decades, and have offered unusually insightful analyses of fundamental problems of economic theory.

Demsetz’s most famous paper “Production, Information Costs, and Economic Organization” (with Armen Alchian, American Economic Review 1972) is one of the most cited papers in all of economics. It analyzes the fundamental question first raised by Coase, “What is a firm?” and tries to understand the difference between contracts occurring inside the firm (for example, with employees) and those occurring in the market (for example, with customers). Alchian and Demsetz argue that some contracts are efficiently brought inside the firm because doing so reduces the costs of monitoring of performance, especially when production occurs in teams. Alchian and Demsetz’s approach has been challenged by more recent developments, such as Grossman and Hart (1986), but remains a classic in the theory of the firm.

Tuesday, January 14, 2014

I was planning to take a break from blogging today but then I came across Chris House's homily to his students encouraging them not to read the General Theory; or, for that matter, anything else written in economics BME (before the Mankiw era). I simply cannot let that exhortation stand without adding a few words in defense of the history of thought and in support of Scott Sumner's take on Chris' post.

Monday, January 13, 2014

Start with a standard model with perfectly flexible prices and wages. Delete one equation, for example the labour market clearing condition. We are now one equation short of a solution, so we have multiple equilibria. Does that mean we are now free to add any additional equation we feel like? Mathematically, we can do that, of course. But one would like some sort of intuition for that extra equation. Why, for example, should it be an equation for stock prices? Why not a different equation for wages?

That's a great question. Until recently, new-Keynesian economists didn't bother to model unemployment. Instead, they followed the new-classical approach in which all that matters is labor hours spent in paid employment. More recently, a number of authors including Bob Hall, and Mark Gertler and Antonella Trigari have incorporated explicit models of search unemployment into otherwise standard macroeconomic DSGE models. That idea is not new; David Andolfatto and Monika Merz introduced search to RBC models in the 1990s. What is different about more recent work, building on Hall's 2005 paper, is the way the model is closed.
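The equation-counting logic behind Nick's question, and behind closing a model with a belief function, can be seen in a deliberately stripped-down linear system; the particular equations and numbers are invented for illustration:

```python
# Two remaining 'equilibrium conditions' in three unknowns (y, l, w)
# after one equation -- say, labour market clearing -- is deleted:
#   eq1:  y - 2*l = 0   (a production-function-like relation)
#   eq2:  l - w   = 0   (a labour-demand-like relation)
def is_equilibrium(y, l, w):
    return abs(y - 2 * l) < 1e-12 and abs(l - w) < 1e-12

# One equation short of a solution: a whole continuum of equilibria,
# indexed by a free parameter s.
for s in [0.0, 1.0, 3.5]:
    assert is_equilibrium(2 * s, s, s)

# Closing the model with a 'belief function': one more equation, here
# the assumption that agents coordinate on the belief y = 1, selects a
# single point from the continuum.
y_belief = 1.0
s = y_belief / 2
solution = (2 * s, s, s)   # (1.0, 0.5, 0.5)
assert is_equilibrium(*solution)
```

Mathematically any extra equation would close the system; the economics lies in which equation you pick, which is exactly the force of Nick's question.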

Saturday, January 11, 2014

There has been a lot written on the blogosphere in recent weeks about the microfoundations of macroeconomics. Tony Yates argues in favour of micro-founded structural models. Adam Posen is sceptical of micro-foundations, and Simon Wren-Lewis, Noah Smith and Nick Rowe call for a more eclectic approach. For those looking for a neat summary of these debates, Paul Krugman traces the history of macroeconomic ideas. Responding to a piece by Brad DeLong, he argues that there has been a recent resurgence of what he calls “neo-paleo-Keynesianism”. This is a very useful concept, and I have much in common with the ideas expressed in Paul's piece. This essay offers a novel definition of the term that Paul coined and an invitation to fellow academics to join me in pursuing an agenda based on this definition.