The Rise of the Renminbi as International Currency: Historical Precedents, by Jeff Frankel: All of a sudden, the renminbi is being touted as the next big international currency. Just in the last year or two, the Chinese currency has begun to internationalize along a number of dimensions. An RMB bond market has grown rapidly in Hong Kong, as has a market in RMB bank deposits. Some of China's international trade is now invoiced in the currency. Foreign central banks have been able to hold RMB since August 2010, with Malaysia going first.

Some are now claiming that the renminbi could overtake the dollar for the number one slot in the international currency rankings within a decade (especially Subramanian 2011a, p.19; 2011b). ...

The dollar is one of three national currencies to have attained international status during the 20th century. The other two were the yen and the mark, which became major international currencies after the breakup of the Bretton Woods system in 1971-73. (The euro, of course, did so after 1999.) In the early 1990s, both were spoken of as potential rivals of the dollar for the number one slot. It is easy to forget it now, because Japan's relative role has diminished since then and the mark has been superseded. ...

The current RMB phenomenon differs in an interesting way from the historical circumstances of the rise of the three earlier currencies. The Chinese government is actively promoting the international use of its currency. Neither Germany nor Japan, nor even the US, did that, at least not at first. In all three cases, export interests, who stood to lose competitiveness if international demand for the currency were to rise, were much stronger than the financial sector, which might have supported internationalization. One would expect the same fears of a stronger currency and its effects on manufacturing exports to dominate the calculations in China.

In the case of the mark and yen after 1973, internationalization came despite the reluctance of the German and Japanese governments. In the case of the United States after 1914, a tiny elite promoted internationalization of the dollar despite the indifference or hostility to such a project in the nation at large. These individuals, led by Benjamin Strong, the first president of the New York Fed, were the same ones who had conspired in 1910 to establish the Federal Reserve in the first place.

It is not yet clear that China's new enthusiasm for internationalizing its currency includes a willingness to end financial repression in the domestic financial system, remove cross-border capital controls, and allow the RMB to appreciate, thus helping to shift the economy away from its export-dependence. Perhaps a small elite will be able to accomplish these things, in the way that Strong did a century earlier. But so far the government is only promoting international use of the RMB offshore, walled off from the domestic financial system. That will not be enough to do it.

"We are focusing on major U.S. equities now, looking past the European stock markets because there's too much volatility there," Tim Hartzell, who oversees about $350 million as chief investment officer for Houston-based Sequent Asset Management, said in a telephone interview. "The Fed is still accommodative and we're entering into an election year, when politicians are usually pulling various levers to make the economy grow."

Put aside the issue of the Fed's level of accommodation for the moment. Instead, will politicians be in a position to stoke growth during the upcoming election year? As it stands, policy will turn increasingly contractionary in the months ahead. Moreover, conventional wisdom is that a weak economy favors Republicans, who can run on the "are you better off than four years ago?" platform. And looking at the latest news of declining median incomes in the post-recession period, combined with an economy that has dramatically underperformed relative to the Administration's expectations when the stimulus was proposed in 2009, that argument has some legs.

Sure, we can argue that Republican intransigence is the core policy problem. But at the same time, the Administration had no back-up plan for an L-shaped recovery, joined the fiscal austerity parade, and continued to place faith in reaching a "Grand Bargain" on the debt rather than focusing on the issue at hand - the unemployment crisis. When all is said and done, I suspect that from the view of the average voter, this Administration owns the economy lock, stock, and barrel.

Politically, it makes sense for the Republicans to thwart Democratic attempts to reduce unemployment, and instead keep the focus on the "failure" of such policies to date. And they would like the Federal Reserve kept in line as well. Stan Collender on the recent attempt by Republican leadership to prevent additional easing:

In other words, now that the GOP has made it all but impossible for fiscal policy to be used to improve the economy, they want to make sure that the only other tool the government has at its disposal -- monetary policy -- isn't used either.

Why take on the Fed? The Republicans have some direct control over fiscal policy because they can either refuse to consider a proposal in the House where they are in the majority or can filibuster legislation in the Senate where they are in the minority. Because the Fed is an independent agency, the GOP can only do what they did today in the letter by threatening to bring down the wrath of god if it dares take any action to get the economy moving.

Maybe we will see an extension of the payroll tax break and business-tax credits, but that only limits the contractionary turn in fiscal policy. Better than nothing, but far short of what is necessary.

So assume fiscal policy is locked up in Washington through 2012, and the Republicans win the White House. What will policy look like in 2013? I would like to hear the views of the gang at Capital Gains and Games. I see two possible outcomes. One is to embrace fiscal austerity with both arms. The other is to abandon fiscal austerity, as it was only a useful weapon to win back the White House. Instead, do exactly the opposite and embrace the mantra that "Reagan proved deficits don't matter." That seems like a strategy designed to win in 2016. But it pushes meaningful policy action out until 2013 rather than 2012 - bad news for the unemployed and financial markets.

David Romer's name has come up several times in recent discussions of the IS-LM and IS-MP models. This is how Romer's new edition of his graduate level macroeconomics book derives the IS-LM and IS-MP curves:

Assume that firms produce output using labor as the only input, i.e. Y = F(L), F' > 0, F'' ≤ 0, and that government purchases, international trade, and investment are left out of the model for convenience (so that Y = C + I + G + NX becomes Y = C).

Also assume that "There is a fixed number of infinitely lived households that obtain utility from consumption and from holding real money balances, and disutility from working. For simplicity, we ignore population growth and normalize the number of households to 1. The representative household's objective function is":

Σt β^t [U(Ct) + Γ(Mt/Pt) - V(Lt)], where 0 < β < 1 is the discount factor

There is diminishing marginal utility (or increasing marginal disutility), as usual. (Note that assuming money is in the utility function is a standard short-cut. See Walsh for a more extensive discussion of this.)

There are two assets in the model, money and bonds. Money pays no interest, while bonds pay an interest rate of it. Wealth evolves according to:

At+1 = Mt + (1 + it)(At + WtLt - PtCt - Mt)

where At is household wealth at the start of period t, WtLt is nominal labor income, PtCt is nominal consumption, and Mt is nominal money holdings. This equation says that wealth in period t+1 is equal to the amount of money held at the end of time t plus (1+it) times the bonds held from t to t+1 (the term in parentheses is bonds).

Households take the paths of P, W, and i as given, and they choose the paths of C and M to maximize the present discounted value of utility subject to the flow budget constraint and a no-Ponzi-game condition (for simplicity, the choice of L is set aside for the moment). Finally, the path of M is chosen by the monetary authority (later, when the MP curve is derived, this assumption will be changed).

With log utility over consumption (U(Ct) = ln Ct), the household's first order conditions imply the Euler equation Ct+1 = β(1 + rt)Ct, where rt is the real interest rate from t to t+1. We now, in essence, have the New Keynesian IS curve. To see this, take logs of both sides:

ln Ct = ln Ct+1 - ln(1 + rt) - ln β

And using the fact that Y = C, approximating ln(1 + r) as r (which holds fairly well when r is small), and dropping the constant for convenience gives:

ln Yt = ln Yt+1 - rt

This is the New Keynesian IS curve. It's just like the ordinary IS curve, except for the lnYt+1 term on the right-hand side (in models with stochastic shocks, this becomes EtlnYt+1, where EtlnYt+1 is the expected value of lnYt+1 given the information available at time t -- often the information set contains only lagged values of variables in the model).

Thus, the big difference between the old IS curve and the microfounded New Keynesian IS curve is the EtlnYt+1 term on the right-hand side. (It's relatively easy to amend the traditional model of the IS curve to incorporate the expectation term.)

It can also be shown (e.g. through a variations argument) that the first order condition for money holding is:

Γ'(Mt/Pt)/U'(Ct) = it/(1 + it)

This implies a money demand function of the form:

Mt/Pt = L(it, Yt)

Money demand is increasing in output and decreasing in the nominal interest rate. If this is set equal to (exogenous) money supply, then we have an LM curve. And if we graph the LM curve along with the New Keynesian IS curve, it looks just like the traditional formulation of the model (with the main difference being the expectation of future output term discussed above).
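
To make the money demand function concrete, here is a minimal Python sketch under a particular (hypothetical) functional form, log utility in both consumption and real balances, U(C) = ln C and Γ(m) = ln m; the first order condition then gives m = C(1+i)/i, which has exactly the properties stated above:

```python
# Money demand implied by the household's first order condition, under the
# (hypothetical) log-utility specification U(C) = ln C, Gamma(m) = ln m:
#   Gamma'(m)/U'(C) = i/(1+i)  =>  C/m = i/(1+i)  =>  m = C*(1+i)/i.

def money_demand(C, i):
    """Real balances M/P demanded at consumption C and nominal interest rate i."""
    return C * (1 + i) / i

# With Y = C in this model, demand rises with output and falls with i:
m_low_rate = money_demand(1.0, 0.05)    # higher than...
m_high_rate = money_demand(1.0, 0.10)   # ...this
```

Any functional forms with diminishing marginal utility deliver the same qualitative result; the log case just makes the algebra transparent.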

Finally, as Romer notes:

The ideas captured by the new Keynesian IS curve are appealing and useful... The LM curve, in contrast, is quite problematic in practical applications. One difficulty is that the model becomes much more complicated once we relax Section 6.1's assumption that prices are permanently fixed... A second difficulty is that modern central banks do not focus on the money supply.

The first problem is that the LM curve shifts when P changes, so if there is inflation it will be in constant motion, making it hard to use as an analytical tool. That can be overcome, but the second objection is harder to dismiss. It is, however, easy to address: simply assume that the central bank follows a rule for the interest rate such as:

rt = r(lnYt, πt), with r increasing in both arguments

If the central bank adjusts M to ensure this holds, then the money supply is essentially endogenous (and the interest rate is set by the rule). This is an upward sloping curve in r-lnY space, and it is called the MP curve (for monetary policy). It replaces the LM curve in the IS-LM diagram, giving us the IS-MP model.
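
As a numerical sketch (the parameter values are made up, not Romer's), substituting a hypothetical linear MP rule into the New Keynesian IS curve gives equilibrium output in closed form:

```python
# Solving the model for equilibrium (log) output, with made-up parameters.
#   IS curve:  y = y_next - r                 (New Keynesian IS, constant dropped)
#   MP curve:  r = phi_y * y + phi_pi * pi    (hypothetical linear policy rule)
# Substituting MP into IS gives y = (y_next - phi_pi * pi) / (1 + phi_y).

def equilibrium_output(y_next, pi, phi_y=0.5, phi_pi=1.5):
    """Equilibrium log output where the IS and MP curves cross."""
    return (y_next - phi_pi * pi) / (1 + phi_y)

y_low_inflation = equilibrium_output(y_next=1.0, pi=0.02)
y_high_inflation = equilibrium_output(y_next=1.0, pi=0.04)
# Higher inflation shifts the MP curve up, so equilibrium output is lower.
```

This is the comparative static the IS-MP diagram delivers graphically: a rise in inflation moves the policy rule up, the intersection slides down the IS curve, and output falls.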

However, it would still be possible to do the analysis with the IS-LM diagram: just put a horizontal line at the fixed interest rate and find the money supply that makes this an equilibrium. But as noted above, in the presence of inflation the LM curve shifts continuously, making the model hard to use. Thus, in the presence of inflation and an interest rate rule, the IS-MP formulation is much simpler. But for other questions, e.g. quantitative easing at the lower bound or pedagogically examining a money rule, the IS-LM model is often more intuitive.

But the main point is that if you start from (very simple) microfoundations, the resulting model looks a lot like the old IS-LM model. It still needs to be able to handle price changes, so it's necessary to add a model of supply to the model of demand provided by the IS-MP or the IS-LM diagrams, and the expectation term on the right-hand side of the IS curve is an important difference from the older modeling scheme, but the two models have a lot in common.

The economy was not in recession in the third quarter, which means the backward-looking data flow through this month will not be particularly dire.

Consistent with this prediction, the September employment report painted a picture of an economy still wading through knee-deep mud, but not in economic collapse. That said, prior to the report, Barry Ritholtz offered some wisdom regarding individual data points versus trends:

What does matter is the overall vector of a given economic sector. Vectors include the rate of acceleration or deceleration, persistency, direction etc. Think overall "trend" and changes thereto. For employment, this means: Are we seeing an increase in the factors that lead to hiring? What is the ratio between hires at big firms vs small firms? Are Wages increasing, staying flat, or decreasing; Temp workers getting hired, total hours worked etc. What are the likely data and modeling errors? Collectively, those factors all add up to an issue of the employment situation roughly improving, maintaining a stability, or getting worse.

Hence, each data point should be looked at in terms of whether it is continuing the overall trend, or suggesting a reversal in trend. Everything else is noise.

With trends in mind, the data did little to dispel my concern that private sector hiring rolled over earlier this year, especially when combined with last week's read on employment via the ISM nonmanufacturing report:

A string of stronger-than-projected statistics -- capped by the news on Oct. 7 of a 103,000 rise in payrolls last month -- has prompted economists at Goldman Sachs Group Inc. and Macroeconomic Advisers LLC to raise their forecasts for third-quarter growth to 2.5 percent from about 2 percent. That's nearly double the second quarter's 1.3 percent rate and would be the fastest growth in a year.

"The U.S. economy doesn't look like it's double-dipping at all," said Allen Sinai, president of Decision Economics Inc. in New York. "But it is a crummy recovery."

The article offers up the usual caution on Europe and increasingly tight fiscal policy when the New Year begins. But the bottom line is correct - on the basis of existing data, the recession call looks like a long-shot.

Getting to the recession call requires generally ignoring the incoming data on the real economy and instead focusing on financial markets. Then recognize that in recent experience, financial distress leads to broader economic distress. Moreover, at the moment, the slowdown in US economic growth coupled with the possibility of sovereign default in Europe are combining in such a way as to expose the inherent vulnerabilities in a still-under-capitalised global financial system. See Edward Harrison here.

And although there is optimism the European situation can be resolved in three weeks, European policymakers seem to be walking a very fine line: attempting to recapitalize the banking system without undermining sovereign debt ratings, while maintaining what effectively amounts to a pegged exchange rate system that is fundamentally inconsistent with the economic needs of more than one nation. In addition, there is an odd situation in which every nation needs to issue Euro-denominated debt, but no nation can actually print Euros as a backstop. It's as if each nation issues only foreign-denominated debt, with ultimately no lender of last resort on a national level. Of course, the European Central Bank could fill this role, but will it?

My experience is that when a financial landscape is as ugly as we see here, there is no rescue plan. Things tend to get much worse before they get better. That seems to be what financial markets are telling us.

With that cheery thought in mind, I offer another distressing correlation. While I generally find monetary aggregates to be difficult indicators in the best of times, this caught my attention:

Since the end of the 1990s, there has been a negative correlation between M2 growth and industrial production growth. It appears that financial market disruptions of the current magnitude are sufficient to drive substantial changes in spending. If this correlation continues to hold, then I need to rethink my belief that any recession in the near term will be relatively mild considering the lack of rebound from the last recession. Perhaps underneath today's seemingly comforting data something very ugly is brewing. Which means enjoy these big rallies on Wall Street while you can.

Let me talk a bit about Sims' contributions to economics, and if I have time I'll try to cover Sargent later.

Prior to Sims' work, in particular his paper "Macroeconomics and Reality," the state of the art in macroeconometrics was to use large-scale structural models. These models often involved scores or even hundreds of equations: essentially an S=D equation for every important market, identities to make sure things add up correctly, and so on. But in order to estimate the parameters of these models -- the structural parameters, as they are known -- you had to overcome the identification problem.

Without getting into the details, the identification problem essentially asks whether it is possible to estimate the structural parameters at all. The answer, in general, is no. For example, if every variable in the model appears in every equation, then it won't be possible to estimate the structural model. Let me give an example to illustrate. Suppose that X and Y are the endogenous variables, e.g. price and quantity for some market, and that the structural model is:

Yt = a0 + a1Xt + a2Yt-1 + a3Xt-1 + ut

Xt = b0 + b1Yt + b2Yt-1 + b3Xt-1 + vt

The a's and the b's are the parameters that economists are generally interested in, but in this form it is not possible to estimate them. There must be what are known as exclusion restrictions before estimation is possible. In this case, for example, identification can be achieved by making either a1 or b1 equal to zero (more on this below), i.e. excluding one of the variables from one of the equations. If there is a reason for this, then excluding the variable is okay, but a variable can't be left out simply to achieve identification -- there must be good reason for excluding Xt from the first equation, or Yt from the second (or both). Omitting a variable that ought to be in a model in order to satisfy the identification restrictions results in a misspecified model and biased estimates.
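
A small simulation illustrates the stakes (all parameter values below are illustrative, chosen only to keep the system stationary): if both a1 and b1 are nonzero and we nonetheless estimate the first equation by OLS, the estimate of a1 is biased, because Xt is determined jointly with Yt and is therefore correlated with ut:

```python
import numpy as np

# Simulate the two-equation structural model with a1 and b1 both nonzero,
# then estimate the first equation by OLS. Because Xt is determined jointly
# with Yt, it is correlated with ut, and the OLS estimate of a1 is biased.
rng = np.random.default_rng(0)
a0, a1, a2, a3 = 0.0, 0.5, 0.2, 0.1   # illustrative values
b0, b1, b2, b3 = 0.0, 0.8, 0.1, 0.2

T = 20_000
Y, X = np.zeros(T), np.zeros(T)
u, v = rng.normal(size=T), rng.normal(size=T)
M = np.array([[1.0, -a1],
              [-b1, 1.0]])            # contemporaneous structure
for t in range(1, T):
    rhs = np.array([a0 + a2 * Y[t-1] + a3 * X[t-1] + u[t],
                    b0 + b2 * Y[t-1] + b3 * X[t-1] + v[t]])
    Y[t], X[t] = np.linalg.solve(M, rhs)  # solve for (Yt, Xt) jointly

# OLS of Yt on a constant, Xt, Yt-1, Xt-1:
Z = np.column_stack([np.ones(T - 1), X[1:], Y[:-1], X[:-1]])
coef, *_ = np.linalg.lstsq(Z, Y[1:], rcond=None)
a1_hat = coef[1]   # noticeably above the true a1 = 0.5 (simultaneity bias)
```

This is the familiar simultaneity problem: without a valid exclusion restriction (or some other identifying assumption), no amount of data fixes the bias.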

In large models, these exclusions are numerous, and many researchers simply assumed whatever exclusion restrictions were needed to achieve identification, and then went on to estimate the model. In Macroeconomics and Reality, Sims pointed out the problem with this approach. The assumptions that researchers were imposing to achieve identification had no theoretical basis. They were ad hoc and difficult to defend (especially when expectations are in the model -- expectations tend to depend upon all the variables in a model making it difficult to exclude anything from an equation involving expectations).

What Sims suggested as an alternative was to drop structural modeling altogether, and to use generalized reduced forms as the basis for estimation. There would be no hope of recovering structural parameters in most cases, but there was still much that could be learned by using reduced forms instead of structural models.

For example, the reduced form for the model above is (you can find the reduced form by expressing the endogenous variables Xt and Yt in terms of exogenous and predetermined variables):

Yt = c0 + c1Yt-1 + c2Xt-1 + e1t, where e1t = (ut + a1vt)/(1 - a1b1)

Xt = d0 + d1Yt-1 + d2Xt-1 + e2t, where e2t = (vt + b1ut)/(1 - a1b1)

(the c's and d's are combinations of the structural a's and b's, e.g. c1 = (a2 + a1b2)/(1 - a1b1))

This is a VAR model. At first, Sims thought we could draw important conclusions from this model, e.g. suppose that X is money and Y is output. Then this model could tell us how a shock to money would change output over time (these are called impulse response functions -- you hit the system with a shock, and then use the estimated model to trace out the path of the endogenous variables over time). We could use this model to answer important questions such as whether money causes output (Sims' technique for testing causality was essentially the same as Granger causality, but Sims made an important contribution in extending the causality techniques to systems with three or more variables when he introduced impulse response functions and variance decompositions).
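
As a sketch of the mechanics (the coefficient matrix below is made up, and the shock is assumed already identified), an impulse response function just iterates the estimated dynamics forward from a one-time shock:

```python
import numpy as np

# Impulse responses from an (already identified, hypothetical) bivariate VAR(1):
#   z_t = A @ z_{t-1} + shock_t,  with z = (money, output).
# The response at horizon h to an impact vector e0 is A^h @ e0.
A = np.array([[0.5, 0.0],
              [0.3, 0.6]])   # made-up coefficients; eigenvalues 0.5 and 0.6, so stable

def impulse_response(A, e0, horizons=12):
    path = [np.asarray(e0, dtype=float)]
    for _ in range(horizons):
        path.append(A @ path[-1])
    return np.array(path)    # shape (horizons + 1, 2)

irf = impulse_response(A, e0=[1.0, 0.0])  # unit money shock at time 0
# irf[:, 1] traces output's response: it builds up and then dies out toward
# zero because the VAR is stable.
```

Variance decompositions come from the same objects: squaring and accumulating these responses apportions forecast-error variance across the shocks.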

But, as Cooley and LeRoy pointed out in an important paper, these models don't avoid structural assumptions after all, at least not if you want to say anything about how variables in the model respond to structural shocks. To see this, note first that the shock we are interested in is the shock to money, vt. Now look at the errors in the two reduced form equations. We can estimate each equation by OLS, and when we do the error terms will be estimates of a1vt + ut for the first equation and vt + b1ut for the second (each scaled by the common factor 1/(1 - a1b1)). Thus, we get estimates of linear combinations of the vt and ut shocks we are interested in, but we don't get the shocks in isolation like we need. And there's no way to isolate the shocks, i.e. to determine their individual values. That's a problem because we need to find the money shock alone if we want to estimate its effect on output.

How can we do this? One way is to make either a1 or b1 equal to zero. Let's set b1=0 because that's the easiest to discuss. In this case, when we estimate the second equation by OLS (the equation with the d parameters), the error will now be an estimate of vt, which is just what we need. However, notice that this is nothing more than an exclusion restriction -- by assuming that b1=0, we are excluding Yt from the second equation (see the structural model). Thus, we have come full circle.
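
A quick simulation (with illustrative parameter values) confirms the point: impose b1 = 0, estimate the X equation by OLS, and the residuals recover the structural shock vt almost exactly:

```python
import numpy as np

# With b1 = 0 imposed, the X equation is already in reduced form, so its OLS
# residuals estimate the structural money shock v_t directly.
rng = np.random.default_rng(1)
a0, a1, a2, a3 = 0.0, 0.5, 0.2, 0.1   # illustrative values
b0, b2, b3 = 0.0, 0.1, 0.2            # b1 = 0: no contemporaneous Yt in the X equation

T = 10_000
Y, X = np.zeros(T), np.zeros(T)
u, v = rng.normal(size=T), rng.normal(size=T)
for t in range(1, T):
    X[t] = b0 + b2 * Y[t-1] + b3 * X[t-1] + v[t]
    Y[t] = a0 + a1 * X[t] + a2 * Y[t-1] + a3 * X[t-1] + u[t]

# OLS of Xt on a constant and the lags; the residuals are the estimated shocks:
Z = np.column_stack([np.ones(T - 1), Y[:-1], X[:-1]])
coef, *_ = np.linalg.lstsq(Z, X[1:], rcond=None)
v_hat = X[1:] - Z @ coef
corr = np.corrcoef(v_hat, v[1:])[0, 1]   # essentially 1
```

Note that this works only because b1 = 0 was imposed; the whole question is whether that restriction can be justified rather than merely assumed.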

This is where Sims' structural VARs come into play. The reduced form above is known as a VAR model (in its estimable form, i.e. the reduced-form equations involving the c and d parameters). It turns out that we can often defend particular restrictions theoretically. For example, if money can only respond to output with a lag, perhaps due to information problems, then there is no reason to have the contemporaneous value of output on the right-hand side of the structural equation for money, i.e. this implies that b1=0.

Thus, while this still amounts to an exclusion restriction, the restriction is no longer ad hoc -- simply imposed as necessary to achieve identification as back in the old, large-scale structural model days -- it is grounded in theory. And the fact that we insist these restrictions be grounded in theory marks an important difference from the work that came before Sims.

And even better, this technique also allows the model to be identified without using exclusion restrictions at all. For example, if we think that some shocks have short-run but not long-run effects, e.g. that money can affect output in the short run, but only produces price effects in the long run -- a standard assumption in most macro models -- then the zero impact in the long run can be imposed as an identifying restriction. Exclusion restrictions won't be needed (these are the Blanchard-Quah and Shapiro-Watson techniques).
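
Here is a sketch of the mechanics of a long-run restriction, using a made-up bivariate VAR(1) with coefficient matrix A and reduced-form error covariance Sigma (this is the Blanchard-Quah logic in its simplest form, not a replication of their model):

```python
import numpy as np

# Long-run (Blanchard-Quah-style) identification for a bivariate VAR(1):
#   z_t = A @ z_{t-1} + e_t,  e_t = B0 @ eps_t,  Var(eps_t) = I,  Sigma = B0 @ B0'.
# Identifying restriction: the long-run impact matrix (I - A)^{-1} @ B0 is lower
# triangular, i.e. the second shock has no long-run effect on the first variable.
A = np.array([[0.4, 0.1],
              [0.2, 0.3]])        # made-up, stable VAR coefficients
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])    # made-up reduced-form error covariance

F = np.linalg.inv(np.eye(2) - A)  # long-run multiplier on the reduced-form errors
S = F @ Sigma @ F.T               # long-run covariance of the system
L = np.linalg.cholesky(S)         # lower-triangular factor
B0 = (np.eye(2) - A) @ L          # impact matrix implied by the restriction
long_run = F @ B0                 # equals L, so it is lower triangular by construction
```

The restriction delivers exactly enough equations to pin down B0 from Sigma, so no contemporaneous coefficient has to be zeroed out by assumption.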

This just scratches the surface of Sims' work -- I wish I had time to do more -- but *hopefully* this provides a window into one part of Sims' contributions.

I'm late getting to this, but congratulations to this year's recipients of the Nobel Prize in Economics, Chris Sims and Tom Sargent. Here are more details:

Empirical Macroeconomics: One of the main tasks for macroeconomists is to explain how macroeconomic aggregates -- such as GDP, investment, unemployment, and inflation -- behave over time. How are these variables affected by economic policy and by changes in the economic environment? A primary aspect in this analysis is the role of the central bank and its ability to influence the economy. How effective can monetary policy be in stabilizing unwanted fluctuations in macroeconomic aggregates? How effective has it been historically? Similar questions can be raised about fiscal policy. Thomas J. Sargent and Christopher A. Sims have developed empirical methods that can answer these kinds of questions. This year's prize recognizes these methods and their successful application to the interplay between monetary and fiscal policy and economic activity.

In any empirical economic analysis based on observational data, it is difficult to disentangle cause and effect. This becomes especially cumbersome in macroeconomic policy analysis due to an important stumbling block: the key role of expectations. Economic decision-makers form expectations about policy, thereby linking economic activity to future policy. Was an observed change in policy an independent event? Were the subsequent changes in economic activity a causal reaction to this policy change? Or did causality run in the opposite direction, such that expectations of changes in economic activity triggered the observed change in policy? Alternative interpretations of the interplay between expectations and economic activity might lead to very different policy conclusions. The methods developed by Sargent and Sims tackle these difficulties in different, and complementary, ways. They have become standard tools in the research community and are commonly used to inform policymaking. ...