Tuesday, October 28, 2014

Coincidences abound. Last night I gave a lecture to my Cost-Benefit Analysis class on uncertainty and precaution, and this morning I see a writeup of a new article by Nassim Nicholas Taleb and his high-profile colleagues on the application of precautionary theory to genetically modified organisms. One concern I had skimming through the article is that it seems to parallel Martin Weitzman’s Dismal Theorem, but he isn’t cited. I don’t know the literature well enough to say anything about priority in this area, and I’d be happy to hear from those who do.

Meanwhile, on with the show. I will leave out the diagrams because they take too long to produce.

1. A convenient property of the normal distribution. Consider a normal distribution—any normal distribution. What’s the probability you will be to the right of the mean? 50%. To the right of one standard deviation (σ) above the mean? About 1/6. To the right of two σ’s above the mean? About 2.5%. To the right of three σ’s above the mean? Less than 0.5%. This is simply the Empirical Rule. It tells us that the probability of an above-average outcome falls faster than the distance of that outcome from the mean increases. That continues to the very asymptotic end of the distribution’s tail. Of course, the same reasoning applies to the other side of the distribution, as outcomes become ever further below-average.
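A quick check of those tail areas, as a minimal sketch using only Python's standard library:

```python
from math import erfc, sqrt

def normal_tail(z):
    # P(Z > z) for a standard normal Z, via the complementary error function
    return 0.5 * erfc(z / sqrt(2))

for z in range(4):
    print(f"P(Z > {z} sigma) = {normal_tail(z):.4f}")
# prints 0.5000, 0.1587, 0.0228, 0.0013
```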

In an expected value calculation we add up the products of possible future outcomes with their respective probabilities. For two possible outcomes we have:

EV = V(1) p(1) + V(2) p(2)

where V(1) is the value of outcome 1, p(1) its probability and so on with outcome 2. In other words, EV is simply a weighted average of the two potential outcomes, with their probabilities providing the weights. As more possible outcomes are envisaged, the EV formula gets longer, encompassing more of these product terms.

The significance of the Empirical Rule for EV calculation is that the further from the mean a possible outcome is, the smaller its product term (value times probability) will be. Extreme values become irrelevant. Indeed, because the distribution is symmetrical, you would only need to know the median value, since in a normal distribution the median and the mean coincide. But even if you didn’t know the median going in, or if you have only an approximation to a smooth distribution because of too few observations on outcomes, if you know the underlying distribution is normal you can pretty much ignore extreme possibilities: their probabilities will be too small to cause concern.
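The two-outcome EV formula can be sketched in a few lines; the outcome values and probabilities below are invented purely for illustration:

```python
def expected_value(outcomes):
    # EV = sum of value x probability over all envisaged outcomes
    values, probs = zip(*outcomes)
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(v * p for v, p in zip(values, probs))

# Two-outcome case from the text: EV = V(1)p(1) + V(2)p(2)
print(expected_value([(100, 0.75), (-40, 0.25)]))  # 100*0.75 + (-40)*0.25 = 65.0
```

Adding more possible outcomes just lengthens the list; the weighted-average structure is unchanged.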

2. But lots of probability distributions aren’t normal. The normal distribution arises in one of the most important of all statistical questions, the relationship between a sample and the population from which it’s drawn. Sample averages converge quickly on a normal distribution; we just need to get enough observations. That’s why statistics classes tend to spend most of their time on normal or nearly-normal (binomial) distributions.

In nature, however, lots of things are distributed according to power laws. These distributions characteristically arise from growth processes, or at least from processes in which the size (or some other measure) of a thing in one period is a function of its size in a previous period, and much of what we see in the world is the result of such processes. In economics, income distribution follows a power law; so does the distribution of firms by level of employment. Power law distributions differ in two ways from normal ones: they are skewed, and they have a long, fat tail over which the distance from the mean increases faster than the probability declines. If you want to know the average income in Seattle, you don’t want to ignore a possible Bill Gates.
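The Bill Gates point can be made with a toy example (the income figures are made up): in a skewed distribution a single extreme observation barely moves the median but dominates the mean.

```python
from statistics import mean, median

# Hypothetical household incomes in thousands of dollars;
# the last entry stands in for a Bill Gates
incomes = [45, 52, 60, 38, 71, 48, 55, 64, 50, 3_000_000]

print(median(incomes))  # 53.5 -- barely affected by the outlier
print(mean(incomes))    # about 300,048 -- dominated by the single extreme value
```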

In many decision contexts, moreover, we don’t have enough observations to justify assuming the quantities we care about are normally distributed. Instead we have a t-distribution. The fewer observations we draw on, the longer and fatter the tails of t become. True, the t-distribution is symmetrical, but with sufficiently few observations (degrees of freedom) it shares with power law distributions the characteristic that extreme values can count more in an expected value calculation, not less as in a normal distribution.
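A rough Monte Carlo sketch of this, using only the standard library (the seed, sample size, and the choice of 3 degrees of freedom are mine): sampling Student's t as a normal over the square root of a scaled chi-square shows how much heavier its tail is than the normal's.

```python
import random
from math import erfc, sqrt

random.seed(42)  # fixed seed so the experiment is reproducible

def t_sample(df):
    # Student's t draw: standard normal divided by sqrt(chi-square / df)
    z = random.gauss(0, 1)
    chi2 = sum(random.gauss(0, 1) ** 2 for _ in range(df))
    return z / sqrt(chi2 / df)

n = 200_000
tail_t = sum(t_sample(3) > 3 for _ in range(n)) / n   # estimate of P(T > 3), 3 d.o.f.
tail_normal = 0.5 * erfc(3 / sqrt(2))                 # exact normal tail, about 0.00135

print(tail_t, tail_normal)  # the t(3) tail is roughly twenty times heavier
```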

3. Getting Dismal. The relationship between EV and extreme values depends on three things: whether the probability distribution is normal, if not how fat the tail is, and how long the tail is. Weitzman’s Dismal Theorem says that if the tail is fat enough that the product (value times probability) increases as values become more extreme, and if the tail goes on to infinity—there is no limit to how extreme an outcome may be—the extreme tail completely dominates more likely, closer-to-the-mean values in calculations of EV. The debate over this theorem has centered on whether the unboundedness of the extreme tail (for instance the potential cost of catastrophic climate change) is a reasonable assertion.
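Weitzman's result is a statement about limits, but the mechanism can be seen numerically. A minimal sketch with my own toy numbers, not Weitzman's model: take a Pareto tail with α = 1, where value grows exactly as fast as probability declines, and watch the truncated expected-value contribution fail to converge as the truncation point recedes.

```python
import math

# Pareto tail: survival P(X > x) = (x_m / x) ** alpha,
# density alpha * x_m**alpha * x**(-(alpha + 1)).
# With alpha <= 1 the mean diverges, so the tail's contribution to an EV
# calculation grows without bound as the truncation point L is pushed out.

def tail_ev(alpha, x_m, L, steps=100_000):
    # midpoint-rule integral of x * density(x) from x_m to L, taken in log space
    total = 0.0
    a, b = math.log(x_m), math.log(L)
    h = (b - a) / steps
    for i in range(steps):
        x = math.exp(a + (i + 0.5) * h)
        density = alpha * x_m ** alpha * x ** (-(alpha + 1))
        total += x * density * x * h  # the extra x is the Jacobian dx = x d(log x)
    return total

for L in (1e3, 1e6, 1e9):
    print(f"EV contribution up to L={L:.0e}: {tail_ev(1.0, 1.0, L):.2f}")
# grows like log(L): 6.91, 13.82, 20.72 -- no convergence
```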

4. Precaution, and precaution on precaution. This provides one interpretation of the precautionary principle. On this view, the principle applies under two conditions, a high level of uncertainty and the prospect of immense harm if the worst possibility transpires. High uncertainty means a fat tail; immense potential harm (for which irreversibility is usually a precondition) is about the length of the tail. Enough of both and your decision about risk should be dominated by the need to avoid extreme outcomes.

This view of precaution is consistent with cost-benefit analysis, but only under the condition that such an analysis is open to the possibility of non-normal probability distributions and fully takes account of extreme risks. That said, the precautionary framework described above still typically translates uncertainty into statistical risk, and by definition this step is arbitrary. For instance, we really don’t know what probability to attach to catastrophic releases of marine or terrestrial methane under various global temperature scenarios. Caution on precaution is advised.


My inbox had a long-winded story from Bloomberg BNA about some candid remarks from a tax attorney named Hal Hicks on how corporate inversions became such a hot topic:

Hicks epitomizes the world of high-level Washington lawyers who have played a behind-the-scenes role in helping these tax-driven address changes proliferate. Top federal tax officials, many of them career corporate lawyers, have sometimes closed tax breaks only after companies slipped through them. And former officials like Hicks use skills and contacts honed in office to help companies legally outmaneuver the government.

Until this year, when address-shifting by more than a dozen companies worth $100 billion caught policy makers’ attention and President Barack Obama clamped down again, inversion rules had for a decade attracted little notice outside the small community of international tax lawyers in Washington. At the Treasury Department and the Internal Revenue Service, officials—many on hiatus from private practice—crafted the rules in dialogue with top corporate law and accounting firms.

While some European nations have historically relied on career civil servants, the top ranks of the U.S. tax administration have swapped staff with industry for decades. It’s a low-cost way to provide government with the best legal talent, said Gregory Jenner, a former acting assistant Treasury secretary, who calls it an “incredibly beneficial tradition.” “Putting rookies into these jobs—they would be overwhelmed,” Jenner said. “It’s too high-level, too sophisticated, too complicated.”

The risk, critics say, is that some government lawyers may continue to sympathize with corporate interests, or be swayed by former colleagues.

I can see some conservatives reading this and saying that this is due to the overly complex nature of U.S. tax laws governing multinationals. I can see some liberals reading this and saying this is what we get when we let the representatives of multinationals write our tax laws. I would say – they are both right.

Monday, October 27, 2014

For some reason, my comments never show up on Simon Wren-Lewis’ blog, Mainly Macro. Maybe they were not meant to be. But today I will use this site as a soapbox to reply to his (and Nick Rowe’s) argument that public borrowing can impose a burden on future generations.

You can read the original, but the basic idea is that lending money is a form of deferred consumption that wends its way through time like a daisy chain. People live for two periods, with overlapping generations. They buy bonds during the first period and sell them during the second. Thus in each period the debt is neatly handed off to the following generation. But there is an end time, when public debt must be retired. At that point, instead of allowing the final generation, in the bloom of period 1, to purchase and thereby rollover the debt of their ancestors, the government taxes them to retire it. So behold, the borrowing of government from generation the first is a delayed charge against generation the last. And that is why paygo pension systems are an intergenerational crime.

The logic is impeccable, in the sense that if you accept the premises you must accept the conclusion. The question is whether the premises correspond in any meaningful way to the world we inhabit.

One obvious problem is the assertion of an end time. The Greatest Generation, as we know, ran up what was at that point the Greatest Debt; in fact, gross federal debt (including the portion held by the Fed) topped out in 1946 at just under 120% of GDP. Those living today are heirs to that borrowing “binge”. But we haven’t suffered for it, since (1) that specific chunk of debt has become much smaller in relation to our incomes today due to inflation and real growth, and (2) we continue to roll over principal and interest, since the end time is not nigh. As long as we don’t go crazy, and keep our current and future borrowing on a sustainable basis, the end time need never come. (And I’m abstracting, as Simon and others do, from the benefits financed by borrowing—like saving the world from Hitler or, more mundanely, all those nice CCC-built parks—that are also legacies for the future.)
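Point (1) can be illustrated with a toy calculation; the 6% nominal growth rate is an assumption chosen for illustration, not an estimate of postwar US growth:

```python
# A fixed nominal debt stock shrinks relative to a nominal GDP that grows
# with inflation plus real growth, even with no repayment at all.
debt_ratio_1946 = 1.20   # just under 120% of GDP at the 1946 peak
nominal_growth = 0.06    # assumed combined inflation + real growth

for year in (1956, 1966, 1976):
    ratio = debt_ratio_1946 / (1 + nominal_growth) ** (year - 1946)
    print(year, round(ratio, 3))
# the ratio melts away: about 0.67 by 1956, 0.374 by 1966, 0.209 by 1976
```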

The second problem is less recognized. The consumption-smoothing life plan at the heart of the standard OLG model simply does not reflect the facts. Here, for instance, is the 2011 average household net worth (not including home equity) by age of household head, as estimated by the Census Bureau:

Except for those over 75, older people have more net income-generating assets than younger people, and even the geeziest geezers hold more assets than those under 45. They die with their financial boots on, making the daisy chain of deferred consumption a false depiction.

The bottom line is that the generation is not a meaningful unit of accounting when it comes to the distributive effects of public deficits. How about shifting attention to the decision to sell bonds to the rich instead of taxing them?

The New York Times today has an informative article on BASF, the German chemical giant, centered on the effect that fracking in the US has on business decisions in Europe. To sum up, natural gas prices in the US have plummeted due to the widespread use of this dubious technology, which affects chemicals in two ways—lowering the cost of fossil fuel feedstocks and the energy needed to process them. BASF has responded, logically enough, by shifting new investment from Germany to cheaper energy locations, including the US.

But this has an effect on energy policies in Europe too. It shifts the political economic balance away from a decarbonizing energy transition (Energiewende), which raises costs there even as they are tumbling here. In other words, by pushing fracking and generally supporting (non-coal) fossil fuel production, the Obama administration is undercutting foreign efforts to respond to the climate crisis. In the global collective action game of planetary sustainability, the US is a defector.

We are unlikely to see a global agreement on reducing fossil fuel use in the next several years; what can be done to at least protect the space for effective action at the national level? At the top of the agenda should be a framework for carbon tariffs, border taxes that offset cost differences due to differences in carbon emission policies. This would involve at the least a legal framework; ideally it should also spell out tariff-setting formulas to reduce the scope for manipulating the system to serve other ends. If we can’t get everyone to cooperate on sensible action to forestall catastrophic climate change, at least we should try to limit the damage caused by defection.

I understand Governor Christie has relented somewhat on his policy of quarantining all passengers from West Africa, symptomatic or not, in tents. He is now allowing them, if they choose, to spend 21 days stuck in traffic at a bridge. What a mensch!

Sunday, October 26, 2014

The following excerpt is from Duncan Foley's outgoing Presidential Address to the Eastern Economics Association, "Dilemmas of Economic Growth," presented March 9, 2012 (Reprinted by permission from Macmillan Publishers Ltd: Eastern Economic Journal (2012) 38, 283–295 published by Palgrave Macmillan). The title is an allusion to Herman Daly's parody of Cobb-Douglas production function hyperbole "as implying that it is possible to bake a cake without eggs or flour as long as the cook whisks the empty bowl faster and faster."

CAKE WITHOUT FLOUR

Some growth economists might regard the considerations we have just reviewed as rather quaintly anachronistic in putting so much emphasis on the material nature of economic production. Well-established patterns of economic growth show that as incomes rise, the proportion of output as measured by such indexes as real GDP consisting of material goods steadily declines. The major sources of growth in incomes (and, given the way we measure GDP, in indexes of output) shift to the tertiary sector, particularly services. The chief input to services is human intelligence, and at least in some accounts, intelligence is an unlimited resource. So why couldn't real GDP, measured to include the use-value of services, continue to grow without limit?

There are some immediate problems with this conception. Strictly speaking the production of almost all services does require material and energy inputs, as the gigantic server farms required for information technology are a concrete reminder. Maintaining the human capital to provide a glittering array of intellectual services requires material and energy inputs, and these very likely increase as the quality of intellectual output rises.

But this vision of endless growth without material or energy inputs requires some re-examination of just what it is that we regard as output and try to measure in indexes like real GDP. Some rapidly growing service industries, such as finance, seem to be able to produce increasing measured output without much input increase, even of human employment, at all [Basu and Foley 2011; Foley 2011]. An examination of the issues raised by the growing significance of service industries, which have no measurable output, raises some deep questions about the conception of economics.

The paradigmatic economic interaction for economic theories rooted in the marginalist revolution, such as neoclassical economics and its various descendants, is a transaction in which one good moves from the possession of an agent who subjectively values it less to the possession of another agent who values it more, in exchange for another good (in many transactions money). As the familiar Edgeworth-Bowley box construction illustrates, this type of transaction puts both agents on a higher (or at least no lower) indifference curve, and thus achieves a Pareto-improvement in the allocation of existing resources. Many financial transactions are of this type, for example, initial public offerings to take companies public, real estate brokerage, insurance contracts, and other more exotic forms of financial arbitrage. It is important to remember, however, that the transfer of existing goods or assets in these transactions is not production. When financial intermediaries appropriate some part of the economic surpluses generated in these transactions as revenue, however, economic statisticians have felt compelled to regard the resulting incomes as part of national income, and to invent an imaginary product, financial services, to put on the product side of the accounts as a counterpart.

It is hard to imagine limits to the magnitude of subjective economic surpluses that could be realized through transactions of this type. If, for example, policies or the historical evolution of the division of labor increase economic insecurity by eroding the institutions of traditional societies, one can easily imagine an unlimited expansion of insurance transactions as a result. But from the point of view of classical political economy, it is the increase in material productivity of labor, not the increase in economic insecurity associated with the expansion of the division of labor, which is the source of improvements in economic welfare. This point of view is deeply embedded in the methods of national income accounting, for example, in the fundamental rule that transactions involving the transfer of existing assets do not constitute production of goods and services, no matter how much economic surplus they may represent.

The classical political economists and Marx addressed these issues through the concepts of "productive" and "unproductive" labor. In the version of this distinction Marx distilled from his critical review of Adam Smith, productive labor (whether it produces material goods or services, since providing haircuts is hard to distinguish from making hats) returns the costs of production with a profit, while the cost of unproductive labor is paid out of revenues without any recovery or return. This classical-Marxian line of thinking puts the origin of the incomes from the production of "services," such as finance, in a different perspective.

This perspective is perhaps most clearly articulated in Marx's analysis of wage labor and the origins of surplus value. Productive labor is responsible for the whole value added in production, but receives only a fraction of the value added in the form of the wage. The resulting surplus value constitutes a pool of potential revenue for which capitalist producers, landowners, intellectual property owners, financial firms controlling money capital, and the state compete. The implications of this analysis, which, unfortunately, is for the most part systematically excluded from the modern economics curriculum, are far-reaching. No particular capitalist firm, no matter how large in revenue and employment, can have much direct effect in increasing the pool of surplus value. Thus "money-making" in capitalist society is proximately based on taking surplus value away from others. In an economy where resources and intellectual property command enormous rents, there may be a vanishingly small connection between the revenue of any entity and its actual contribution to production of useful output.

Many people today are dazzled by the apparently magical ability of innovators to appropriate enormous revenues on the basis of ideas and their manipulation alone. This phenomenon has understandably spawned theories of a "new" economy, supposedly based on new principles of the creation of value. Classical-Marxist political economy, in contrast, locates incomes to innovation not in new principles of the creation of value, but in new (or newly important, since most of these "business models" have actually been around for a long time) modes of appropriation of surplus value. As Slavoj Žižek vividly points out, increasing returns in the appropriation of rents for intellectual property simultaneously obscure the origin of the resulting enormous incomes in the pool of surplus value appropriated from productive labor and mystify the factors behind the increasing inequality in the distribution of these revenues [Žižek 2012]. The origin of the rent of a particularly exploitable resource like a waterfall or a petroleum deposit is hard enough to understand, but at least the owner of a waterfall cannot allow an unlimited number of cotton mills to exploit the resulting usable energy. By contrast, the owner of the rights to distribute a piece of software that, due to network externalities, becomes a technical standard, can allow an effectively unlimited number of users to install the software and charge each of them a fee.

It would be, however, a peculiar political economy that convinced itself that the increasing returns in the rents to artificially created assets, such as systems software, were a remedy for thermodynamically imposed decreasing returns to resource use in material production.

Friday, October 24, 2014

Trying to reconcile cost shifting with the discounting of future climate change costs and benefits has taken me on some unexpected detours. I was initially thinking about bills of exchange and their role, in the early modern era, in concealing church-outlawed "usury" in the guise of a more palatable commercial transaction. Discounting was an arithmetical accounting exercise that arose out of the discounting of bills of exchange.

Both compound interest and discounting partake of the same exponential function -- from different ends of the calculation -- so it is easy (and misleading) to think of the discounting of a bill of exchange as a kind of loan. Discounting a bill of exchange is a sales transaction. The credit involved is commercial credit extended from a supplier to a purchaser. The bank then buys the bill of exchange from the supplier at a discount from its face value.

If one insists on seeing a loan from the banker in the transaction, it would only be an indirect loan to the purchaser of the goods, not to the supplier who sold the bill of exchange to the bank. But that loan would be secured by the goods that were the original object of the transaction that originated the bill of exchange... (Unless, that is, the bill of exchange was only speculative, a circumstance that Marx labeled a swindle.)
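A minimal numerical sketch of the sale-versus-loan point, with invented figures: the bank's purchase is priced by a simple discount from face value, yet the same arithmetic can always be re-read as an implied yield, which is why it is so easy (and misleading) to see a loan in it.

```python
# Hypothetical bill of exchange; all numbers are invented for illustration.
face = 100.0          # face value, due in 90 days
discount_rate = 0.06  # annualized bank discount rate
days = 90

# The bank buys the bill at a discount from face value (a sale, not a loan)...
price = face * (1 - discount_rate * days / 360)

# ...but the arithmetic can be read from the other end as an implied yield
# on the money the bank actually advances.
implied_yield = (face / price - 1) * 360 / days

print(round(price, 2), round(implied_yield, 4))  # 98.5 0.0609
```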

The important point is that bills of exchange originated in real transactions of goods, not in purely financial transactions. This has serious implications for the use of "discounting" in cost benefit analysis of public investments.

If the discount rate is meant as a metaphor it is a peculiarly bad one. The goods in question -- costs and benefits of climate change mitigation, for example -- have both negative and positive values but more importantly they have not been contracted for by the interested parties -- there is no "bill of exchange" to be discounted. Furthermore, the beneficiary of the discounted price is not society but the polluting firm who has shifted part of its costs to society and the environment. This perverse distribution of costs and benefits (and incentives) is concealed by the aggregate generality of the climate economy models that construe everything as one big happy economy.

Put it this way: discounting the future costs and benefits of greenhouse gas emissions provides a subsidy to the most prolific emitters of greenhouse gases that they can then reinvest at compound interest. This is hardly a matter of being "neutral" on questions of distribution. Nor is it a question of generational equity. This is simply taking the bankers' perspective on financial accumulation and proclaiming it "socially optimal."

Last night (Oct. 23) at 11:20 PM, CDT, prominent heterodox economist Fred Lee of the University of Missouri-Kansas City died of cancer. He had stopped teaching during the last spring semester and was honored at the 12th International Post Keynesian Conference held at UMKC a month ago. I do not know if he was a card-carrying member of the IWW, as was a friend of mine, Bill Grogan, who died over a month ago and about whom I blogged here then. But on more than one occasion, including at the conference at UMKC last month, I heard Fred called an "old Wobbly," and I never heard him dispute this description. For any who do not know, "Wobbly" has always been the nickname for a member of the Industrial Workers of the World (IWW), a pro-working-class anarcho-syndicalist group advocating one universal union.

Whatever one thinks of heterodox economics in general, or of the views of Fred Lee in particular, he should be respected as the person who, more than any other, was behind the founding of the International Conference of Associations for Pluralism in Economics (ICAPE) and the Heterodox Economics Newsletter. While many talked about the need for an organized group pushing heterodox economics in all its varieties, Fred did more than talk: he went and organized the group and its main communications outlet. He also regularly and strongly spoke in favor of heterodox economics, the unity of which he may have exaggerated. But his voice in advocating the superiority of heterodox economics over mainstream neoclassical economics was as strong as that of anybody I have known. I also note that he was the incoming President of the Association for Evolutionary Economics (AFEE), and they will now have to find a replacement. He had earlier stepped down from his positions with ICAPE and the Heterodox Economics Newsletter.

It was both sad and moving to see Fred at the PK conference last month in Kansas City. He was in a wheelchair with an oxygen tank, his rapidly declining health stunningly apparent. There were several sessions honoring his work, and at one of the major ones he spoke at the end. Although he was having trouble even breathing and could barely speak, he rose and made his comments, at the end becoming impassioned and speaking up forcefully to proclaim his most firmly held positions. He declared that his entire career had been devoted to battling for the downtrodden, poor, and suffering around the world, "against the 1%!" and I know that there was not a single person in that standing-room-only audience who doubted him. He openly wept after he finished with those stirring words, as those who were not already standing rose to applaud him with a standing ovation.

Fred's own research agenda focused on developing a heterodox microeconomics, one based on the idea of markets being dominated by oligopolistic firms with price-setting powers and more. In the Post Keynesian camp he drew heavily on the work of Alfred Eichner as well as Michal Kalecki, although he was also influenced by American Institutionalists such as Gardiner Means, hence his Presidency-Elect of the Old Institutionalist AFEE. He wrote on many other topics as well, and in more recent years on the broader issue of the meaning and application of heterodox economics and how to develop a coherent alternative heterodox economics. But his most famous work was and will probably remain his heterodox, arguably Post Keynesian, approach to microeconomics.

At this point I must note that while we were always friends, and I knew Fred for a long time, we had some fairly strong differences of opinion in recent years. A decade ago I, with David Colander and Ric Holt, wrote a book and an article, followed up by another book and some other articles, the first book being _The Changing Face of Economics: Conversations with Cutting Edge Economists_ (2004, University of Michigan Press) and the first article being "The Changing Face of Mainstream Economics" (Review of Political Economy, 2004). We argued that "mainstream" is a sociological category, those running the show in the profession (top departments, journals, etc.), while "orthodox" is an intellectual category, the hardline version of which is widely called "neoclassical economics." We argued that "heterodox" was both: not running things and also intellectually anti-orthodox. This opened the door for a category of "non-orthodox mainstream economists," with people like George Akerlof being possible examples. Several heterodox economists disagreed with this argument and viewed us as weakening the criticism of "the orthodox mainstream" with this sort of divisionist argument. Quite a few of them expressed their disagreements in print; there was, in fact, an entire book dedicated essentially to reading us the riot act as a bunch of namby-pamby wafflers or worse. Fiercest of all in this crusade, both verbally and in print, was good old Fred Lee, who saw us as undercutting, undermining, and demoralizing the movement for a unified and strong heterodox economics battling that "orthodox mainstream."

I note that at the meeting in Kansas City I stood up to speak about this and to praise what I considered to be the strong and principled position held by Fred, despite our disagreements. I also spoke to him privately afterwards, and we parted on friendly terms. However, I note that he laid out in his public remarks a distinction between a "heretic" and a "blasphemer," both of these terms positives for him. A heretic is someone who questions orthodox doctrine, but still at some level believes it, while a blasphemer is someone who utterly and totally rejects it. He told me in our final private conversation that he viewed me as being a mere heretic, while he was a true blasphemer.

RIP, Fred.

Barkley Rosser

Addendum: The book criticizing Colander, Holt, and me is "In Defense of Post-Keynesian and Heterodox Economics: Responses to Their Critics," ed. by Fred Lee and Marc Lavoie, 2012, Routledge. In effect the bottom line may boil down to our saying that the heterodox can be the source of cutting edge ideas that the mainstream sometimes adopts, such as behavioral economics, whereas they say that any idea that can be accepted by the mainstream is simply being coopted, and that the heterodox must overthrow the mainstream orthodoxy root and branch. This may be what separates "heresy" from "blasphemy."

Further Addendum: I have been informed by email from Steve Ziliak, a former colleague of Fred's from when he was at Roosevelt University in Chicago, that like my late friend Bill Grogan, Fred was a card-carrying member of the IWW from 1985, and that indeed he became the Chair of the General Executive Council, with the IWW's national HQ in Chicago. As a result, he ended up becoming the recipient and owner of the ashes of Joe Hill, which had apparently gone on some long odyssey. Given that Joe Hill was an honest-to-gosh Wobbly, maybe the most famous of them all aside from Big Bill Haywood, the IWW ended up getting at least some of his ashes, and it was Fred who was their overseer, at least for some time.
A link from Steve Ziliak shows Fred signing for Joe Hill's ashes on 11/18/1988: http://reuther.wayne.edu/node/12333

The monks ascending the steps on the outside of the wall are growing the GDP, while the monks descending the steps on the inside are abating carbon dioxide emissions. Climate change mitigated -- emissions decoupling accomplished!

"Business profit," Schumpeter tells us, "is a prerequisite to the payment of interest on productive loans... The entrepreneur is the typical interest payer." There are three cost-reduction strategies that firms may pursue to maximize profits. The most opportunistic is cost-shifting, in which some third party, society or the environment, gets stuck with the cost rather than the firm. The cost doesn't go away; it just becomes external to the accounting entity's balance sheet and thus is an "externality." Greenhouse gas emissions are such an externality. They are a cost-shifting success for the profit-maximizing firm.

Carbon trading schemes and Pigouvian taxes are supposed to "internalize" those externalities so that the users of fossil fuels, for example, are made to pay the full cost -- or at least a larger proportion of the cost -- of their production processes or consumption preferences. Assessments of the costs and benefits of such policies typically discount the present value of future costs and benefits. The appropriate discount rate, it is often argued, should reflect market interest rates or else it may result in spending that is less efficient than would occur through the market. William Nordhaus in A Question of Balance:

The choice of an appropriate discount rate is particularly important for climate-change policies because most of the impacts are far in the future. The approach in the DICE model is to use the estimated market return on capital as the discount rate. The estimated discount rate in the model averages 4 percent per year over the next century. This means that $1,000 worth of climate damages in a century is valued at $20 today. Although $20 may seem like a very small amount, it reflects the observation that capital is productive [S'man: no, it reflects the assumption that capital is "productive"]. Put differently, the discount rate is high to reflect the fact that investments in reducing future climate damages to corn and trees should compete with investments in better seeds, improved equipment, and other high-yield investments. With a higher discount rate, future damages look smaller, and we do less emissions reduction today; with a lower discount rate, future damages look larger, and we do more emissions reduction today.
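Nordhaus's $20 figure is just constant exponential discounting. A minimal sketch in Python (the 4 percent rate, the $1,000 of damages, and the century horizon are all taken from the quote above):

```python
# Present value of a future cost under constant exponential discounting:
# PV = FV / (1 + r)**t
def present_value(future_value, rate, years):
    """Discount a future amount back to today at a constant annual rate."""
    return future_value / (1 + rate) ** years

# Nordhaus's example: $1,000 of climate damages a century out, at 4% per year.
pv = present_value(1000, 0.04, 100)
print(round(pv, 2))  # roughly 19.8, i.e. about $20
```

Everything in the dispute over climate cost-benefit analysis hangs on that innocuous-looking `rate` parameter.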

Update: But... if profitability is a function of cost shifting, the market interest rate a function of profit, the discount rate a function of the market interest rate, and cost/benefit optimization of GHG abatement a function of the discount rate, doesn't said optimization embed a circular reference? No, this is both too simple and too forgiving an interpretation of the relationship between discounting and cost shifting. More on this soon...

Nordhaus, again:

In thinking of long-run discounting, it is always useful to remember that the funds used to purchase Manhattan Island for $24 in 1626, when invested at a 4 percent real interest rate, would bring you the entire immense value of land in Manhattan today.

Professor Nordhaus here simply updates and tones down the hallucinations of Dr. Richard Price, who exclaimed in 1774:

One penny, put out at our Saviour's birth to 5 per cent compound interest, would, before this time, have increased to a greater sum, than would be contained in a hundred and fifty millions of earths, all solid gold.
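Price's arithmetic is easy to reproduce. A quick sketch, treating the penny as one unit and taking the horizon to be 1,774 years (an assumption, reading "before this time" as the year of his writing):

```python
# Richard Price's 1774 example: one penny compounded at 5% per year
# since "our Saviour's birth" -- the 1,774-year horizon is an assumption.
principal = 1            # one penny, in pennies
factor = 1.05 ** 1774    # growth factor after 1,774 years
print(f"{principal * factor:.3e} pennies")  # on the order of 10**37
```

The number is astronomically large, which is Price's point and also the problem: nothing in the real economy compounds like that.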

As Marx began the chapter in Capital in which he cited Price's dazzled fancy:

The relations of capital assume their most externalised and most fetish-like form in interest-bearing capital. We have here M — M', money creating more money, self-expanding value, without the process that effectuates these two extremes.

In his discussion of discounting, Nordhaus doesn't distinguish between compound interest and the process that brings about the apparent productivity of capital that he extols. What makes this lack of distinction particularly telling is that he is supposedly discussing solutions to a problem that results from the very process that makes capital productive of profits sufficient to sustain interest payments on money capital. It is as if the greenhouse gases are unrelated to the industrial processes that emit them.

Compound interest does not emit greenhouse gases. What people do to make the profits to pay the compound interest does. Money capital does not compound itself. The discount rate is no more independent of the cost-shifting that engenders it than it is of the greenhouse gas emissions whose costs are being shifted. D.I.C.E. thrown will never annul chance.

Wednesday, October 22, 2014

Mark Thoma and his readers have pulled together a nice collection of writing on a concept known as “helicopter money”. To be honest, as I read all the links I decided to fire off my own comment, which needs a little refining. My opening line is simple:

Helicopter money means using fiscal stimulus combined with easy money to overcome an awful shortage of aggregate demand, bearing in mind the following well-established ideas.

PLOG is Paul Krugman's term for a prolonged large output gap, which has been our situation since 2008. This period has also been described as a liquidity trap, where fiscal stimulus is clearly needed because traditional monetary policy has done all it can and we are still in a PLOG. This naturally leads to my first well-established proposition:
We should be using fiscal policy that maximizes the bang for the buck.
Which leads me to the rest of my rant:

(1) Transfer payments to the poor do so by giving income to the people most likely to spend it;
(2) Payments to the rich or tax cuts for the rich have no bang but a lot of bucks (Barro-Ricardian equivalence);
(3) We could also do this with public infrastructure investments;
(4) The Republican dorks running Congress are trying to cut (1) and (3) while emphasizing more of (2); which is why
(5) We need to take fiscal policy out of the hands of these Republican dorks who run Congress.

Monday, October 20, 2014

There is a fascinating piece by Gretchen Morgenson in today’s New York Times about the large investments public pensions have made in private equity funds. The focus is on the secrecy of these deals, but the question also comes up as to whether these investments are proper given the fiduciary role that pension fund managers are supposed to play.

One thought that occurs to me is this: pension funds by their nature should position themselves toward relatively lower-risk portfolios overall. Yet pension funds pay a management fee to private equity firms, and roughly 20% of any investment profits go to private equity as well. In exchange for these costs, pension investors receive rights to the residual returns, which may be positive or, as in the case that leads the article, negative. Present and future pensioners are paying for the opportunity to play a lottery.

It should really be the other way around. General partners like private equity funds should pay pension funds an initial percent on investment for access to capital along with returns up to some specified level. The private equity folks, being more risk-loving (in theory) would then grab what’s left. In this way the risk would be allocated according to levels of fiduciary responsibility. Why should wealthy speculators load the risk onto working class retirees?
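The asymmetry can be made concrete with a stylized fee structure. A sketch assuming conventional "2 and 20" terms, which are illustrative assumptions rather than figures drawn from the article:

```python
# Stylized "2 and 20" split between a pension fund (LP) and a private
# equity firm (GP). The 2% fee and 20% carry are conventional figures,
# assumed here for illustration -- not taken from the article.
def pension_net_return(gross_return, fee=0.02, carry=0.20):
    """Pension's net return: the fee is paid regardless of performance,
    and the GP takes a share of any positive profit (the carry)."""
    profit = gross_return - fee
    if profit > 0:
        profit *= (1 - carry)  # GP clips the upside...
    return profit              # ...while the LP absorbs the full downside

print(round(pension_net_return(0.15), 4))   # good year: about 0.104
print(round(pension_net_return(-0.10), 4))  # bad year: about -0.12
```

The pension's payoff is concave: it gives up part of every gain but eats every loss in full, plus the fee, which is the opposite of what a fiduciary low-risk mandate would suggest.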

Sunday, October 19, 2014

The truth about usury lies somewhere beyond St. Ambrose's condemnation and Jeremy Bentham's cavalier apologetics. In a very brief but valuable essay, Francis Bacon counselled,

It is good to set before us the incommodities and commodities of usury, that the good may be either weighed out or culled out; and warily to provide, that while we make forth to that which is better, we meet not with that which is worse.

Strictly speaking, compound interest is usury. Discounting is compound interest, ergo discounting is usury. Bentham, who upheld usury in a series of letters addressed to Adam Smith, also was a pioneering proponent of cost-benefit analysis for public investments. Considering that usury has both incommodities and commodities, a proper cost-benefit analysis would need to evaluate the costs as well as the benefits that arise from the discounting of future value.

The typical way of handling traditional objections to usury is to cite scripture and the interpretations of it offered by religious authorities. This was the method followed by Benjamin Nelson in The Idea of Usury, whose analysis was taken up by Lewis Hyde in The Gift and by David Graeber in Debt: the first 5000 years. But the biblical injunctions are laconic and subsequent interpretations may partake more of rationalization than impetus. Bentham was right when he observed,

It is one thing, to find reasons why it is fit a law should have been made: it is another to find the reasons why it was made: in other words, it is one thing to justify a law: it is another thing to account for its existence.

Bentham's defence of usury, though, was as verbose and meandering as the infamous passage from Deuteronomy about brethren and strangers was terse. His account of the grounds for the prejudice against usury was frivolous and dismissive. "To trace an error to its fountain head," Bentham cited Lord Coke, "is to refute it." What Bentham meant by "trace" was "assert." According to him, the prohibition of usury was motivated by the perverse asceticism of early Christians, foolish abstractions of Aristotle and ill-tempered envy toward the wealthy by the profligate debtors.

More concisely and substantively, Francis Bacon presented, in one paragraph, a catalogue of seven disadvantages arising from usury. A second paragraph elaborated on three advantages. Bacon's fourth criticism of usury is of particular interest:

…it bringeth the treasure of a realm or state into a few hands; for the usurer being at certainties, and others at uncertainties, at the end of the game most of the money will be in the box; and ever a state flourisheth when wealth is more equally spread…

In favour of usury, Bacon's second point is his most compelling:

…were it not for this easy borrowing upon interest, men's necessities would draw upon them a most sudden undoing, in that they would be forced to sell their means (be it lands or goods), far under foot; and so, whereas usury doth but gnaw upon them, bad markets would swallow them quite up.

In modern parlance, Bacon's most compelling arguments, both for and against usury, refer to what Marshall called the "external economies" -- or positive and negative externalities -- of the loan transactions. For better or worse then, compound interest is a vehicle for the shifting of costs and benefits. It is well to remember, in this connection, Joan Martinez-Alier's observation that "one can see externalities not as market failures but as cost-shifting successes."

One doesn't need to assume that cost shifting is necessarily a bad thing. Insurance, including social insurance, is a form of cost shifting. But when the project being evaluated in a cost-benefit analysis has the overt purpose of internalizing the cost of externalities -- such as in the analysis of abatement of greenhouse gas -- it is disingenuous to overlook the role of compound interest in enabling the social cost shifting in the first place and of perpetuating it over the period being analyzed. In other words, part of the value allegedly being "added" by capital in the analysis is not in fact being produced but is merely being appropriated by capital through social cost shifting.

Saturday, October 18, 2014

David Roberts, bless ‘im, has another fine post in which he sums up a pair of recent journal articles that cast doubt on estimates of the cost of stabilizing greenhouse gas concentrations. The two main points he emphasizes are both quite sensible. First, long-term economic prognostication is a fool’s errand. He highlights a telling quote from one study by Rosen and Guenther:

[G]iven all the uncertainties and variability in the economic results of the IAMs [integrated assessment models] … the claimed high degree of accuracy in GDP loss projections is highly implausible. After all, economists cannot usually forecast the GDP of a single country for one year into the future with such a high accuracy, never mind for the entire world for 50 years, or more.

Precisely. Or as Keynes put it,

The sense in which I am using the term [uncertainty] is that in which the prospect of a European war is uncertain, or the price of copper and the rate of interest twenty years hence, or the obsolescence of a new invention, or the position of private wealth owners in the social system in 1970. About these matters there is no scientific basis on which to form any calculable probability whatever. (The General Theory, 1937)

The second point is that economies are complex interdependent systems whose interconnections can’t possibly be modeled by analysts who know only the world as it is now, not the world as it will become. One disturbing factor, of course, will be climate change itself, which will likely have deep, and mostly impossible to foresee, effects on many aspects of the economy. Similarly, different technological and institutional configurations of the future economy of the planet can’t be captured by models that consider them separately, or only in light of their market connections.

Now I’d like to add two further observations to the mix. The first is that the long run economic costs of climate change mitigation, the ones that will show up over the course of 50 or 100 years, are really irrelevant. The case for taking action doesn’t depend on them, and future people will have to figure out how to cope with them when the time comes. It’s the short term costs, the ones that will make themselves known during the first years of serious policy implementation, that matter. They matter for policy, because if we can anticipate them we can take actions to minimize their impact. Crucially, they matter for political economy, since the opposition to action on climate change is ultimately about short run costs: who bears them and how big they are expected to be.

The second observation is that the biggest nonlinearity is hiding right under our nose: the potential for writing off a portion of the capital stock. All existing models assume that capital goods are employed until they fully depreciate, with reduced productivity of the stock related smoothly to more rapid depreciation: if a change in relative prices means a unit of capital is a bit less productive, its lifespan will be a bit shorter.

This assumption rules out a fundamental nonlinearity: each unit of capital has a tipping point, a critical balance of revenues and operating costs that separates utilizing it from abandoning it. Consider a simple example: an airplane. A large passenger airplane is a significant piece of capital investment. Its profitability depends on the cost of providing air travel and the willingness of travelers to pay for it. If the cost of fossil fuel rises due to a carbon tax or cap, airline companies have to raise prices and cope with the resulting loss of demand. This can mean somewhat fewer flights and more empty seats. But there is a level of price increases at which the plane is simply taken out of service: it’s no longer profitable to operate it. Indeed, an entire company may liquidate, going from a substantial capitalization to scrap. There is almost certainly some fuel price that triggers this discontinuity, although we don’t necessarily know what it is in advance. The same point likely holds for many investments in transportation, shipping, real estate and fuel-intensive manufacturing.
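The discontinuity is easy to sketch. A toy model in which the plane flies at full service while revenue covers operating cost and is withdrawn the moment it doesn't (all numbers invented for illustration):

```python
# A stylized shutdown threshold for a single capital good (all numbers
# invented): the plane flies while ticket revenue covers operating cost;
# past a critical fuel price it is withdrawn entirely -- output drops
# discontinuously rather than shrinking smoothly.
def plane_output(fuel_price, revenue=100.0, non_fuel_cost=60.0, fuel_burn=2.0):
    """Return service supplied: full while profitable, zero after."""
    operating_cost = non_fuel_cost + fuel_burn * fuel_price
    return 1.0 if revenue >= operating_cost else 0.0

# Output is flat, then falls off a cliff at fuel_price = 20:
for p in [10, 15, 19, 20, 21, 25]:
    print(p, plane_output(p))
```

A smooth-depreciation model would show output gliding down as the fuel price rises; here it is constant right up to the threshold and then zero, which is the nonlinearity the existing models rule out.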

If this view is correct, economists should put research into the short run effects of fossil fuel prices on the capital stock into high gear. The cumulative effect of such writeoffs will be macroeconomic disruption, which we can offset through policy if we can see it coming. Above all, identifying the investments most at risk from climate policy will tell us more about the political barriers we face than a thousand surveys about public attitudes toward science.

I have nothing to add to this important piece about sexual harassment and the dependence of restaurant servers on tips. Read it yourself. The practice of holding servers hostage to the emotional fluxes and fantasies of customers is barbaric. Visitors from Europe, at least the ones I know, are appalled. Surely one of the reasons for working for a living is to not have to depend on alms.

Now for the obvious question – can Brad DeLong explain why he wastes his precious time with these two? OK – we have this nonsense to deal with:

Even if what the Fed is doing is not inflationary, the arbitrary fashion in which our central bank responds to markets betrays a lack of concern about inflation. And that behavior by monetary authorities is enough to make markets expect inflation in future.

These two sentences convinced me more than ever that no one should ever take Amity Shlaes seriously, but while we are at it, here is what I wrote over at Mark Thoma’s blog:

I'm not sure why anyone wastes time with her or this Cliff Asness person. Neither know anything about economics. My proof? The utter stupidity of their writing as noted in this quote. Hey Amity - we are far below full employment. With nominal interest rates at rock bottom and with the fiscal austerity that your idiot Republican masters have imposed on us - what is left? Oh yea - higher expected inflation might lower real interest rates in spite of the zero interest rate bound. And you think this is a bad thing? Stupid!

Friday, October 17, 2014

Nothing fundamental has changed in the Eurozone. The region is sputtering, alternating between sluggish growth and outright recession. Inflation is below target and trending ever downward. Imbalances between surplus and deficit countries remain unsustainable. European banks are still stumbling along with unknown equity buffers, and it falls on the fiscally strapped governments of the periphery to backstop their own institutions. Austerity fails to deliver on debt reduction, since moribund economies can’t generate enough tax revenue, and the denominator in the debt/GDP ratio refuses to grow. So it has been, and so it is.

The only barrier standing between the current mess and a return to the sovereign debt crises of a couple of years ago is Draghi’s pledge to do whatever it takes to keep the bond vigilantes at bay. What has always been unknown is the extent to which this promise (Outright Monetary Transactions) is a false front. If there were new runs on the weakest sovereigns, would the ECB go on a buying binge to keep prices up? Germany has been explicit in its opposition to OMT, which it regards as illegal and, in its version of macroeconomic moralism, sinful. Still, financial markets were reluctant to take on the ECB for fear that Draghi would do what he said, and that bets against the sovereigns would not pay.

Nevertheless, the longer the bleeding continues in the EZ, the more likely it is that OMT will be tested. Greece, with SYRIZA spooking the moneyed class, is already seeing a runup in its interest rates, and contagion is not out of the question. The problem of the hour, however, is that the credibility of OMT hinges on Germany backing down in the confrontation over austerity.

Here’s why. Throughout the zone, governments are challenging the 3% cap on fiscal deficits. In the face of a potentially devastating triple dip recession, they have no choice. Moreover, the steady rise in the popularity of Euroskeptic parties aligns the politics with the economics. Bond issues will expand, and as they do, markets will wonder whether the rising debt still has Draghi’s backing, particularly since it will be in explicit defiance of Germany’s demands—and Germany, in theory, has the power to prevent Draghi from carrying out OMT.

Put this way, it all seems rather obvious: the cost of permitting a renewed run on the debt of weaker sovereigns is so great that surely Germany would have to back down, implicitly if not overtly. So one would think, and I hope this happens. But sometimes political commitments can take on a life of their own. Germany has clearly drawn a line in the sand, and the domestic credibility of Merkel—and her coalition partner, the SPD, as well—would collapse if she were seen to do an about-face. Up to this point, domestic German politics has entirely dominated external relations in German policy-making.

Given enough time and further economic deterioration in Germany itself (which will persuade business interests to demand a change in direction), I expect Germany to accede. The problem is one of timing. Here is the scary scenario:

1. Talks between Germany and the expansionistas break down. France and its Mediterranean allies begin expanding their fiscal deficits, while Germany publicly rebukes them and indicates that it will not permit the ECB to support “irresponsible” deficits with bond purchases. There is a temporary surge in support for Merkel within Germany, as she shows herself to be principled and tough.

2. Investors, sensing new weakness on the part of the ECB, start shorting sovereign debt, first in Greece, then perhaps again in Spain and, crucially, Italy.

3. The moment of truth for OMT, delayed for two years, now arrives. Either Draghi follows through or he doesn’t. To back up his pledge he needs Germany to go along. But Merkel has drawn a line in the sand, one which has overwhelming popularity at home, and if she allows Draghi to go forward she faces a political catastrophe. Moreover, neither the CDU nor the SPD wants to be the party that breaks ranks and allows the other to play the role of the steadfast defender of economic virtue. Germany says “no” and......

You can take it from there.

If you think such a disaster ought to be kept at as great a distance as possible, what you should hope for is that Germany backs down now, before a crisis materializes, and that serious attention is given to the underlying structural and institutional factors that make Eurozone finances so precarious in the first place.

Wednesday, October 15, 2014

Utility is a hypothetical measure of well-being used by economists (and others) to construct models of individual choice. It is whatever motivates people to make the choices they make. It cannot be seen or measured directly, only inferred from those choices under the assumption that there is a single “something” behind them.

In recent years the concept of utility, along with the claim that individuals act to maximize it, has come under attack. Other aspects of well-being, like self-reported happiness or satisfaction, certain types of brain activity, and indicators of physical and emotional health are directly measurable, and it’s been found that people often, and systematically, make choices that fail to optimize these substantive benefits. In fact, there has been a lively and complex debate, kicked off by the Easterlin Paradox, over whether and under what conditions increases in real income correspond to increases in directly measurable well-being.

So today I notice a new post on Vox by Glaeser, Gottlieb and Ziv that defends utility against the claims of self-reported life satisfaction. The big name here, for those who don’t know, is Ed Glaeser, perhaps the most prominent urban economist working today. They say, we’ve found new evidence that people’s choices don’t maximize their happiness: they could move to a different location and become happier but they don’t. Hence there’s a conflict between utility, the invisible whatever that causes people to choose what they choose, and measurable happiness. And this shows that policies geared toward increasing happiness are misguided, because utility is what should be maximized.

Tell me if I’m missing something here, but what I see is this: (1) We have a theory that people’s choices maximize something called utility. (2) But we have evidence that measurable well-being is not maximized by these choices. (3) Therefore we conclude that measurable well-being is a bad proxy for “true” well-being.

Of course, any single measurable dimension of well-being is likely to be incomplete. We really do need, as Stiglitz et al. said, a dashboard of indicators. But surely the shortcomings of any one measure can only be assessed against other measures. And my having chosen A over B is not in itself a substantive measure of how well off I am basking in my subsequent A-ness. The question, after all, is whether the economic choices people make maximize their well-being. To test this we go out and gather various independent measures of well-being. When we find out they diverge in significant ways from revealed preferences, it is weird to use this as a demonstration that the evidence can’t really show what it seems to show, since our hypothesis about utility maximization has to be right. And for “weird” you can also substitute “ideological”.

Tuesday, October 14, 2014

Cameron ridicules France's 'nonsense' 35 hour working week, Daily Mail: "Mr Cameron launched his attack on the French employment model while responding to questions from pensioners and older workers at Age UK's London head office.":

The idea - economists would call it the lump of labour fallacy - the idea that there is just a fixed number of jobs and all you have got to do is try and divide them up between young people, old people, males, females - I think it's nonsense.

...

Very dangerous to ever point a finger at another European country but I sometimes think the French, with their obsession with the 35-hour working week, they are falling into the danger of a lump of labour fallacy, where ‘if only everyone just worked 35 hours there would be more work to go round.’

Abstract: The lump-of-labor fallacy has been called one of the “best known fallacies in economics.” It is widely cited in disparagement of policies for reducing the standard hours of work, yet the authenticity of the fallacy claim is questionable, and explanations of it are inconsistent and contradictory. This article discusses recent occurrences of the fallacy claim and investigates anomalies in the claim and its history. S.J. Chapman's coherent and formerly highly regarded theory of the hours of labor is reviewed, and it is shown how that theory could lend credence to the job-creating potentiality of shorter working time policies. It concludes that substituting a dubious fallacy claim for an authentic economic theory may have obstructed fruitful dialogue about working time and the appropriate policies for regulating it.

That’s it. How, you might wonder, can such a simple statement of obvious fact undermine the tenets of modern society?

According to Paul Krugman, though, "there’s a lot of room to reduce emissions without killing economic growth. If you think you've found a deep argument showing that this isn't possible, all you've done is get confused by your own word games."

O.K., let's play some serious "word games," then, and try not to get confused.

Actually, these are word games about pictures. One of them is Wittgenstein's discussion of the duck-rabbit picture. The other is Keynes's discussion of the newspaper beauty contest in which contestants are asked to guess which pictures the most contestants think are prettiest.

But let's start with another quote from Paul Krugman, "So what I end up with is basically Martin Weitzman’s argument: it’s the nonnegligible probability of utter disaster..." What the probability of utter disaster does in Weitzman's argument is render the standard cost-benefit analysis, based on a market-based discount rate, inoperative. What's a discount rate? It's an interest rate, or more specifically, according to Investopedia,

The discount rate also refers to the interest rate used in discounted cash flow analysis to determine the present value of future cash flows. The discount rate in discounted cash flow analysis takes into account not just the time value of money, but also the risk or uncertainty of future cash flows; the greater the uncertainty of future cash flows, the higher the discount rate.
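Mechanically, that definition amounts to the following. A minimal sketch with invented cash flows and rates, where the risk adjustment simply enters as a higher discount rate:

```python
# Discounted cash flow: present value of a stream of future cash flows.
# Per the definition above, greater uncertainty means a higher rate.
# All cash flows and rates here are invented for illustration.
def npv(cash_flows, rate):
    """PV of cash_flows[t] received t+1 years from now."""
    return sum(cf / (1 + rate) ** (t + 1) for t, cf in enumerate(cash_flows))

flows = [100, 100, 100]
print(round(npv(flows, 0.03), 2))  # low-risk stream: worth more today
print(round(npv(flows, 0.08), 2))  # same stream, higher rate, lower PV
```

The same three payments are "worth" less today merely because a bigger number was typed into `rate`, which is exactly why the choice of discount rate does so much silent work in cost-benefit analysis.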

So a discount rate is an interest rate. What is an interest rate? In the retrospective, "From Usury to Interest," Joseph Persky explained,

Our modern word 'interest' derives from the Medieval Latin interesse. The Oxford English Dictionary explains that interesse originally meant a penalty for the default on or late payment of an otherwise legitimate, nonusurious loan. As more sophisticated commercial and financial practices spread through Europe, fictitious late payments became an accepted if disingenuous way of circumventing usury laws. Over time, 'interest' became the generic term for all legitimate and accepted payments on loans.

A discount rate is an interest rate is a (formerly) usurious charge on a loan. Now, if I were to say next that "economic growth is another aspect of compound interest" or that "usury is what propels growth and what makes it imperative" an economist would insist that I am some kind of a crank. I won't say that.

I won't say it because what we're dealing with here are not economic growth and interest rates but accounts of economic growth and interest. These accounts are like pictures and here is where Wittgenstein can be of help. Paul Krugman, and EconoSpeak's own Peter Dorman, are fond of reminding us that critics of growth "mistakenly" identify GDP with "stuff." They point out that it is value, not stuff, that gets added up in the national income accounts. This is a bit like saying the duck-rabbit picture is a picture of a rabbit, not of a duck.

GDP certainly is not stuff. It is an account of something. And it is most definitely an account of something that many -- possibly most -- people perceive as stuff. The question then arises whether the "value" that economists attribute to GDP would continue to carry the same weight if the people who formerly perceived GDP as an account of stuff stopped having that perception. This is another way to pose the question, "what is liquidity?"

What is liquidity? Clearly Keynes thought that liquidity-preference resulted from uncertainty and that changes in liquidity-preference are implicated in slumps to the extent that the rate of interest required to induce people not to hoard liquid assets exceeds the expected rate of return on productive investments. In other words, while GDP is indeed not stuff, whether it is perceived to be an account of stuff may well have a bearing on private investment decisions.

Furthermore, whether an individual investor does or does not believe that GDP is an account of stuff doesn't matter as much as what that investor believes is the average perception of investors. Keynes illustrated this condition with his beauty contest story.

In conclusion, Weitzman presented a compelling case for the inappropriateness of using market interest rates as a discount rate for cost-benefit analysis of policies for abatement of GHG emissions. There are no grounds for assuming that capital markets would react benignly to such policies, however prudent and appropriate they may be.

Is there "a lot of room to reduce emissions without killing economic growth," as Paul Krugman asserts, or "do we dare to question economic growth," as Warwick Smith wants to know?

Saturday, October 11, 2014

Sorry about the header, but there’s been renewed interest in the pioneering work of Stafford Beer in Allende’s Chile, first as a result of Eden Medina’s Cybernetic Revolutionaries (2011) and now the article by Evgeny Morozov in the current New Yorker. Beer was a management specialist who applied cybernetic principles to business organization. He was brought to Chile to design a cybernetic planning system for the entire economy, Cybersyn, which died stillborn when Allende was overthrown in a coup in 1973. Undoubtedly, this is one of the great might-have-beens in the twentieth century: what could Beer have built if he had been given enough time and resources?

I have more than a passing interest in this topic. I regularly teach a course called Alternatives to Capitalism, and I used the Medina book in the most recent iteration. This fall I am working on a paper (as a co-author) that brings together Beer’s “viable system model” of the firm with economic analysis, with a focus on the determinants of worker autonomy. I’ve been imbibing Beer regularly for some time.

There is a lot to say about the new round of Beer enthusiasm. I don’t want to get into a pissing contest (where are these puns coming from?), but I believe Morozov is quite wrong to identify Beer with the Soviet cyberneticians depicted, for example, in Francis Spufford’s marvelous Red Plenty. The Soviet reformers wanted to use computers to calculate efficient planning prices; there were no prices in Beer’s model. There was no bottom-up reversal of authority in the Soviet vision, not even in theory, whereas that was intended to be the distinctive aspect of Cybersyn, the feature that would make it “really” socialist. While computers played a role in both approaches, it’s a mistake to put too much weight on them, big and heavy as they were back then. Beer, after all, had made his reputation with minimal knowledge or use of computers; his cybernetics was organizational and conceptual.

But I don’t think that the actual potential of Cybersyn matched Beer’s vision for it, and its shortcomings even during the limited period in which it was in partial operation bear this out. Beer’s critics were right, in fact: this really was a project whose result could only be to intensify centralized control over decisions at lower levels—the computer as Big Brother. Production systems were taken as given at the enterprise level, and the only questions were those asked in operations research: how much should we dial up this process or dial down that one? People were simply instruments in this framework; they had no ability to change the questions that were being asked.

From an economic point of view, I’m afraid Beer did not rise to the Hayek challenge. Cybersyn processed information about the throughput of materials and products far more efficiently than the Hayek of 1937 could have imagined, but it left unexamined the problem of how information is ultimately generated. An internet of things can tell you what materials are going where, but it can’t identify promising innovations in production systems or tell you which innovations should be replicated and which discarded. Worse, it has no way to assess the quality of what’s being produced, since it is primarily consumers who need to be able to decide this. Hayek is surely right that what we would now call parallel processing is needed to implement trial-and-error methods in real time, and there need to be incentives for improved production methods and higher quality. Hayek, non-Walrasian that he was, would probably say, and I would agree with him, that Beer’s model works well within firms but not between them, since coordination is not the primary problem that economies, as opposed to firms, need to solve. (The deep problem is: coordinate to do what, and in what way?)

I’m compressing a much more detailed argument and should probably stop here. None of this, incidentally, has to do with the paper I’m writing, since that one is about the theory of the firm. I should also add that Beer’s ideas are valuable and can be incorporated into a better model of economic planning, just not the way he went about it. The guy was brilliant but he didn’t know much economics.

UPDATE: Here are two more thoughts about Cybersyn.

A. For Beeristas, it should be disturbing that his Chilean model lacked a System II, in this case meaning there was no provision for horizontal communication between firms. All information flowed up and down, passing through the center. In fact, it was all System III—with no apparent Systems IV or V. System III is the element of command-based hierarchy.

B. Mechanical application of the viable systems model to whole economies is a dubious enterprise. The clearest evidence for this is the large role that markets play at present. Markets do not exemplify any of Beer’s systems beyond System I (direct activity of the units); they operate on a different basis. This doesn’t mean that markets are perfect or that planning is impossible, only that before you start postulating how economies need to be organized you ought to take a close look at how markets do this. Specifically, as I tried to explain above, markets accomplish several functions that are necessary to a modern economy but are not addressed by Cybersyn. Does this imply a division of labor? What division?

To put it in Beerian terms, Cybersyn is not an economic brain. What it approximates is the autonomic nervous system, in the sense that János Kornai and Béla Martos described it in Autonomous Control of the Economic System. It’s fine for a paramecium but rather limited for a human.

Poor Fred Hiatt. For years, the editorial page editor of the Washington Post has made his named appearances on that page (he bloviates the main ed lead daily, anonymously) only to call for cutting Social Security, and occasionally Medicare as well. This has been his schtick for many years. Now it is over, but he fails to recognize it.

OK, for some time I have been ridiculing him over this obsession of his, which he has imposed on many other regular writers on WaPo's ed page, including R.J. Samuelson, Ruth Marcus, and more recently, Catherine Rampell. I almost wrote on this when he went nuts over it on Monday, but Dean Baker whonked on him pretty solidly immediately, pointing out how stupid and ridiculous he looked: today's US debt/GDP ratio is 74%, with near-zero interest rates, and the CBO says it will be 78% ten years from now, a figure Hiatt hysterically declared to be "dangerous." The 104% forecast for 2039 he declared to be "unsustainable," which Dean correctly pointed out was totally ridiculous. So, I did not post anything.

Needless to say, the ridicule has mounted, some of it more general, some of it more specific. So, Paul Krugman has pointed out the problem of "secret deficit lovers," people who have made a living whining about deficit dangers, but who are unhappy now that the latest reports say the deficit is going down, because their longstanding calls to cut benefits for old people are not likely to be taken seriously in the near future. PK named no names, but Fred Hiatt is near the top of the list, if not absolutely at the top. More personally, John Podesta, whom he cited in his Monday WaPo piece, perhaps the single most stupid and embarrassing column he has ever written, has dumped all over him on Twitter, with an accompanying column in yesterday's WaPo, as linked by Mark Thoma.

So, let me add my two bits, which none of the above have yet said. First of all, it is amazing that when confronted with good news from the CBO that medical care costs are falling, leading to declining future deficit projections, Hiatt does not applaud. Indeed, nowhere in his column does he even note that this is a change in the projections. He notes the new data without acknowledging how it undercuts the hysteria of his past columns. Instead he keeps whining: in the past Obama appointed the Bowles-Simpson commission, which called for cuts in senior entitlements along with tax increases that GOP members of the commission would not accept (see Paul Ryan), a standoff Hiatt has for years treated as irresolvably unbridgeable. (He briefly noted that tax increases were one way out of senior entitlement problems, but did not remotely recommend them, despite longstanding polls showing support for exactly that solution should any such problem seriously arise in the future.) He simply cannot bring himself to admit that the problem he has been carrying on about so hysterically for so many years is not what he claimed it was. He, and many of his close pals, have simply been wrong wrong wrong.

So, here we have poor Hiatt, resolutely ignoring good news. Not a whisper in his column that the CBO is now forecasting continuing deficit reductions in the near future (while most of the US public still mistakenly believes deficits are rising, with no help from WaPo in informing them otherwise). The CBO carefully does not project further reductions in medical care costs going forward, and Hiatt does not even remotely raise the possibility, much less the idea that the way to avoid a debt/GDP ratio over 100% a quarter century from now might be to continue Obama's effort to bring US medical care costs down to OECD levels. Dean Baker has long pointed out that if our medical care costs were at OECD levels, we would not have this long-term deficit problem at all. And there are many obvious ways to move in that direction, from reducing the power of pharma patents to loosening immigration rules for physicians, among others.

So, I feel sorry for Fred. Beating up on seniors who have paid their taxes for what they are getting has been the one and only topic that has inspired him to write columns under his own name for many years. The new projections of lower deficits, good news to most of us, simply do not register with him. Actually, they probably do. But Krugman is right. As much as anybody, he is the longstanding VSP in DC who has been whining for years about cutting Social Security and Medicare, and the excuse for that argument has simply disappeared, but he and his pals are not willing to face the new facts.

Friday, October 10, 2014

Crossroads Arabia reports on an article in al-Monitor by Saudi commentator Bader al-Rashed. He is upset that Daesh (aka ISIS/ISIL/IS) is apparently distributing, in the territory it controls, books by Muhammad ibn Abd al-Wahhab, founder of the Wahhabist movement that has been the ruling ideology of the Saudi royal family since the 1740s. This would suggest that Daesh is indeed strongly Wahhabist in its fundamental orientation.

Al-Rashed in turn argues that no, they are not. Daesh are really Kharijites, a Muslim group from the early days of Islam that was neither Sunni nor Shi'i, was strict in its views, and was based mostly in what is now southern Iraq. They were famous for their intense takfirism, the practice of excommunicating people they viewed as not being proper Muslims. That would indeed seem to be something Daesh likes to do. This is tied to the notion of apostasy, which is outlawed in 21 Muslim countries and punishable by death. I note that some interpreters of the Qur'an read the relevant passages as allowing amputation or expulsion as alternatives, and certainly the last of these would be far more humane.

There are no self-declared Kharijites anywhere in the world now, and Daesh does not identify itself as such. The closest group, although more moderate than the old Kharijites, would be the Ibadis, descended from a moderate offshoot of the Kharijites. They are dominant in Oman today and are neither Sunni nor Shi'i; Oman actually seems more moderate than most nations ruled by either.

In any case, it must be recognized that Daesh draws strongly on the fundamental theology of the Saudis. The Saudis must oppose them because Daesh's declaration of a caliphate implies that it should rule Mecca, Medina, and al-Quds (Jerusalem). The Saudi king's proudest title is Custodian of the Two Holy Mosques, in Mecca and Medina, with a successful Hajj just completed. The Saudis do not claim to be caliphs, but they have no wish to give up their rule of those cities, or that title.

As Harvard’s Martin Weitzman has argued in several influential papers, if there is a significant chance of utter catastrophe, that chance — rather than what is most likely to happen — should dominate cost-benefit calculations. And utter catastrophe does look like a realistic possibility, even if it is not the most likely outcome.

Weitzman argues — and I agree — that this risk of catastrophe, rather than the details of cost-benefit calculations, makes the most powerful case for strong climate policy. Current projections of global warming in the absence of action are just too close to the kinds of numbers associated with doomsday scenarios. It would be irresponsible — it’s tempting to say criminally irresponsible — not to step back from what could all too easily turn out to be the edge of a cliff.

...

So what I end up with is basically Martin Weitzman’s argument: it’s the nonnegligible probability of utter disaster that should dominate our policy analysis. And that argues for aggressive moves to curb emissions, soon.

So far, so good. But Krugman's conclusion produced this pretzel of cognitive dissonance:

...there has to be a real chance that political support for action on climate change will revive.

If it does, the economic analysis will be ready [no, it isn't]. We know how to limit greenhouse-gas emissions [no, we don't]. We have a good sense of the costs [nope] — and they’re manageable [how could we know?]. All we need now is the political will.

Krugman apparently assumed that the cost estimates developed, for example, in Nordhaus's "dynamic integrated climate-economy" (DICE) analyses are independent of when the greenhouse gas abatement actions are taken. But the rationale for delaying abatement is that the discount rate assumed in the model makes it cheaper to wait to do the abatement. Weitzman's critique doesn't present cost estimates. Contra Krugman, there is not even a consensus about what needs to be done or how to do it, let alone "a good sense of the costs" of doing... it? Whatever "it" is.

The bottom line (literally) is that a key consideration of the structure and assumptions of the conventional models was facilitating economic growth and relying on that economic growth to finance the costs of abatement. The DICE were loaded for growth! To put it somewhat crudely, delaying abatement was supposed to make a large part of the cost "pay for itself" through the dividends earned on the money saved by not doing it now. You cannot have your cake and eat it too. Nor can you finance a current expenditure from revenues you would have earned if you hadn't made the expenditure.
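The discounting mechanism is easy to see with a toy calculation. This is an illustrative sketch with made-up numbers, not output from DICE or any actual integrated assessment model:

```python
# Toy illustration (not the actual DICE model): with a positive discount
# rate, the present value of an abatement cost paid later is smaller,
# which is the mechanism behind "delay is cheaper" results.
# All numbers below are invented for illustration.

def present_value(cost, rate, years):
    """Discount a cost paid `years` from now back to today."""
    return cost / (1 + rate) ** years

abatement_cost = 1000.0   # hypothetical cost, arbitrary units
rate = 0.05               # hypothetical 5% discount rate

pv_now = present_value(abatement_cost, rate, 0)
pv_in_30 = present_value(abatement_cost, rate, 30)

print(f"pay now:         {pv_now:.0f}")
print(f"pay in 30 years: {pv_in_30:.0f}")   # under a quarter of the cost
```

The present value of the delayed expenditure is smaller only because the model assumes the money "saved" earns returns in the meantime; that is exactly the cake the Sandwichman says you cannot both have and eat.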

The excerpts and abstracts below are not from people Krugman would ridicule as "degrowthers." There seems to be a dawning awareness that the assumptions of the conventional integrated assessment models need to be, at the very least, radically revised, which is essentially the point those silly anti-capitalist degrowthers on the left (going back to that silly anti-capitalist leftist Nicholas Georgescu-Roegen) have been making all along.

"Climate Change Policy: What Do the Models Tell Us?" Robert S. Pindyck

Very little. A plethora of integrated assessment models (IAMs) have been constructed and used to estimate the social cost of carbon (SCC) and evaluate alternative abatement policies. These models have crucial flaws that make them close to useless as tools for policy analysis: certain inputs (e.g., the discount rate) are arbitrary, but have huge effects on the SCC estimates the models produce; the models’ descriptions of the impact of climate change are completely ad hoc, with no theoretical or empirical foundation; and the models can tell us nothing about the most important driver of the SCC, the possibility of a catastrophic climate outcome. IAM-based analyses of climate policy create a perception of knowledge and precision, but that perception is illusory and misleading.

Economic models help illustrate the links between the climate and the economy, and they are an important component of the multidisciplinary analysis that is needed to address climate change. However, there are major problems with the estimates of potential damages in the IAMs... First, damage functions and estimates appear to have little connection to the empirical findings from econometric studies of sectoral impacts, particularly on agriculture, as we discuss later. More generally, economy-wide damage functions are simply not known, especially at the global level. Thus, as Pindyck (2013b) argues, there is little empirical, or for that matter theoretical, foundation for the specification of functional forms and parameters in the models. This suggests that their quantitative results and policy prescriptions are somewhat arbitrary.

We agree with Stern (2013) that there are gross underestimations of damages in economic impact models and IAMs, and we discuss some additional issues that are not adequately addressed in the models including the importance of nonlinearities, environmental impacts, extreme events, and capital losses.

'To slow or not to slow' (Nordhaus, 1991) was the first economic appraisal of greenhouse gas emissions abatement and founded a large literature on a topic of great, worldwide importance. In this paper we offer our assessment of the original article and trace its legacy, in particular Nordhaus' later series of 'DICE' models. From this work many have drawn the conclusion that an efficient global emissions abatement policy comprises modest and modestly increasing controls. On the contrary, we use DICE itself to provide an initial illustration that, if the analysis is extended to take more strongly into account three essential elements of the climate problem -- the endogeneity of growth, the convexity of damages, and climate risk -- optimal policy comprises strong controls. To focus on these features and facilitate comparison with Nordhaus' work, all of the analysis is conducted with a high pure-time discount rate, notwithstanding its problematic ethical foundations. [We have argued elsewhere that careful scrutiny of the ethical issues around pure-time discounting points to lower values than are commonly assumed (usually with little serious discussion).]

Once the social cost of carbon is high enough to justify maximum feasible abatement in cost-benefit terms, then cost-benefit analysis becomes functionally equivalent to a precautionary approach to carbon emissions. All that remains for economic analysis of climate policy is to determine the cost-minimizing strategy for eliminating emissions as quickly as possible. This occurs because the marginal damages from emissions have become so large; the uncertainties explored in our analysis, regarding damages and climate sensitivity, imply that the marginal damage curve could turn nearly vertical at some point, representing a catastrophic or discontinuous change.

The factors driving this result are uncertainties, not known facts. We cannot know in advance how large climate damages, or climate sensitivity, will turn out to be. The argument is analogous to the case for buying insurance: it is the prudent choice, not because we are sure that catastrophe will occur, but because we cannot be sufficiently sure that it will not occur. By the time we know what climate sensitivity and high-temperature damages turn out to be, it will be much too late to do anything about it. The analysis here demonstrates that plausible values for key uncertainties imply catastrophically large values of the social cost of carbon.
…
Our results offer a new way to make sense of the puzzling finding by Martin Weitzman: his “dismal theorem” establishes that under certain assumptions, the marginal benefit of emission reduction could literally be infinite (Weitzman 2009). The social cost of carbon, which measures the marginal benefit of emission reduction, is not an observable price in any actual market. Rather, it is a shadow price, deduced from an analysis of climate dynamics and economic impacts. Its only meaning is as a guide to welfare calculations; we can obtain a more accurate understanding of the welfare consequences of policy choices by incorporating that shadow price for emissions.
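The fat-tail point behind Weitzman's dismal theorem can be made numerically. The sketch below is my own illustration, not drawn from any of the papers quoted, and the Pareto exponent is an arbitrary choice: under a thin (normal) tail the probability of an extreme outcome shrinks faster than the damage grows, so extreme terms contribute almost nothing to the expected loss; under a fat (power-law) tail they can dominate it.

```python
import math

def normal_tail(x):
    """P(X > x) for a standard normal (thin tail)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def pareto_tail(x, alpha=1.5):
    """P(X > x) for a Pareto tail with exponent alpha (hypothetical choice)."""
    return x ** (-alpha) if x >= 1 else 1.0

# Contribution of the "damage at least x" region, with damage proportional
# to x: the thin-tail terms vanish, the fat-tail terms shrink very slowly.
for x in [2, 4, 8, 16]:
    print(x, x * normal_tail(x), x * pareto_tail(x))
```

With an exponent of one or below, the fat-tailed expectation diverges outright, which is the flavor of Weitzman's "literally infinite" marginal benefit of emission reduction.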

Not much time this morning, but I’d like to respond to the dustup between Mark Buchanan and Paul Krugman over whether energy (and other resource) use can be decoupled from GDP growth.

1. I get the impression that Buchanan identifies GDP with “stuff”, at least subconsciously. But GDP is value, what people are willing to pay for. When I teach an econ class, that’s a component of GDP. If an extra student shows up, that’s GDP growth. Of course, a lot of GDP really is stuff, but as economies develop they tend to become less stuffy. Overall not enough, but how unstuffy they could become is an empirical, not a theoretical matter.

2. But I agree with Buchanan that increases in energy efficiency alone are unlikely to accomplish what we need to contain climate change. Absent changes on other fronts, the growth of demand for energy services will simply swamp the effect of greater efficiency. This has been true in the past and any realistic projection puts it in our future as well.

3. And Buchanan is also right that there is no historical precedent for the kind of decoupling between economic growth and fossil fuel use (let’s be specific here) that we would need to meet both economic and climate goals. The notion of foregoing most of our remaining supplies of extremely energy-dense minerals flies in the face of all of human history. That’s why it will be a big challenge to bring it off. The challenge begins with putting in place a policy that prohibits most fossil fuel development. You can discuss the particulars, but there is no getting around the need for a binding constraint. The reason is exactly the one that Buchanan pinpoints: increases in efficiency and even in renewable energy sources will not be sufficient by themselves to offset the energy demands stemming from global GDP growth.

Can we compel most fossil fuels to stay in the ground and still have economic growth? Since this is about growth in value and not necessarily stuff, the answer still seems to be yes. But we won’t have a sufficient shift away from stuffiness without measures that prohibit dangerous levels of fossil fuel extraction. An unprecedented change in the trajectory of economic growth requires unprecedented policies.

Postscript: I’m not interested in whether “unlimited” economic growth at some distant future date is incompatible with resource constraints. It’s also true that economic growth can’t continue after the universe collapses in on itself. We’ll let distant future people, if they still exist, worry about this.

Wolfgang Schäuble, the German finance minister, speaking in Washington on Thursday, insisted that “writing checks” was no way for the eurozone to increase growth, according to Reuters. Mr. Schäuble urged France and Italy to do more to overhaul their economies instead.

Yes, the first priority is for the rest of the Eurozone to be more like Germany. They can begin by requiring firms to belong to industry associations that tax their members to finance apprenticeship and other programs, putting worker representatives on firms’ supervisory bodies and transferring the majority of banking assets to noncommercial (public and cooperative) institutions. That’s Schäuble’s plan, right?

Thursday, October 9, 2014

Paul Krugman's column the other day, invoking William Nordhaus's "demolition" of the forecasting model in The Limits to Growth, got me wondering about how well Nordhaus's indictment has stood up over the years. So I started poking around in the archives.

Twenty years after publication of Limits to Growth, the research team reconvened in 1992 for Beyond the Limits, an update of the earlier analysis. Nordhaus, too, followed up his earlier "blistering" review with a critique of the second version. This second review was more conciliatory, albeit still critical of the Limits to Growth and Beyond the Limits assumptions and conclusions:

While the LTG school argued that economic decline was inevitable and economists argued that the LTG argument was fallacious, the argument is ultimately an empirical matter. Put differently, critics would have gone too far had they claimed that the postulated pessimistic scenario could not hold.

Instead of simply "demolishing" the LTG model, Nordhaus responded in his second review with his own simple model, using a more conventional generalized Cobb-Douglas production function.

Like LTG models, the general model given in the last section shows the tendency toward economic decline. In addition, there are no less than four conditions, each of which is satisfied in the LTG model, that will lead to ultimate economic stagnation, decline, or collapse...

[However]...the entire argument can be reversed with a simple change in the specification of the model; more precisely, I will introduce technological change into the production structure and assume that the Cobb-Douglas production function accurately represents the technological possibilities for substitution.

...the debate about future of economic growth is an empirical one, and resolving the debate will require analysts to examine fundamental structural parameters of the economy... How large are the drags from natural resources and land? What is the quantitative relationship between technological change and the resource-land drag? How does human population growth behave as incomes rise? How much substitution is possible between labor and capital on the one hand, and scarce natural resources, land, and pollution abatement on the other? These are empirical questions that cannot be settled solely by theorizing.
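The reversal Nordhaus describes, in which technological change overturns the tendency toward decline, can be illustrated with a toy Cobb-Douglas calculation. This is my own sketch with invented parameters, not Nordhaus's actual specification:

```python
# Toy Cobb-Douglas economy with a depleting resource input: output is
# Y = A(t) * R(t)^b, with labor and capital held fixed (absorbed into A).
# Without technological change output declines; with TFP growth that
# outpaces the resource drag, it grows. All parameters are invented.

def output(t, tech_growth, resource_share=0.1, depletion=0.02):
    A = (1 + tech_growth) ** t    # total factor productivity
    R = (1 - depletion) ** t      # depleting resource input
    return A * R ** resource_share

# With zero technical change output falls; with 1% TFP growth it rises,
# because 1% per year exceeds the drag of 0.1 * 2% per year.
for t in [0, 50, 100]:
    print(t, round(output(t, 0.0), 3), round(output(t, 0.01), 3))
```

Which trajectory we are on is exactly the empirical question Nordhaus poses: how large is the resource drag, and how fast does technology offset it?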

One of the discussants for Nordhaus's 1992 Brookings paper was Martin Weitzman, who described it as "an outstanding paper" that "represents the economic state of the art, circa 1992, in dealing seriously and honestly with the major limits-to-growth arguments." One could almost imagine hearing the scalpel being quietly honed as Weitzman administered that subtle anesthesia.

Fast forward another two decades and it is Nordhaus's turn to comment on a paper by Weitzman, "On modeling and interpreting the economics of catastrophic climate change."

In an important paper, Weitzman (2009) has proposed what he calls a dismal theorem. He summarizes the theorem as follows: "[T]he catastrophe-insurance aspect of such a fat-tailed unlimited-exposure situation, which can never be fully learned away, can dominate the social-discounting aspect, the pure-risk aspect, and the consumption-smoothing aspect." The general idea is that under limited conditions concerning the structure of uncertainty and societal preferences, the expected loss from certain risks such as climate change is infinite and that standard economic analysis cannot be applied.

Nordhaus concluded his discussion of Weitzman's theorem on a somber and humble note:

In many cases, the data speak softly or not at all about the likelihood of extreme events. This means that reasonable people may have quite different views about the likelihood of extreme events, such as the catastrophic outcomes of climate change, and that there are no data to adjudicate such disputes. This humbling thought applies more broadly, however, as there are indeed deep uncertainties about virtually every issue that humanity faces, and the only way these uncertainties can be resolved is through continued careful consideration and analysis of all data and theories.