“The distinction between wage and profit-led growth is a major feature of neo-Kaleckian
growth theory. The essence of the distinction is that in a wage-led economy an increase
in the wage share (i.e. a decrease in the profit share) increases economic activity and
growth, whereas in a profit-led economy it has the reverse effects. This distinction has
important implications for policy, especially in the current environment of stagnation and high unemployment. If economies are wage-led, it suggests policy that increases the
wage share is a powerful means of raising growth and lowering unemployment. The
converse holds for economies that are profit-led …

These policy implications have triggered an extensive econometric literature that
aims to identify whether economies and economic regions are wage or profit-led. The
implicit fundamental assumption within that empirical literature is that an economy’s or an
economic region’s character (i.e. whether it is wage or profit-led) is exogenously
determined by deep primitive parameters. The current paper questions that assumption
and explores the foundations of what determines whether an economy is wage or profit-led …

The theoretical analysis gives rise to a Post-Keynesian analogue of the Lucas critique (Lucas, 1976). Lucas argued that the estimated econometric impact of policy was endogenous and depended on agents’ expectations of policy. In like vein, the current paper shows that whether an economy is wage or profit-led will depend on existing policies. Consequently, it is not possible to classify an economy as being intrinsically wage or profit-led. Instead, the econometrically identified character of the economy is contingent on policy and may change with changes in policy.

At the policy level, the paper shows that the growth–inequality trade-off posed by profit-led economies can be finessed by changing the distribution of the wage share. Consequently, it may be possible to have faster growth and less inequality in economies that appear profit-led. Even more significantly, if the wage distribution is changed sufficiently, the economy can flip from being profit-led to being wage-led.”
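The wage-led/profit-led distinction quoted above can be made concrete with a textbook Bhaduri–Marglin-style closure (my notation and simplifications, not Palley’s own model). Suppose workers consume all wages, capitalists save at rate $s_\pi$ out of profits, and investment responds to capacity utilization $u$ and the profit share $\pi = 1-\omega$ (where $\omega$ is the wage share):

```latex
% Goods-market equilibrium: saving = investment (both per unit of capital)
s_\pi \pi u \;=\; \gamma_0 + \gamma_u u + \gamma_\pi \pi
\quad\Longrightarrow\quad
u^{*} = \frac{\gamma_0 + \gamma_\pi \pi}{s_\pi \pi - \gamma_u}, \qquad
\frac{\partial u^{*}}{\partial \pi}
  = \frac{-\,\gamma_\pi \gamma_u - s_\pi \gamma_0}{(s_\pi \pi - \gamma_u)^{2}} < 0 .
```

In this particular closure utilization always rises with the wage share, but the growth rate $g^{*} = \gamma_0 + \gamma_u u^{*} + \gamma_\pi \pi$ can still rise with the profit share when the profitability effect on investment ($\gamma_\pi$) is strong enough relative to the demand effect, which is how an economy can come out “profit-led” in the data.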

Inspired by the work of Doctors Without Borders (Médecins Sans Frontières), I have recently started a project called Economists Without Borders (Economistes Sans Frontières). Its purpose is to inoculate the global economy against the virus of neoliberalism. Last week, I had two difficult “missions” to Vienna and Warsaw.

In Vienna, I confronted an outbreak of the neoliberal globalization – free trade strain of the virus. Without doubt, this is the most virulent and dangerous of all strains. People who get infected become blind to all evidence, deaf to all argument and prone to intellectual condescension. Massachusetts Avenue in Washington DC is a hot zone of infection. The bad news is that if you are over forty and infected it is doubtful you can be cured. However, younger patients have a chance of recovery. Here is the anti-viral I prescribed, titled “The Theory of Global Imbalances: Mainstream Economics vs. Structural Keynesianism”.

In Warsaw, I confronted an outbreak of Milton Friedmanism, which is one of the oldest strains of neoliberal virus. Friedmanism is a gateway virus that weakens defenses against other neoliberal strains, and younger minds are particularly susceptible to it. The good news is that if diagnosed early there is a good chance of recovery. However, if treatment is delayed, intellectual ossification and closed-mindedness set in. This ossification is almost always associated with inflation obsessive compulsive disorder and austerity fever. Here is the treatment I recommend, titled “Milton Friedman’s Economics and Political Economy: An Old Keynesian Critique”.

The Scandinavian economies top many polls on happiness and living standards. But they also have the worrying distinction of leading the developed world in household debt.

Denmark has the highest household debt-to-disposable income ratio among the world’s richest countries at 310 per cent. Norway and Sweden are not far behind with ratios of 200 and 170 per cent respectively, according to the Organisation for Economic Cooperation and Development.

Policy makers have taken different approaches in each country. In Denmark, officials seem relaxed, arguing that Danes have lots of assets and are able to withstand rising interest rates. “The threat to financial stability from [household debt] is therefore not serious in the current situation,” Lars Rohde, governor of the central bank, told Bloomberg this year.

But the Riksbank in Sweden has long worried about rising house prices and household debt levels. The central bank sought (and failed) to gain control over macroprudential policy – measures designed to ensure financial stability, such as capping the amount home buyers can borrow for a mortgage. Instead, macroprudential policy was given to Sweden’s Financial Supervisory Authority, which is already tightening mortgage rules. Today, Swedes only have to repay the interest on mortgages, meaning some loans take as many as 140 years to repay.
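The “140 years” figure is simple arithmetic on amortization rates. A stylized sketch (all numbers hypothetical, not actual Swedish data):

```python
def years_to_repay(principal, annual_amortization):
    """Years needed to clear a loan when a fixed amount of principal is paid
    each year (interest payments are separate and do not reduce the balance)."""
    return principal / annual_amortization

loan = 2_000_000.0                           # hypothetical mortgage, SEK
slow = years_to_repay(loan, 0.007 * loan)    # amortizing only ~0.7% of principal a year
print(round(slow))                           # -> 143
```

At a zero amortization rate the loan is never repaid at all, which is why interest-only lending stretches effective maturities toward figures like 140 years.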

The FSA this month proposed that new mortgage holders would have to pay down half of their loans.

The impact of such measures is hotly disputed. Distortions persist in the Swedish and Norwegian housing markets, where there is far greater demand than supply in the biggest cities. Borrowers in both countries are also able to claim tax relief on mortgage interest.

Some argue that reforming these distortions would be the most important change authorities could make. Christian Clausen, chief executive of Nordea, says politicians need to take responsibility for helping create asset bubbles.

On Sweden’s tax incentives to own a house, he adds: “You have a screwed system that wants to over-leverage in principle in an asset that you don’t want to over-leverage on.”

One may wonder how much calibration adds to the knowledge of economic structures and the deep parameters involved … Micro estimates are imputed in general equilibrium models which are confronted with new data, not used for the construction of the imputed parameters … However this procedure to impute parameter values into calibrated models has serious weaknesses …

First, few ‘deep parameters’ have been established at all …

Second, even where estimates are available from micro-econometric investigations, they cannot be automatically imported into aggregated general equilibrium models …

Third, calibration hardly contributes to growth of knowledge about ‘deep parameters’. These deep parameters are confronted with a novel context (aggregate time-series), but this is not used for inference on their behalf. Rather, the new context is used to fit the model to presumed ‘laws of motion’ of the economy …

This leads to the fourth weakness. The combination of different pieces of evidence is laudable, but it can be done with statistical methods as well … This statistical approach has the advantage that it takes the parameter uncertainty into account: even if uncontroversial ‘deep parameters’ were available, they would have standard errors. Specification uncertainty makes things even worse. Neglecting this leads to self-deception.
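Keuzenkamp’s last point is easy to demonstrate: plugging a point-calibrated parameter into a model yields a single number, while propagating its standard error yields a band. A toy sketch, where the “model”, the estimate and its standard error are all hypothetical:

```python
import random
import statistics

random.seed(42)

def model_output(beta):
    """Toy 'model': the statistic a calibrator would report given deep parameter beta."""
    return 1.0 / (1.0 - beta)

beta_hat, se = 0.60, 0.05        # hypothetical micro estimate and its standard error

point = model_output(beta_hat)   # calibration: a single number, no error band

# Statistical alternative: propagate the standard error by simulation.
draws = [model_output(random.gauss(beta_hat, se)) for _ in range(10_000)]
cuts = statistics.quantiles(draws, n=40)   # cut points at 2.5%, 5%, ..., 97.5%
lo, hi = cuts[0], cuts[-1]                 # approximate 95% interval

print(f"calibrated value: {point:.2f}, 95% band: [{lo:.2f}, {hi:.2f}]")
```

Even this crude Monte Carlo exercise makes the self-deception visible: the single calibrated number sits inside a wide interval that the calibration procedure simply never reports.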

There are many kinds of useless economics held in high regard within the mainstream economics establishment today. Few — if any — deserve that regard less than the macroeconomic theory/method — mostly connected with Nobel laureates Finn Kydland, Robert Lucas, Edward Prescott and Thomas Sargent — called calibration.

Hugo Keuzenkamp and yours truly are certainly not the only ones having doubts about the scientific value of calibration. In the Journal of Economic Perspectives (1996, vol. 10), Lars Peter Hansen and James J. Heckman write:

It is only under very special circumstances that a micro parameter such as the inter-temporal elasticity of substitution or even a marginal propensity to consume out of income can be ‘plugged into’ a representative consumer model to produce an empirically concordant aggregate model … What credibility should we attach to numbers produced from their ‘computational experiments’, and why should we use their ‘calibrated models’ as a basis for serious quantitative policy evaluation? … There is no filing cabinet full of robust micro estimates ready to use in calibrating dynamic stochastic equilibrium models … The justification for what is called ‘calibration’ is vague and confusing.

Given that “calibration” purposefully forsakes error probabilities and provides no way to assess the reliability of inference, how does one assess the adequacy of the calibrated model? …

The idea that it should suffice that a theory “is not obscenely at variance with the data” (Sargent, 1976, p. 233) is to disregard the work that statistical inference can perform in favor of some discretional subjective appraisal … it hardly recommends itself as an empirical methodology that lives up to the standards of scientific objectivity.

In physics it may possibly not be straining credulity too much to model processes as ergodic – where time and history do not really matter – but in social and historical sciences it is obviously ridiculous. If societies and economies were ergodic worlds, why do econometricians fervently discuss things such as structural breaks and regime shifts? That they do is an indication of how unrealistic it is to treat open systems as analyzable with ergodic concepts.
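The ergodicity point can be illustrated with a standard toy simulation (mine, not specific to any of the authors discussed): for a stationary series, every history’s time average lands in the same place; for a random walk, where history matters, the time averages scatter across realizations and converge to nothing.

```python
import random
import statistics

def time_average(n, seed, random_walk):
    """Time average of white noise, or of its running sum (a random walk)."""
    rng = random.Random(seed)
    x, total = 0.0, 0.0
    for _ in range(n):
        step = rng.gauss(0, 1)
        x = x + step if random_walk else step   # random walk accumulates history
        total += x
    return total / n

n = 50_000
stationary = [time_average(n, s, random_walk=False) for s in range(10)]
walks      = [time_average(n, s, random_walk=True)  for s in range(10)]

# Stationary (ergodic) case: every history's time average is near the
# ensemble mean of 0. Random walk (non-ergodic): time averages scatter widely.
print(statistics.pstdev(stationary), statistics.pstdev(walks))
```

The spread of time averages across random-walk histories is orders of magnitude larger than in the stationary case, which is exactly what structural breaks and regime shifts signal in real-world data.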

The future is not reducible to a known set of prospects. It is not like sitting at the roulette table and calculating what the future outcomes of spinning the wheel will be. Reading Lucas, Sargent, Prescott, Kydland and other calibrationists one comes to think of Robert Clower’s apt remark that

much economics is so far removed from anything that remotely resembles the real world that it’s often difficult for economists to take their own subject seriously.

Instead of assuming calibration and rational expectations to be right, one ought to confront the hypothesis with the available evidence. It is not enough to construct models. Anyone can construct models. To be seriously interesting, models have to come with an aim. They have to have an intended use. If the intention of calibration and rational expectations is to help us explain real economies, it has to be evaluated from that perspective. A model or hypothesis without a specific applicability is not really deserving of our interest.

To say, as Edward Prescott does, that

one can only test if some theory, whether it incorporates rational expectations or, for that matter, irrational expectations, is or is not consistent with observations

is not enough. Without strong evidence all kinds of absurd claims and nonsense may pretend to be science. We have to demand more of a justification than this rather watered-down version of “anything goes” when it comes to rationality postulates. If one proposes rational expectations one also has to support its underlying assumptions. None is given, which makes it rather puzzling how rational expectations has become the standard modeling assumption made in much of modern macroeconomics. Perhaps the reason is, as Paul Krugman has it, that economists often mistake

beauty, clad in impressive looking mathematics, for truth.

But I think Prescott’s view is also the reason why calibration economists are not particularly interested in empirical examinations of how real choices and decisions are made in real economies. In the hands of Lucas, Prescott and Sargent, rational expectations has been transformed from an – in principle – testable hypothesis to an irrefutable proposition. Believing in a set of irrefutable propositions may be comfortable – like religious convictions or ideological dogmas – but it is not science.

No philosopher of science has influenced my own thinking more than Roy did.

Rest in peace my dear friend.

What properties do societies possess that might make them possible objects of knowledge for us? My strategy in developing an answer to this question will be effectively based on a pincer movement. But in deploying the pincer I shall concentrate first on the ontological question of the properties that societies possess, before shifting to the epistemological question of how these properties make them possible objects of knowledge for us. This is not an arbitrary order of development. It reflects the condition that, for transcendental realism, it is the nature of objects that determines their cognitive possibilities for us; that, in nature, it is humanity that is contingent and knowledge, so to speak, accidental. Thus it is because sticks and stones are solid that they can be picked up and thrown, not because they can be picked up and thrown that they are solid (though that they can be handled in this sort of way may be a contingently necessary condition for our knowledge of their solidity).
