
When I first read Coase’s (1984: 230) description of the collected works of the old-school institutionalists – as “a mass of descriptive material waiting for a theory, or a fire” – I thought it was (a) hysterically funny and (b) surely dead-on (even though I had not read this work). Sometime later, I encountered Krugman’s (1995: 27) assertion that “Like it or not, … the influence of ideas that have not been embalmed in models soon decays.” I think my reaction to Krugman was almost as enthusiastic as my reaction to Coase, although I hope the word “embalmed” gave me at least some pause. But then I made it to Krugman’s contention that a prominent model in economic geography “was the one piece of a heterodox framework that could easily be handled with orthodox methods, and so it attracted research out of all proportion to its considerable merits” (p. 54). At this point, I stopped reading and started trying to think.

This is really important, fundamental stuff. I’ve been interested in it for a while (e.g. my previous thoughts on “mainstream” economics and the use of mathematics in economics). Beyond the movement of economics as a discipline towards formal (i.e. mathematical) models as a methodology, there has even been a movement towards particular types or styles of model. See, for example, the summary – and the warnings given – by Olivier Blanchard [MIT] regarding methodology in his recent paper “The State of Macro”:

That there has been convergence in vision may be controversial. That there has been convergence in methodology is not: Macroeconomic articles, whether they be about theory or facts, look very similar to each other in structure, and very different from the way they did thirty years ago.
…
[M]uch of the work in macro in the 1960s and 1970s consisted of ignoring uncertainty, reducing problems to 2×2 differential systems, and then drawing an elegant phase diagram. There was no appealing alternative – as anybody who has spent time using Cramer’s rule on 3×3 systems knows too well. Macro was largely an art, and only a few artists did it well. Today, that technological constraint is simply gone. With the development of stochastic dynamic programming methods, and the advent of software such as Dynare – a set of programs which allows one to solve and estimate non-linear models under rational expectations – one can specify large dynamic models and solve them nearly at the touch of a button.
…
Today, macro-econometrics is mainly concerned with system estimation … Systems, characterized by a set of structural parameters, are typically estimated as a whole … Because of the difficulty of finding good instruments when estimating macro relations, equation-by-equation estimation has taken a back seat – probably too much of a back seat.
…
DSGE models have become ubiquitous. Dozens of teams of researchers are involved in their construction. Nearly every central bank has one, or wants to have one. They are used to evaluate policy rules, to do conditional forecasting, or even sometimes to do actual forecasting. There is little question that they represent an impressive achievement. But they also have obvious flaws. This may be a case in which technology has run ahead of our ability to use it, or at least to use it best:

The mapping of structural parameters to the coefficients of the reduced form of the model is highly non linear. Near non-identification is frequent, with different sets of parameters yielding nearly the same value for the likelihood function – which is why pure maximum likelihood is nearly never used … The use of additional information, as embodied in Bayesian priors, is clearly conceptually the right approach. But, in practice, the approach has become rather formulaic and hypocritical.

Current theory can only deliver so much. One of the principles underlying DSGEs is that, in contrast to the previous generation of models, all dynamics must be derived from first principles. The main motivation is that only under these conditions, can welfare analysis be performed. A general characteristic of the data, however, is that the adjustment of quantities to shocks appears slower than implied by our standard benchmark models. Reconciling the theory with the data has led to a lot of unconvincing reverse engineering
…
This way of proceeding is clearly wrong-headed: First, such additional assumptions should be introduced in a model only if they have independent empirical support … Second, it is clear that heterogeneity and aggregation can lead to aggregate dynamics which have little apparent relation to individual dynamics.
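As an aside, Blanchard’s jab about Cramer’s rule is worth unpacking for anyone who hasn’t suffered through it. This is my reminder of the rule, not anything from his paper: for a linear system \(A x = b\) with \(A\) a \(3 \times 3\) matrix,

```latex
% Cramer's rule for a 3x3 system A x = b
x_i = \frac{\det(A_i)}{\det(A)}, \qquad i = 1, 2, 3,
% where A_i is A with its i-th column replaced by b. For example:
\det(A) =
\begin{vmatrix}
a_{11} & a_{12} & a_{13} \\
a_{21} & a_{22} & a_{23} \\
a_{31} & a_{32} & a_{33}
\end{vmatrix}
= a_{11}(a_{22}a_{33} - a_{23}a_{32})
- a_{12}(a_{21}a_{33} - a_{23}a_{31})
+ a_{13}(a_{21}a_{32} - a_{22}a_{31})
```

Solving the full system this way means evaluating four such determinants (one for \(\det(A)\) and one for each \(A_i\)), each expanding into six signed products – all by hand, and typically with symbolic rather than numeric entries. That tedium is exactly the “technological constraint” Blanchard says is now gone.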

I’ve been wanting to write an essay on this for ages, but every time I think or talk to someone about it, I get hit with more ideas and different approaches. In the interests of not forgetting them, I thought it might be worthwhile formalising, if not my opinions, then at least the topics that I want to write on. I’m very interested in people’s opinions on these, so if you have a particular view, please leave some comments.

Economics as an expression of ideology

Language choice as:

(+ve) a means of aiding communication in a specialised field

(+ve) a means of enforcing definitional and therefore intellectual rigour [e.g. arguments over the meaning of “market failure”]

(~) a shaper of methodology

(~) a signal of author competence or paper quality [e.g. “the market for lemmas” or the comment made by a French philosopher, mentioned by Daniel Dennett in a footnote of his book “Breaking the Spell”]

(-ve) an embodiment of ideology or bias [e.g. 95% of the work in feminist interpretation of literature seems to consist of highlighting this sort of thing]

(-ve) a barrier to outside comment or involvement

The fact that mathematics in general and modelling in particular are each a choice of language

“All models are wrong, but some are useful” — George Box

The different purposes of models:

to explore the implications of particular assumptions [moving forwards]

to illustrate the possibility (or plausibility) of a particular outcome [moving backwards]

to explain an observed outcome, or a collection of observed outcomes [moving backwards]

The importance of context [e.g. what is valid at the individual level may not be at the aggregate level]

Fashions and fads in academia. The conflict between:

The need to tackle “the big issues”

The desire to stand out (do something different)

The impulse to follow the leader or jump on the bandwagon

The (incentive-driven?) need to publish rapidly, frequently and consistently [i.e. the mantra of “publish or perish”]

The desire to influence real-world policy or public opinion

Heuristics in academia. Rules-of-thumb, or a preference for particular techniques. Is it “better” to learn a few types of model extremely well than to learn several models reasonably well? It does allow researchers to jump onto a new topic and produce a few papers very quickly … [e.g. this]

Mainstream conclusions (or opinions) versus mainstream methodology

How to move the mainstream:

Stay in and push, or jump out and call to those still in? [e.g. See, in particular, all the discussion around the blogosphere on heterodoxy vs. orthodoxy and Keynesianism vs. Neoclassicism before, during and after this comment by Brad DeLong]