Friday, November 24, 2006

[This is the pre-edited text of my latest muse for Nature, which relates to a paper published in the 16 November issue on health and safety issues in nanotechnology.]

Discussions about the risk of emerging technologies must acknowledge that their major impacts have rarely been spotted in advance.

In today's issue of Nature, an international team of scientists presents a five-point scheme for "the safe handling of nanotechnology"[1]. "If the global research community can rise to the challenges we have set", they say, "then we can surely look forward to the advent of safe nanotechnologies".

The five targets that the team sets for addressing potential health risks of nanotechnologies are excellent ones, involving the assessment of toxicities, prediction of impacts on the environment, and establishment of a general strategy for risk-focused research. In particular, the goals are aimed at determining the risks of engineered nanoparticles – how they might enter and move in the environment, to what extent humans might be exposed, and what the consequences of that will be. We need to know all these things with some urgency.

But what is a "safe technology"? If safety means nothing more than freedom from harmful exposure, then manufacturing nuclear warheads would be "safe" so long as no human was exposed to dangerous levels of radiation in the process that leads from centrifuge to silo.

To be fair, no one denies that a technology's 'safety' depends on how it is used. The proposals for mapping nanotech's risks are clearly aimed at a very specific aspect of the overall equation, concerned only with the fundamental question of whether (and how much) exposure to nanotechnological products is bad for our health. But this highlights the curious circumstance that new technologies now seem required to carry out a risk assessment at their inception, ideally in parallel with public consultation and engagement to decide what should and shouldn't be permitted.

There is no harm in that. And there's plenty of scope for being creative about it. Some of the broader ethical issues associated with nanotech, for example, are being explored in the US through a series of public seminars organized by the public-education company ICAN Productions. Funded by the US National Science Foundation, ICAN is creating three one-hour seminars in which participants, including scientists, business leaders and members of the public, explore scenarios that illuminate plausible impacts of nanotech on daily life. The results will be presented on US television by Oregon Public Broadcasting in spring of 2007.

Yet history must leave us with little confidence that either research programs or public debates will anticipate all, or even the major, social impacts of a new technology. We smile now at the belief that road safety could be assured by having every automobile preceded by a man waving a red flag. In those early days, the pollution caused by cars was barely on the agenda, while the notion that it might affect the global climate would have seemed positively bizarre.

Of course, it is something of a cliché now to say that neither the internal combustion engine nor smoking would ever have been permitted if we knew then what we know now about their dangers. But the point is that we never do – it is hard to identify any important technology for which the biggest risks have been clear in advance.

And even if some risks are foreseen, scientists generally lose the ability to do anything about them once the technology reacts with society. Nuclear proliferation was forecast and feared by many of the Manhattan Project physicists, but politicians and generals treated their proposals for avoiding it with contempt (give away secrets to the Russians, indeed!). It took no deep understanding of evolution to foresee the emergence of antibiotic-resistant bacteria, but that didn't prevent profligate over-prescription of the drugs. The dangers of global warming have been known since at least the 1980s, and… well, say no more.

In the case of nanotechnology, there have been discussions of, for example, its likelihood of increasing the gap between rich and poor nations, its impacts on surveillance and privacy, and the social effects of nanotech-enhanced longevity. These are all noble attempts to look beyond the pure science, but it's not at all clear that they will turn out to be the most relevant issues.

Part of the impetus for addressing the 'risks' of nanotech so early in the game comes from a fear that potentially valuable applications could be derailed by a public backlash like the one that led Europe to reject genetically modified organisms – a rejection that stemmed partly (though by no means wholly) from a general lack of information or understanding about the technology, as well as from an arrogant assumption of consumer acquiescence.

The GMO experience has sensitized scientists to the need for early public engagement, and again that is surely a good thing. It's also encouraging to find scientists and even industries hurrying along governments to do more to support research into safety issues, and to draft regulations.

What they must avoid, however, is giving the impression that emerging technologies are like toys that can be 'made safe' before being handed to a separate entity called society to play with as it will. Technologies are one of the key drivers of social change, for better or worse. They simply do not exist in isolation from the society that generates them. Not only can we not foresee all their consequences, but some of those consequences aren't present even in principle until culture, sociology, economics and politics (not to mention faith) enter the arena.

Some technologies are no doubt intrinsically 'safer' or 'riskier' than others. But the more powerful they are, the less able we are to distinguish which is which, or to predict how that will play out in practice. Let's by all means look for obvious dangers at the outset – but scientists must also look for ways to become more engaged in the shaping of a technology as it unfolds, while dismantling the now-pervasive notion that all innovations must come with a 'risk-free' label.

Monday, November 20, 2006

Hooke: what came next?

I went to a nice talk by Lisa Jardine at the (peripatetic) Royal Institution last week, on the newly discovered notes of Robert Hooke. Lisa and her students have been studying this portfolio of notes taken by Hooke in his capacity as secretary of the Royal Society since they were rescued from auction and returned to the Royal Society earlier this year (see my earlier blog entry in May). She says that they have completely transformed her view of Hooke since writing his biography (The Curious Life of Robert Hooke, HarperCollins) in 2003. One of the hazards of being a historian, she pointed out, is that you can never be sure what may come to light and overturn the opinions you have previously presented to the world with such blithe authority. Well, that happens in science too, of course.

Lisa is now convinced that Hooke himself, not Newton, was his own worst enemy: he was a terrible record keeper, and never finished anything. For all his protestations of priority over Huygens in regard to the invention of the spring-balance pocket watch, it seems that he may have sunk his claim himself. The new notes include a page taken from the private notes of Henry Oldenburg, written in 1670, in which Oldenburg relates how Hooke presented such a watch (Huygens’ version was patented in 1674), and then leaves a space for a description of the mechanism, apparently for Hooke himself to fill in the details. Hooke seems to have done this sketchily in pencil, but his words got worn away and he never amended them more permanently. Oldenburg got no further than a few lines in trying to transcribe them before apparently giving up and crossing the whole lot out. But worst of all, Hooke seems to have filched the page from Oldenburg’s papers after the latter’s death, in the course of preparing his priority claim in obsessive detail – and then promptly left it buried in his own notes, where it has only now surfaced. So when Oldenburg’s papers were later checked to assess Hooke’s claim, there was no sign of this page!

I’m also interested to hear that the Hooke pages are shortly to be ‘conserved’ – which means that the book will be taken apart and each page placed inside plastic (Lisa says the pages are already literally falling apart beneath their fingers). So I’ll be one of the few ever to have touched the originals, and to have seen the book in its pristine form. Phew.

Wednesday, November 15, 2006

Economists as storytellers

Economist blogger Dave Iverson has written to me about my “tease” (his words, nice choice) in the Financial Times about neoclassical economics. Dave has previously commented in a way that I found insightful and fair on the exchanges and debates in the blogosphere, particularly those on Mark Thoma’s and Dave Altig’s sites. His latest post is another useful contribution, and here it is:

“Philip Ball's Financial Times critique of economics, titled 'Baroque Fantasies of a Peculiar Science', caused quite a stir recently in the economics blogs (particularly here, here, here and here). But last week the bickering subsided, with Dave Altig (macroblog) and Philip Ball seeming to have reached an accord. At one point Altig said, "If you want, call economics an attempt to construct coherent stories about social phenomenon..." Sounds about right to me. We economists are indeed storytellers. Following this discussion, it seems clear that economists need to be much more open and honest about our assumptions and the linkages, such as they are and often are not, to the real world of policy and action. No argument from me on that score. I've been arguing similarly for years.

“For more critique, see Steve Cohn's August 2002 Telling Other Stories: Heterodox Critiques of Neoclassical Micro Principles Texts, wherein Cohn attacks the "'rhetoric' of neoclassical theory, …critiquing many of the stories told, the metaphors used, the analogies drawn, and the framing language deployed."

“In addition, there have been many book-form critiques arguing that economists, particularly neoclassical economists, have over-driven their headlights in much the same way that Ball argues. Here are six of my favorites (arranged by date of publication):

J. de V. Graaff. Theoretical Welfare Economics. 1957
Guy Routh. The Origin of Economic Ideas. 1975
Mark Blaug. The Methodology of Economics: Or How Economists Explain. 1980
Robert L. Heilbroner. Behind the Veil of Economics: Essays in the Worldly Philosophy. 1988
Mark Sagoff. The Economy of the Earth: Philosophy, Law and the Environment. 1988
Andrew Bard Schmookler. The Illusion of Choice: How the Market Economy Controls Our Destiny. 1993”

The Cohn paper is excellent – it says pretty much all of what I said in the FT article and much more, and in more depth, and frankly more persuasively. I particularly liked this, in relation to Paul Ormerod’s FT critique of how the textbooks tell the same old neoclassical story, despite what some of the practitioners are now doing to the contrary:

“We shouldn’t allow neoclassical economists to “run away” from their textbooks. The tracts educate well over a million students a year and lay the groundwork for much of educated opinion about economic issues. They should be defended or abandoned. In critiquing principles texts we should quote from the books themselves and if charged with attacking straw men, ask who is to blame: the textbook authors who built these scarecrows, or the photographers who took their picture?”

In any event, I offer the Cohn paper to those who say I’ve misrepresented the field (or have misused the word ‘neoclassical’). And I do so partly because Cohn seems to me to be very fair, acknowledging (in a way that I admit I could have done more explicitly) some of the ways in which modern economics has moved beyond the simplistic picture. This seems to me to be about dialogue rather than attack – which is absolutely what I’d like to see.

Tuesday, November 14, 2006

Was life inevitable?

Here’s the unexpurgated version of my latest story for news@nature. There’s a lot of really interesting back story here, which I hope to return to at some point. This is some of the most interesting “origin of life” work I’ve seen for quite a while.

Life may be the ultimate in planetary stress relief, a new theory claims

The appearance of life on Earth seems to face so many obstacles that scientists often feel forced to regard it almost as miraculous. Now two scientists working at the Santa Fe Institute in New Mexico suggest that, on the contrary, it may have been inevitable.

They argue that life was the necessary consequence of the build-up of available energy on the early Earth, thanks to purely geological processes. They regard it as directly analogous to the way lightning relieves the build-up of electrical charge in thunderclouds.

In other words, say Harold Morowitz and Eric Smith in a preprint posted on the Santa Fe Institute archive [1], the geological environment "forced life into existence".

This view, the researchers say, implies not only that life had to emerge on the Earth, but that the same would happen on any similar planet. And they hope that ultimately it will be possible to predict the first steps in the origin of life based on the laws of physics and chemistry alone.

Their proposal is "instructive and inspiring", says Michael Russell, a specialist in the origin of life at the California Institute of Technology in Pasadena.

Morowitz and Smith admit that they don't yet have the theoretical tools to clinch their arguments, or to show what form this "inevitable life" must take. But they argue that it is likely to have used the same chemical processes that now drive our own metabolism – but in reverse.

They say that the young Earth would have been accumulating energy from geological processes much as a dam accumulates gravitational potential energy by piling up water. Sooner or later, something had to give.

One source of such energy would have been energy-rich compounds called polyphosphates, generated in volcanic processes. These are 'battery molecules', analogous to the compound ATP, the ubiquitous source of metabolic energy in living cells.

Another source would have been hydrogen molecules, which are likely to have been abundant in the early atmosphere even though they are almost absent today. Hydrogen would have been generated, for example, by reactions between seawater and dissolved iron.

Energy-releasing reactions between hydrogen and carbon dioxide (a volcanic gas) in the atmosphere can produce complex organic molecules, the precursors of living systems.

In our own metabolism, a series of biochemical reactions called the citric-acid cycle breaks down organic compounds from food into carbon dioxide. Morowitz and Smith say that the energy reservoirs of the young Earth could have driven a citric-acid cycle in reverse, spawning the building blocks of life while relaxing the 'energy pressure' of the environment. Eventually these processes would have become encapsulated in cells, making the 'energy-conducting' flows more efficient.

Life, agrees Russell, is "a chemical system that drains and dissipates chemical energy." He has used similar ideas to argue that "life would emerge using the same pathways on any sunny, wet rocky planet" [2,3]. But he believes that the most likely place for it to occur was at miniature subsea volcanoes called hydrothermal vents, where the ingredients and conditions are just right for energy-harnessing chemical machinery to develop [4].

The biochemical processes of living organisms are highly organized. Scientists have long puzzled over how these 'ordered' systems can come spontaneously into being, when the Second Law of Thermodynamics suggests that the universe as a whole tends to generate increasing disorder.

The answer, broadly speaking, is that local clumps of order come at the expense of increasing the disorder in their environment. But Morowitz and Smith suggest a rationale for why such concentrations of order should happen in the first place. They draw on the idea, proposed in the 1980s by Rod Swenson of the University of Connecticut, that ordered states are much better 'lightning conductors' for discharging excess energy.

Thus, they say, despite several major extinctions throughout geological time, when most of life on Earth was obliterated, life itself was never in danger of disappearing – because an Earth with life is always more stable than one without. They call this 'condensation' of life from the energy-rich environment a "collapse to life", which in their view is as inevitable as the appearance of snowflakes in cold, moist air.

Friday, November 10, 2006

No offence?

Well well, I hadn’t anticipated that I was lighting a fuse with my FT article on economics. There has been more follow-up in the FT itself – Paul Ormerod wrote a very nice Comment which was partly something of a response to some of the letters claiming that economics ain’t like that any more. Paul’s point is that yes, perhaps academic economics has moved on in many ways (I should have been more explicit about that myself), but the stuff that students are taught is still very much rooted in the old tradition. And these are people who graduate and then presumably go into business and politics believing that that is what economics is about – which is precisely my concern. This squares with what Robert Hunter Wade, a professor at LSE, says in his letter to the FT about how the simplistic picture of market efficiency is what tends to filter down to policy makers. All this leaves me thinking that it’s precisely for this reason that the simple picture of rational maximizers, equilibrium and market efficiency is perhaps a rather dangerous place to start from – sure, academic economists often (even generally) then move beyond it, but not everyone who draws on economic theory has learnt it beyond graduate level.

Much of the discussion prompted by my article has taken place in the blogs, however. Some of those I’ve spotted are here and here and here and here. A lot of the debate seems to focus on how stupid and misinformed my article was (although I can’t help thinking that there wouldn’t be quite so much discussion if it was that easy). I decided to take up the challenge on Dave Altig’s blog, which has been an instructive experience. At first I was taken aback by the aggression of the discourse, which was something I’ve just not experienced in the natural sciences. I don’t know if this is something specific to the economics world, or to the blogosphere generally, but it was not a pleasant discovery. However, I’m very grateful that Dave Altig has made some very gracious and polite comments that have cooled the tone and facilitated a far more constructive exchange. I was at fault here too, taking initially a more gung-ho tone than I needed to. (I think I was probably riled by some comments I received separately from an assistant professor at the University of Pennsylvania, which had a character I’d not experienced since the school playground.) It seems also that my FT piece was misread by some as being more insulting to economists than I’d intended – if that’s the impression I gave, then I regret it. I do think one sometimes needs to be provocative in order to spark a discussion, but I’d hoped to do that without seeming to jeer or ridicule.

I can’t possibly summarize all the blogging discussion; it’s there if you’re interested. But the discussion on Dave Altig’s site has been very useful for me, helping me to sharpen what it is I want to say while pointing to some issues that I need to go away and consider. His post of 9 November gives particularly valuable food for thought; thank you Dave.

Tuesday, November 07, 2006

When you can't do it all with mirrors

[This is the unedited text of my recent article for muse@nature.com.]

A new proposal and costing for a technofix to global warming shows that there are probably better ways to spend the money

The leading economist Nicholas Stern has just handed us, in advance, the bill for the impacts of climate change: close to $4 trillion by the end of this century [1].

And with perfect timing, astronomer Roger Angel of the University of Arizona has delivered the equivalent of a builder's estimate for patching up the problem using a cosmic sunshade [2]. It will set us back by… well, let's make it a nice round figure of $4 trillion by the end of the century.

Both figures can be criticized – after all, when costs add up to a significant fraction of global GDP, no one can expect them to be very accurate. But this happy conflux of estimates puts some perspective on the hope that global warming can be addressed with high-tech mega-engineering projects.

From a pragmatic point of view, the sunshade solution looks like a bad bargain. If a builder told you that the cost of fixing a problem with your roof was likely to be about the same as the cost of not fixing it, except that the fix was untested and might not work at all, and in any event you know the work is likely to run over budget and probably over schedule – well, what would you do?

One could argue, however, that in this case the 'problem' involves the potential suffering of millions of people, who could be killed by disease or flood or drought, displaced from their homes, or caught up in conflict as a result of climate change – in which case you might conclude that investing in a risky technofix can be justified on humanitarian grounds.

But Stern's report, commissioned by the UK government and hailed by many other economists as the most definitive study of its sort to date, doesn't just tot up the costs of inaction over climate change. It makes some estimate of the likely costs of tackling it using existing approaches and technologies – and the answer looks cheaper and a whole lot more attainable than Angel's sunshade.

That doesn't mean Angel's proposal is without value. On the contrary, it performs the service of showing just what would be involved in pursuing one of the favourite ideas of those who believe technofixes could save us from rising world temperatures.

A space shade that reduces the amount of sunlight reaching the Earth has been debated for decades. Many of these schemes invoke a screen that would be unfolded or assembled in space, like a gigantic sail. But as James Early of the Lawrence Livermore National Laboratory in California pointed out in 1989 [3], a sail is precisely what it would be: radiation pressure would push against the sunshade, and it would therefore need to be kept actively in position.

Angel has found inventive ways of coping with all the challenges while keeping costs down. To minimize radiation pressure, the screen would deflect sunlight through only a small angle, just enough to miss the Earth. To keep it in line between the Earth and Sun, it would be placed at the so-called Lagrange point L1, a spot 1.5 million km from Earth towards the Sun that orbits the Sun with the same one-year period as our planet.
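As a rough check on that distance (my own back-of-envelope sketch, not a figure from Angel's paper), the position of L1 follows from a standard approximation for a small body orbiting a much larger one:

```python
# Back-of-envelope estimate of the Sun-Earth L1 distance, using the standard
# approximation r ≈ R * (m / 3M)^(1/3), valid because the Earth's mass m is
# tiny compared with the Sun's mass M. These constants are textbook values,
# not numbers taken from Angel's paper.
R = 1.496e8          # mean Earth-Sun distance in km (1 AU)
m_over_M = 3.003e-6  # ratio of Earth's mass to the Sun's mass
r = R * (m_over_M / 3.0) ** (1.0 / 3.0)
print(f"L1 lies roughly {r / 1e6:.2f} million km sunward of Earth")
```

which reproduces the 1.5-million-km figure quoted above.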

The size of the screen would be mind-boggling: about 4-6 million square km, around half the area of China. But to avoid complicated space-assembly problems, and to simplify the launching and increase the screen's versatility, Angel proposes that it should consist of a vast swarm of 1-m disks, made from lightweight, microscopically perforated and laminated films of ceramics. Each of these 'flyers' is manoeuvrable thanks to tiny solar sails placed on tabs at the rim, powered by solar cells.
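To get a feel for what that area implies (my own illustrative arithmetic, not figures from the proposal itself), one can estimate how many 1-m disks such a screen would need:

```python
import math

# Illustrative estimate only: how many 1-m flyers would a screen of
# 4-6 million square km require? The mid-range area below is my assumption.
screen_area_km2 = 4.5e6                  # mid-range of the quoted 4-6 million km^2
disk_area_m2 = math.pi * 0.5 ** 2        # one 1-m-diameter disk, ~0.785 m^2
n_flyers = screen_area_km2 * 1e6 / disk_area_m2
print(f"roughly {n_flyers:.1e} flyers")  # on the order of trillions of disks
```

In other words, the swarm would number in the trillions of disks.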

As usual, science fiction got there first. In a short story by Brenda Cooper and Larry Niven published in 2001, an alien species wipes out another by deploying a fleet of tiny mirrors around their planet, plunging it into an ice age [4] – a reminder, perhaps, that we'd better not overdo the shadowing.

Angel's flyers would be launched in stacks, like piles of Brobdingnagian dinner plates, packaged into canisters and fired into space from electromagnetic guns more than a kilometre long. Twenty such cannons would fire 1-ton payloads every five minutes for ten years. Once in space, the flyers would make their way to the Lagrange point using fuel-efficient ion thrusters, and there spread out into a cloud as wide as the Earth and 100,000 km long.
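Those launch figures imply a staggering total mass; here is a quick sanity check (my arithmetic, based only on the numbers quoted above):

```python
# Total mass launched by 20 electromagnetic guns, each firing a 1-ton payload
# every five minutes, round the clock, for ten years (figures as quoted above).
guns = 20
tons_per_shot = 1
shots_per_hour = 60 / 5          # one shot every five minutes
hours = 24 * 365 * 10            # ten years of continuous firing
total_tons = guns * tons_per_shot * shots_per_hour * hours
print(f"total mass launched ≈ {total_tons / 1e6:.0f} million tons")
```

That comes to roughly 20 million tons of hardware placed in space.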

And the bill, please? Estimating the costs of materials and launch facility, launch energy, and manufacturing, Angel says it could be done for less than $5 trillion.

All this sounds a long way from the sober accounting of the Stern report. But if you take the report seriously – and as a former chief economist of the World Bank, Stern apparently has the right credentials, although his conclusions have proved predictably controversial – it is similarly mind-boggling.

For example, Stern says that the impacts of climate change could end up costing the world up to 20% of its annual GDP. He compares the effect to that of the world wars or the Great Depression. The "radical change in the physical geography of the world" that climate change would produce, he says, "must lead to major changes in the human geography – where people live and how they live their lives".

Mitigating this potential crisis would require equally drastic measures. Stern does not consider technofixes like the space sunshade, but dwells instead on the far less sexy measure of reducing greenhouse-gas emissions. Gordon Brown, the UK's Chancellor of the Exchequer, who commissioned the report, has called for cuts of 30% by 2020 and 60% by 2050.

Stern's solutions involve energy-saving and improvements in energy efficiency, stopping deforestation, and switching to non-fossil-fuel energy sources. That will work only if the effort is international, he says (which is one reason why sceptics have scoffed), and it will incur a substantial cost: 1% of global GDP over the next 50 years, an amount that Stern calls "significant but manageable", and which squares with some previous estimates.

Whether the targets can be reached by putting solar cells on roofs, turning out lights, banning SUVs and building wind farms, or whether this will require more substantial measures such as new nuclear power stations, extensive carbon capture and sequestration, and fierce taxation of air travel, is a question that environmentalists, industrialists and politicians will continue to debate, no doubt as dogmatically as ever.

But as well as sketching an essay in ingenuity, Angel has done us the great favour of showing that there is probably never going to be the option of conducting business as usual under the shelter of a gigantic technofix.

Friday, November 03, 2006

I have drawn some inevitable flak for my criticisms of economic theory in the Financial Times. That’s no more than I expected. Here’s the article; the comments and my responses follow.

Baroque fantasies of a peculiar science

Published in the Financial Times, October 29 2006

It is easy to mock economic theory. Any fool can see that the world of neoclassical economics, which dominates the academic field today, is a gross caricature in which every trader or company acts in the same self-interested way – rational, cool, omniscient. The theory has not foreseen a single stock market crash and has evidently failed to make the world any fairer or more pleasant.

The usual defence is that you have to start somewhere. But mainstream economists no longer consider their core theory to be a “start”. The tenets are so firmly embedded that economists who think it is time to move beyond them are cold-shouldered. It is a rigid dogma. To challenge these ideas is to invite blank stares of incomprehension – you might as well be telling a physicist that gravity does not exist.

That is disturbing because these things matter. Neoclassical idiocies persuaded many economists that market forces would create a robust post-Soviet economy in Russia (corrupt gangster economies do not exist in neoclassical theory). Neoclassical ideas favouring unfettered market forces may determine whether Britain adopts the euro, how we run our schools, hospitals and welfare system. If mainstream economic theory is fundamentally flawed, we are no better than doctors diagnosing with astrology.

Neoclassical economics asserts two things. First, in a free market, competition establishes a price equilibrium that is perfectly efficient: demand equals supply and no resources are squandered. Second, in equilibrium no one can be made better off without making someone else worse off.

The conclusions are a snug fit with rightwing convictions. So it is tempting to infer that the dominance of neoclassical theory has political origins. But while it has justified many rightwing policies, the truth goes deeper. Economics arose in the 18th century in a climate of Newtonian mechanistic science, with its belief in forces in balance. And the foundations of neoclassical theory were laid when scientists were exploring the notion of thermodynamic equilibrium. Economics borrowed wrong ideas from physics, and is now reluctant to give them up.

This error does not make neoclassical economic theory simple. Far from it. It is one of the most mathematically complicated subjects among the “sciences”, as difficult as quantum physics. That is part of the problem: it is such an elaborate contrivance that there is too much at stake to abandon it.

It is almost impossible to talk about economics today without endorsing its myths. Take the business cycle: there is no business cycle in any meaningful sense. In every other scientific discipline, a cycle is something that repeats periodically. Yet there is no absolute evidence for periodicity in economic fluctuations. Prices sometimes rise and sometimes fall. That is not a cycle; it is noise. Yet talk of cycles has led economists to hallucinate all kinds of fictitious oscillations in economic markets. Meanwhile, the Nobel-winning neoclassical theory of the so-called business cycle “explains” it by blaming events outside the market. This salvages the precious idea of equilibrium, and thus of market efficiency. Analysts talk of market “corrections”, as though there is some ideal state that it is trying to attain. But in reality the market is intrinsically prone to leap and lurch.

One can go through economic theory systematically demolishing all the cherished principles that students learn: the Phillips curve relating unemployment and inflation, the efficient market hypothesis, even the classic X-shaped intersections of supply and demand curves. Paul Ormerod, author of The Death of Economics, argues that one of the most limiting assumptions of neoclassical theory is that agent behaviour is fixed: people in markets pursue a single goal regardless of what others do. The only way one person can influence another’s choices is via the indirect effect of trading on prices. Yet it is abundantly clear that herding – irrational, copycat buying and selling – provokes market fluctuations.

There are ways of dealing with the variety and irrationality of real agents in economic theory. But not in mainstream economics journals, because the models defy neoclassical assumptions.

There is no other “science” in such a peculiar state. A demonstrably false conceptual core is sustained by inertia alone. This core, “the Citadel”, remains impregnable while its adherents fashion an increasingly baroque fantasy. As Alan Kirman, a progressive economist, said: “No amount of attention to the walls will prevent the Citadel from being empty.”

So there you have it. Now the critics, published in the 1 November FT and online:

Letter 1: Did this sceptic ever take a course in the one science that calls itself dismal?

Sir, Philip Ball ("Baroque fantasies of a most peculiar science", October 30) quarrels with what he calls neoclassical economics. Perhaps his scarce argument may be better allocated against a competing end: might he notice that physics attempts to describe, explain and predict the action of matter in space, motion and time?

Economic theory establishes a baseline description of human behaviour, while always positing that when humans act, considerable complexity results. Perhaps Mr Ball never took a second course in the only science that, for its challenges, calls itself dismal.

Chris Robling, Chicago, IL 60602, US

Do you understand this? I don’t. Yes, that’s what physics does. And your point is? ‘Economic theory establishes a baseline’: well yes, except that it doesn’t, because it manifestly doesn’t describe the way people act even to first order. But the real criticism is that neoclassical economics isn’t consistent even on its own terms – if you swallow its assumptions, the conclusions don’t follow. Steve Keen’s book Debunking Economics shows why.

“A second course”? Is this some kind of American euphemism? Sorry, too strange.

By the way, I suspect most people use the phrase ‘the dismal science’ without knowing what Carlyle was implying (or even that it was Carlyle who implied it). Look it up – it’s interesting. He considered economics dismal not because it was shoddy, but because it dealt with unpalatable truths about human nature. The article in which he used the phrase was, after all, about “the nigger question”.

Sir, Philip Ball says it is easy to mock economic theory ("Baroque fantasies of a most peculiar science", October 30). It is even easier to mock a caricature of economics, which is what he does, resorting to the tired cliché that we economists think we are doing mechanical physics. Once true, perhaps, but certainly not recently.

Like so many critics of economics, he paints an unrecognisable portrait of the subject. Economists do indeed use models that assume perfect competition and identical agents with unchanging behaviour, but only when it is useful to do so. At other times, we make other assumptions, including those used in the kind of models Mr Ball wrote about in his interesting book Critical Mass.

Economics is distinctive in using the concept of equilibrium - a state in which no individual consumer or business has an incentive to change behaviour - as a powerful analytical tool. It is so useful that evolutionary biologists, for example, use it all the time too.

I do of course have criticisms of my own subject. In particular, the typical undergraduate syllabus lags far behind all the remarkable developments of the past decade or two, such as information economics and behavioural economics. But the baroque citadel is Mr Ball's own fantasy; we economists moved out of it long ago, as a proper look at the mainstream journals (or a list of the Nobel winners) will show.

Diane Coyle, Enlightenment Economics, London W13 8PE, UK

Paul Ormerod tells me I would actually get on well with Diane. I think he’s right; I’ll probably get on with anyone who plugs my book. But I think I mostly agree with Diane anyway, except that I do wonder whether ‘we economists’ refers to a more select bunch than she appreciates. It is precisely those economists I mentioned in Critical Mass who are typically marginalized by the mainstream. I used ‘neoclassical’ so much in my article that I was worried by the repetition, precisely to make it clear that that is what I was criticizing, not the interesting ideas that get put forward outside of it. I understand that agent-based modelers have become so fed up with being excluded because their models violate neoclassical dogma that they have been forced to start their own journal. The ‘citadel’ is not my term, nor my fantasy – it is the expression used by Alan Kirman, one of the pioneers of economic agent-based approaches. Ask Paul Ormerod. Ask Paul Krugman, for that matter. If they all feel this way, surely there’s a reason?

Sir, I want to congratulate Philip Ball for his insightful and long-overdue comment about the domination of the economic profession by frustrated mathematicians and physics lecturers (October 30). He is incorrect in one regard, however, when he comments that "there is no other 'science' in such a peculiar state".

In fact, the worlds of finance and risk management have embraced the same nonsensical application of quantitative methods that he describes so well operating in the world of economics. This "derivative" view of credit risk and other important issues of global capital finance has badly damaged the ability of investors to perceive risk and gives managers an unreasonable view of the risks that they do accept. Witness the latest fiasco involving the hedge fund formerly called Amaranth. And there will be more examples very shortly.

Well, precisely. I talk briefly about derivatives and risk management in Critical Mass, simply to say that you’re not going to do very well forecasting risk if you insist on thinking that market noise is Gaussian.
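To make that concrete, here is a back-of-envelope comparison (my own illustration, not from the book): the chance of a five-standard-deviation move under a Gaussian versus under a heavy-tailed Student-t distribution, which is closer in spirit to real return data. The t samples are generated with the standard library only and rescaled to unit variance so the comparison is fair:

```python
import math
import random

random.seed(0)

def gaussian_tail(x):
    # P(|Z| > x) for a standard normal variable
    return math.erfc(x / math.sqrt(2))

def student_t(nu):
    # Student-t draw via Z / sqrt(chi2_nu / nu), stdlib only
    z = random.gauss(0.0, 1.0)
    chi2 = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(nu))
    return z / math.sqrt(chi2 / nu)

nu = 3                            # heavy tails; variance is nu/(nu-2) = 3
scale = math.sqrt(nu / (nu - 2))  # rescale to unit variance for fairness
n = 200_000
big_moves = sum(abs(student_t(nu) / scale) > 5 for _ in range(n))

print(f"Gaussian  P(|move| > 5 sd): {gaussian_tail(5):.1e}")
print(f"Student-t P(|move| > 5 sd): {big_moves / n:.1e}")
```

Under the Gaussian a five-sigma move is vanishingly rare; under even this modest fat-tailed alternative it is thousands of times more likely in a run like this (exact figures vary with the seed). Build a risk model on the first assumption and the "impossible" will keep happening.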

Letter 4: Economists are busy dealing with the impact of 'real agents' in the economy

Sir, Contrary to what Philip Ball believes, many economists are already busy "dealing with the variety and irrationality of real agents" ("Baroque fantasies of a most peculiar science", October 30). These economists include several Nobel prize-winners: Herbert A. Simon, Daniel Kahneman, Vernon L. Smith and Thomas C. Schelling. In fact, the Nobel prize this year was awarded to Edmund Phelps for challenging the Phillips curve trade-offs by "taking into account problems of information in the economy".

Mocking economic theory is easy but doing so by perpetuating "rigid dogma" about economics and economists is pure hearsay. A survey of recent literature in mainstream economic journals or textbooks should enlighten this misconceived view.

Chee Kian Leong, 639798 Singapore

OK, so it’s basically the same point as Diane Coyle’s. But the question of the Nobels is curious. (Needless to say, Simon and Schelling loom large in Critical Mass.) I’ve talked with others about the strange fact that economics Nobels often (though by no means always) go to contributions that lie outside the mainstream, and thus outside of neoclassical dogma. (One could add Stiglitz and Sen to the list, for example.) This speaks of impeccable taste (or nearly so) on the part of the Nobel committee. But it is puzzling – nothing like it happens in the other ‘sciences’.

The bottom line is: do you believe in neoclassical general equilibrium theory, with its efficient market hypothesis, its exogenous shocks, its aggregate price curves and all the rest? If not, do you think it is right that this is what students learn and come to believe about the way the economy works? And that papers which question the theory’s fundamental principles should be excluded from much of the literature?