Over at the excellent Global Dashboard, Alex Evans has a post reflecting on the things he and David Steven called wrong (and less wrong), looking through their development and poverty lens, in the aftermath of the crisis. In a similar spirit, my sometime colleague Ian Christie sent me ‘Ten notes on the crisis’, representing his take on what we’d learnt about economics and politics since 2008. I thought they deserved a wider audience. And so, with his permission, I’m republishing his Ten Notes here. They start below the fold.

Foresight’s report on the future of food and farming covers the ground, but seems to be blind to disruptive change

The recent UK Government Foresight report on the future of the global food and farming system can’t be faulted for a lack of ambition. It takes on the whole of the global food system, and looks out to 2050. Much of what it says is valuable (and the supporting papers look to be a useful research resource), and this is to be expected, given the calibre of the advisers the project was able to draw on. But there are some telling gaps, and these largely come from a lack of decent futures work in the report. Some of these gaps are highlighted by other work published recently on food and its wider impacts. This will be a long post, so I’m going to split it into two parts, the first about the Foresight report, the second on its limits.

Recent economics models of the world in 2050 are projections of the past, not the future. They are not designed to deal with constraints.

Suddenly, with the turn of the decade and the latest World Economic Forum opening, we’re awash with projections of how the world economy might look in 2050, produced by economists, and each as implausible as the next. Two, from HSBC and PwC (both open in PDF), offer similar (and improbable) views of a hugely expanded global economy to 2050. But they also seem to assume that the future will be a lot like the past.
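The underlying objection can be made concrete with a toy calculation. A minimal sketch, not drawn from either report: the growth rate, the starting size of the economy, and the resource ‘ceiling’ below are all illustrative assumptions, but they show how a pure trend extrapolation diverges from the same trend once any constraint is allowed to bite.

```python
import math

def extrapolated(gdp0, rate, years):
    """Compound growth: the past projected forward unchanged."""
    return gdp0 * (1 + rate) ** years

def constrained(gdp0, rate, years, ceiling):
    """Logistic growth: the same underlying trend, but bent by a
    (hypothetical) resource ceiling that extrapolation ignores."""
    return ceiling / (1 + (ceiling / gdp0 - 1) * math.exp(-rate * years))

# A steady 3% trend more than triples a $60tn economy over 40 years
# if nothing constrains it...
print(round(extrapolated(60, 0.03, 40)))          # → 196 ($tn)
# ...but flattens well short of that against an assumed $150tn ceiling.
print(round(constrained(60, 0.03, 40, 150)))      # → 103 ($tn)
```

The point of the sketch is not the numbers, which are invented, but the shape: an economist who fits the first function to historical data will never see the second curve coming.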

Business risk is usually an exercise in corporate power. But your own business assumptions are about to become a risk in their own right.

A few months ago, I wrote about risk – suggesting that the ‘discipline’ of risk management tended to focus on risks (a) which were understood and (b) for which probabilities could be estimated, and that this led to far too narrow a view of risk. In particular, this meant that companies were usually very poor at assessing their blind spots. This second post has taken longer to write than I expected, but in this one I’m going to look at the factors which mean that companies tend to remain blinded by their blind spots.

BP’s oil disaster is partly the result of approaches to risk which lead us to believe we know more than we do.

It’s hard to know where to start with the BP oil disaster. Commentary has been gushing out almost as quickly as oil. We know the scale of the pollution, and have read ecologists who say the conditions are unlike any others seen on earth, certainly in the Anthropocene period. Dark humour is one response; angry satire is another (The Onion: ‘Massive Flow Of Bullshit Continues To Gush From BP Headquarters’). We have a good idea that BP’s conduct over the drilling was somewhere between careless and reckless (and that sooner or later a court is likely to decide on which), and that regulatory agencies were compliant or ineffective. One area that seems to deserve more thought – especially from a futures perspective – is the way in which essentially man-made disasters such as this are to a significant extent produced by a limited set of ideas about risk, both the way it gets assessed and the way it is managed.

The row over whether the International Energy Agency has or has not nurdled its oil data to (a) prevent financial market panic or (b) appease the Americans or (c) neither of the above, is interesting but a bit of a sideshow. What’s more interesting is how fast the notion that ‘peak oil’ is imminent has moved from being a contested minority view to being mainstream.

Every so often you notice a story that flashes a hazard light over the prevailing wisdom about the way the world works. Thus it was last week, reading Jack Schofield’s Guardian article about the limits of Moore’s ‘law’, the long-standing projection by Intel co-founder Gordon Moore that the number of transistors that can be placed on an integrated circuit would double every two years (effectively doubling performance against price in that time). It’s been suggested before that the microprocessor manufacturing process will reach physical limits. Schofield’s piece was on the increasing investment cost of processor manufacturing plants. One of the consequences is likely to be the end of the exponential growth in computing performance which has underpinned the speed and scale of the information and communications technology revolution.
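It’s worth pausing on just how steep that doubling curve is, because it explains why its end matters. A minimal sketch of the projection: the 1971 baseline (Intel’s 4004, roughly 2,300 transistors) is real, but everything after that is pure extrapolation of the two-year doubling rule, which is exactly the assumption that rising fab costs and physical limits put at risk.

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Project a transistor count assuming Moore's doubling rule.

    base_count is the Intel 4004 (1971); later values are simple
    extrapolation, not actual chip data.
    """
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Forty years of doubling is twenty doublings: a factor of 2**20,
# a little over a million.
print(round(transistors(2011) / transistors(1971)))  # → 1048576
```

A million-fold improvement in forty years is the engine Schofield’s piece describes stalling; no other industrial process has anything like that curve to lose.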

The decision by the British Home Secretary Jacqui Smith to reclassify cannabis as a Class B drug, despite the opposing views of her expert advisers, has reminded me of the chaotic state of Britain’s drugs policy. It is an area where policy has remained completely immune to evidence – as one ‘killer chart’ demonstrates.

One of the observations of last year’s State of the Future report (which I blogged about here) was that organised crime was one of the three biggest threats to global security and prosperity. Misha Glenny’s new book McMafia (‘a journey through the global criminal underworld’) comes to a similar conclusion – arguing that organised crime is a bigger threat than terrorism.

One of the things you learn working as a journalist is that most news is predictable – a point satirised by Michael Frayn in his outstanding novel The Tin Men in the 1960s. But sometimes headlines do still surprise you. One such was the news that British car exports had reached record levels last year.