There have been a lot of predictions already for 2015. The Chartered Institute for Personnel and Development predicts UK employment increasing by half a million, GDP growth of 2.4% and earnings growth of between 1% and 2%, ie keeping its predictions reasonably close to what happened in 2014 and/or to what the OBR, OECD and IMF are predicting. The annual FT economists’ survey produced an average view amongst its 90 economists that GDP growth would increase from 2.4% pa to 2.5% pa around election time and then on to 2.6% pa soon after. I could go on, but I think you get the general idea – small changes around current economic statistics, with a remarkable level of agreement amongst the experts. It’s enough to make you want to populate your models with these predictions, which is of course the general idea.

Once again, there was a trend of projecting a price rather similar to the current one and a remarkable level of agreement amongst the experts. Here is what has actually happened subsequently:

Now I don’t want to pick on these forecasters in particular; after all, the futures prices indicated that these views were the overwhelming consensus. But the oil price is a fundamental input to most economic models – Gavyn Davies details how the latest fall in oil prices has changed his economic forecasts here – with implications for inflation and GDP growth, and it is itself dependent on many other areas of the political economy of the world which affect supply (eg OPEC activity, war in oil-producing regions) and demand (eg global economic activity). So an ability to see a big move in oil prices coming would seem to be a clear prerequisite for making accurate economic forecasts. It seems equally clear that that ability does not exist.

Economic forecasts generally tell us that things are not going to change very much, which is fine as long as things are not changing very much but catastrophic over the short periods when they are. Despite the sensitivity testing that goes on in the background, most economic and business decisions are taken on the basis that things are not going to change very much. This puts most business leaders in the individualist camp described here, ie a philosophical position which encourages risk taking. Indeed even if some of the people advising business leaders are in the hierarchist camp, ie believing that the world is not predictable but manageable, to anyone with little mathematical education this is indistinguishable from an individualist position.

The early shots of the election campaign have so far been dominated by the Conservative Party branding Labour’s spending plans (which, to the extent they are known, appear to involve quite severe fiscal tightening, although not as severe as the Conservatives’ own) as likely to cause “chaos”, while the Labour Party wants to wrap itself in the supposed respectability of OBR endorsement of its economic policies. Neither of them has a plan for another economic crisis, which concerns me.

What we desperately need are policies aimed at reducing our vulnerability to the sudden movements in economic variables which we never see coming. We should stop trying to predict them, because we can’t. We should stop employing our brightest and best in positions which implicitly endorse the assumption that things won’t change very much, because they will.

What sort of an economy would it have to be for us not to care about the oil price? That’s what we need to start thinking about.

I have been reading Ha-Joon Chang’s excellent book Economics: The User’s Guide after listening to him summarising its thrust at this year’s Hay-on-Wye Festival of Literature and the Arts. It is very disarming to meet an economist who immediately tells you never to trust an economist, and I will probably return to his thoughts on the limitations of expert judgement in a future article.

But today I want to focus on his summary of the major schools of thought in economics, and what the implications might be for actuaries. Chang’s approach is that he does not completely subscribe to any particular school but does not reject any either. He bemoans what he sees as the total domination of all economic discussion currently (and therefore also all political discussion about running the economy) by neoclassical economists. I think actuarial discussion may suffer from a similar problem.

So what is neoclassical economics? It has become almost invisible to us through its omnipresence, in the way fish don’t see the water they swim in, but its assumptions may surprise you. It assumes that all economic decisions are made at an individual level, with each individual seeking to maximise what is known as their utility (ie the things and experiences they value). The idea is that we self-interested individuals will collectively make decisions which, within the competitive markets we have set up, result in a socially better outcome than trying to plan everything. In Chang’s view this approach has embodied a very conservative outlook (ie one interested in preserving the status quo) ever since it was further developed in the early 20th century to include the Pareto principle, which says that no change in economic organisation should take place unless no one is made worse off. This limits the scope for redistribution within a society, which can lead to the levels of inequality we now see in parts of the developed world – levels that many, Thomas Piketty included, are becoming increasingly concerned about.

Arguments between neoclassical economists, in Chang’s view, tend to be restricted to how well the market actually works. The market failure argument says that governments have a role to play in using taxes and regulation to curb negative externalities, or in funding particular things like research to encourage positive ones, mitigating the impacts of markets in areas where market prices do not fully reflect the social cost of particular activities (eg pollution of the environment). Another criticism of neoclassical economics is that it does not properly allow for the fact that buyers and sellers in many markets do not have the same level of information available to them, so the price struck is often not the one which would lead to the best outcome for society as a whole. So the more “left wing” neoclassicalism requires more market regulation to protect consumers and the environment they live in.

The more “right wing” neoclassical response to this is that people actually do know what they are doing, and even build in the likelihood that they are being conned due to asymmetric information in the decisions they make. The government should therefore reduce regulation and generally get out of the way of wealth-creating business. This form of neoclassicalism views the risk of government failure as much greater than that of market failure, ie even if we have market failure, the costs of government mistakes will inevitably be much greater.

And if you draw a line between those two forms of neoclassicalism, somewhere along that line you will find all of the main UK political parties and pretty much all economic discussion within the financial services industry.

And, on the whole, it tends to circumscribe the role that actuaries play in the UK.

One of the major drawbacks of neoclassical theory is that it assumes risks can be fully quantified if only we have a comprehensive enough model. Actuaries are predominantly hierarchists, who believe that they can manage the inequalities which flow from neoclassical theory via collectivist approaches, like insurance policies and pension schemes, and protect individuals and indeed whole financial systems from risk. Since Nassim Nicholas Taleb and others made so much money in 2008 from realising that this was not the case, this has probably been neoclassicalism’s most obvious flaw, and the one which has given rise to the most discussion (although possibly not so much change to practice) amongst actuaries.

But there are others. Neoclassicalism assumes that individuals are selfish and rational, both of which have been persuasively called into question by the work of Kahneman and others, who have shown that we are only rational within bounds and make most of our decisions through “heuristics” or rules of thumb. Actuaries have tried to reflect these views, some of which were originally developed by Herbert Simon in the 40s and 50s, particularly in the way that information is communicated (eg the recent publication from the Defined Ambition working group), but have very much stayed at the microeconomic level (very much, according to Chang, like much of the Behaviouralist School themselves) rather than exploring the implications of this theory at a macroeconomic level.

Neoclassical theory is also much more focused on consumption than production, with its endless focus on markets of consumers. One alternative approach is that proposed by the Neo-Schumpeterian School, which rightly points out that, in many markets, technological innovation is considerably more important than price competition for economic development. The life-cycle of the iPhone, from innovation to temporary market monopoly to the creation of a totally new market in Android phones, is a case in point. Actuaries have done relatively little work with technology firms.

Another school of economic thought which is much more focused on production is the Developmentalist Tradition, which believes governments can improve outcomes considerably by intervening in how economies operate: from promoting industries which are particularly well-linked to other industries, to protecting industries which develop the productive capability of the economy, particularly infant industries which might otherwise be smothered at birth by the more established players in the market. This tradition clearly believes that the risk of government failure is less than the potential benefits of intervention. The failure of productivity to pick up in the UK since 2008 has been described as a “puzzle” by the Bank of England and other financial commentators. Perhaps some clues might lie outside a neoclassical viewpoint.

The Institutionalists have looked at market transaction costs themselves, pointing out that these extend way beyond the costs of production, and could theoretically encompass all the costs of running the economic system within which the transactions take place, from the courts to the police to the educational and political institutions. They have suggested that this may be why so much economic activity does not take place in markets at all, but within firms. I think actuaries have started to engage with failures in pricing mechanisms recently, particularly where these have environmental consequences such as in the case of carbon pollution and the implications for the long term valuations of fossil fuel reserves on stock markets.

The Keynesians I have written about before. They are probably the most opposed to the current austerity policies, pointing out how, if a whole economy stops spending and starts saving when in debt, as an individual would, the economy will stay in recession longer and recovery (and therefore the possibility of significant deficit reduction) will be slower. The coalition government in the UK have neatly proved this point since 2010.

I could go on, about the Classical or Marxist Schools which have been largely discredited by historical developments over the last 200 years, but which still have useful analysis of aspects of economics, or the spontaneous order of the markets believed in by the Austrian School. However my point is that I think Chang is right to highlight that there is a wider range of economic ideas out there. Actuaries need to engage with them all.

I have been thinking about the turnover of restaurants in Birmingham recently. There have been a number of new launches in the city in the last year, from Adam’s, with Michelin-starred Adam Stokes, to Café Opus at Ikon to Le Truc, each replacing a struggling previous venture.

Nassim Nicholas Taleb makes the case, in his book Antifragile, for the antifragility of restaurants. As he says: “Restaurants are fragile, they compete with each other, but the collective of local restaurants is antifragile for that very reason. Had restaurants been individually robust, hence immortal, the overall business would be either stagnant or weak, and would deliver nothing better than cafeteria food – and I mean Soviet-style cafeteria food. Further, it would be marred with systemic shortages, with, once in a while, a complete crisis and government bailout. All that quality, stability, and reliability are owed to the fragility of the restaurant itself.”

I wondered if this argument could be extended to terrorism, in an equally Talebian sense.

But first, three false premises:

1. Terrorist attack frequency follows a power law distribution.

Following on from my previous post, I thought I had found another power law distribution in Nate Silver’s book The Signal and the Noise. He sets out a graph of terrorist attack frequencies by death toll, sourced from the Global Terrorism Database for NATO countries from 1979 to 2009. I thought I would check this, and downloaded an enormous 45MB Excel file from the National Consortium for the Study of Terrorism and Responses to Terrorism (START). I decided to use the entire database (ie from 1970 to 2011), with the proviso that I would only use attacks leading to at least 5 deaths to keep it manageable (as Nate Silver had done). The START definition of terrorism covers only attacks committed by NGOs, and it also has a strange way of numbering attacks which, for instance, counts 9-11 as four separate attacks (I adjusted for this). I then plotted the data using a logarithmic scale on each axis; the result is shown below. It is not even straightish: it has a definite downward curve, and something else entirely happens when deaths get above 500, so it is probably not a power law distribution.

In my view it certainly doesn’t support Nate’s contention of a power law distribution at the top end. On the contrary, it suggests that we can expect something worse, ie more frequent attacks with high casualties, than a power law would predict.
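The straight-line test described above can be sketched in a few lines of Python. This is a minimal sketch using idealised synthetic counts rather than the actual START download (so no real field or file names from that database appear here): a genuine power law plots as a straight line on log-log axes, and curvature like that found in the real data would show up as systematic residuals from a linear fit.

```python
import numpy as np

# Idealised power-law data: frequency ~ deaths^(-alpha) with alpha = 2.
# (Synthetic stand-in for the real attack counts, purely for illustration.)
deaths = np.array([5.0, 10.0, 20.0, 50.0, 100.0, 200.0, 500.0])
counts = 10000.0 * deaths ** -2.0

# Take logs of both axes, as in the chart described in the text.
log_x = np.log10(deaths)
log_y = np.log10(counts)

# Fit a straight line in log-log space. For a true power law the slope
# recovers -alpha and the residuals are essentially zero; a downward
# curve in real data would instead leave systematic residuals.
slope, intercept = np.polyfit(log_x, log_y, 1)
residuals = log_y - (slope * log_x + intercept)

print(round(slope, 2))             # recovers the exponent: -2.0
print(bool(max(abs(residuals)) < 1e-9))  # True: a perfectly straight line
```

Running the same fit on the real frequencies and inspecting the residuals is one simple way to see whether the "straightish" impression from a chart survives closer scrutiny.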

So what possible link could there be between terrorism and the demise of the Ikon café (there may be other restaurants where the food served met one of the other definitions of terrorism used by the Global Terrorism Database, ie intending to induce fear in an audience beyond the immediate victims, but not the Ikon)? Well, for one thing, they do have a made up statistic in common:

2. 90% of new restaurants fail within the first year.

This is a very persistent myth, most recently repeated in Antifragile, which was debunked as long ago as 2007. However, new business failures in general are still up at around 25% in the first year, which means the point that the pool of restaurants is constantly renewed by people with new ideas at the expense of those with failing ones remains valid. This process makes the restaurant provision as a whole better as a result of the fragility of its individual members.

3. 90% of terrorist groups fail within the first year.

Now I don’t know for certain whether this conjecture by David Rapoport is false, but given my experience with the last two “facts”, I would be very sceptical that the data (i) exists and (ii) is well-defined enough to give a definitive percentage. However, clearly there is a considerable turnover amongst these groups, and the methods used by them have developed often more quickly than the measures taken to counter them. Each new major terrorist attempt appears to result in some additional loss of freedom for the general public, whether it be what you can carry onto an aircraft or the amount of general surveillance we are all subjected to.

So what else do restaurants and terrorism have in common? What does a restaurant do when public tastes change? It either adapts itself or dies and is replaced by another restaurant better able to meet them. What does a terrorist group do when it has ceased to be relevant? It either changes its focus, or gets replaced in support by a group that already has. However, although individual terrorist groups will find themselves hunted down, killed, negotiated with, made irrelevant or, occasionally, empowered out of existence, new groups will continue to spring up in new forms and with new causes, ensuring that terrorism overall will always be with us and, indeed, strengthening with each successive generation.

The frequency of terrorist attacks, particularly at the most outrageous end, over the last 40 years would suggest that terrorism itself, despite the destruction of most of the people practising it amongst the mayhem they cause, has indeed proved at least as antifragile as restaurants. So, in the same way that we are all getting fed better, more and more people and resources are also being sucked into a battle which looks set to continue escalating. Because the nature of terrorism is, like the availability of pizza in your neighbourhood, that it benefits from adversity.

This suggests to me:

a. that we should rethink the constant upping of security measures against a threat which is only strengthened by them; and
b. that you shouldn’t believe everything you read.