How Modern Scenario Planning Developed

The story of modern scenario planning has been told many times, but here is a brief sketch of its history. An account from 1999, when the use of scenarios had been evolving for almost 30 years in corporations and had also been adopted by government and international organisations, has the following introduction:

“While all companies learn, the crucial element is to be able to learn fast enough to sustain a competitive advantage. This is becoming increasingly important in today's fast-changing world. Unfortunately, teaching is a very ineffective way to communicate information. Sometimes changing or suspending corporate rules can accelerate learning. A very effective learning tool, which can be described as a form of game playing, is developing ‘what if’ scenarios and planning responses to them.”[1]

International organisations, government institutions and multiple other associations don't have commercial competitors, but it is still advantageous for the communities they serve to be able to act in an informed fashion. The music sector project will be assisted by a well-defined view of the issues, built on extensive preparation. Our answer is scenario planning.

A workbook by Michigan State University professor Richard Bawden[2] provides another useful reference by an Australian-born academic who is still actively developing these ideas in the United States.

The United States military first used scenarios after World War II for strategic planning — think-tanks such as the RAND Corporation used scenario-type thinking and simulated war games of military strategy for decades.

In business, Royal Dutch Shell's experience began in 1971 with what was to become the first widely documented and recognised success story of modern scenarios. Pierre Wack, recognised as one of the great teachers of scenario planning, introduced Shell to scenarios. The price of oil rapidly became the main item on the agenda. Wack convinced Shell’s management that hiring more forecasters to extrapolate existing realities in a straight line had become pointless, indeed extremely risky in the new world of societal, geopolitical and economic discontinuities that emerged in many different areas from the late 1960s. The only way to stability, Wack argued, was to accept uncertainty, and organisations would have to be systematically open to heresy.

According to a later chief planner at Shell, Kees van der Heijden, two goals were defined. One was to create ‘a critical mass of intent’ concerning possible upheavals in an industry, and develop scenarios to provide a way of experiencing upheavals before they actually occur — thus preparing managers to react quickly if and when real events arise. The second goal was to spur extensive debate to prepare a context for future planning.[3]

Shell gained a famous advantage by envisaging, in one of its first scenarios, that the Organisation of Petroleum Exporting Countries in the Middle East would take radical action and proclaim an oil embargo — which, as everyone knows, actually happened in 1973. This action by the OPEC cartel caused a strong rise in oil prices and a decline in demand for oil after steady annual growth of 6-7% since the 1940s. Shell acted as promptly as possible on the implications of its scenario story to cut its planned volume of production, but the other oil refiners apparently needed two years to discover that anything had happened, and then another five or six years to work out the real impact of the oil crisis. So the world oil market became badly oversupplied through the rest of the 1970s.[4]

Moya Mason notes, remarkably in view of the current Music Trust project:

“The scenario process involves great amounts of research, both broad searches that facilitates educating the planner, and narrow focuses needed for a specific scenario. Each scenario requires specific research, although there are some constant subjects that need to be studied at all times such as science and technology, and music (italics added). Peter Schwartz [another former chief planner at Shell who is also mentioned in Global Risk Factors and Music in Australia] says that music is a window onto the freedom in the future. (So pay attention to the music!)”

It is even more remarkable that not one of the scenarios we discuss in Global Risk Factors and Music in Australia pays any attention to the arts, let alone music — whether as an inspiration for scenario planning or as a subject for such planning.

The History Parallel

Historians have debated for at least two centuries whether their role is to describe events as the product of underlying deterministic forces — as if history was divinely ordered, governed by ‘reason’ as the Age of Reason of the late 18th and early 19th centuries would have it, driven by the class struggle (Karl Marx), or subject to any other deterministic ‘law’. Logically, deterministic historical models do not allow ‘what if’ or imagined (‘counterfactual’) alternatives to be offered.

In a 90-page essay prefacing nine ‘what if’ cases, ranging from ‘What if there had been no American revolution?’ to ‘What if Germany had invaded Britain in 1940?’ and ‘What if John F. Kennedy had not been assassinated?’, British historian Niall Ferguson[5] concludes, drawing on chaos theory, that it is legitimate to pose counterfactual historical alternatives provided they are plausible, so that mere chance is replaced by probabilities.[6] “The counterfactual scenarios we therefore need to construct are not mere fantasy: they are simulations based on calculations about the relative probability of plausible outcomes in a chaotic world (hence ‘virtual history’).” (p 85)

“Under these circumstances, the search for universal laws of history is futile. The most historians can do is to make tentative statements about causation with reference to plausible counterfactuals, constructed on the base of judgements about probability.” (p 89)

“At any given moment in history there are real alternatives. … How can we ‘explain what happened and why’ if we only look at what happened and never consider the alternatives … It is only if we place ourselves before the alternatives of the past …, only if we live for a moment, as the men of the time lived, in its still fluid context and among its still unresolved problems, if we see those problems coming upon us, … that we can draw useful lessons from history.” (British historian Hugh Trevor-Roper[7])

What If Britain had Stood Aside in 1914?

I am writing this article in the centenary year of the outbreak of the First World War in August 1914 — in fact only a few days before the 100th anniversary of the final departure on 1 November 1914 from Albany, Western Australia of the first Anzacs who fought at Gallipoli from 25 April 1915. World War I will be on the centenary radar for the next four years. It is an appropriate example to choose as one of the bloodiest and most influential events in the 20th century, quite apart from offering a good illustration of virtual history.[8]

The author Hector Hugh Munro (who used the pen name Saki) in 1913 told how the protagonist of one of his stories returned from ‘darkest Asia’ to find a vanquished Britain ‘incorporated into the Hohenzollern Empire … as a Reichsland, a sort of Alsace-Lorraine washed by the North Sea instead of the Rhine’, with Berlin-style cafés in ‘Regentstrasse’ and on-the-spot fines for walking on the grass in Hyde Park. (p 229)

The historian Basil Liddell Hart argued in 1942, during the thick of World War II, that Germany could have been defeated in World War I without embroiling Britain in the prolonged continental campaign if the British Expeditionary Force (BEF) had been sent to Belgium in 1914 rather than France, or if more troops had been made available for the Dardanelles invasion in 1915.[9] J. M. Hobson has offered the view that a bigger continental commitment before 1914 could have deterred the Germans from attacking France in the first place.[10]

Niall Ferguson argues that there “remains a third possibility, which has been all but ignored by historians: that of British non-intervention.” Prime Minister H. H. Asquith and Foreign Secretary Edward Grey, both in office until 1916, emphasised that Britain had not been obliged to intervene by any kind of contractual obligation. (p 236)

Ferguson asks: Was war between Britain and Germany inevitable in 1914? British politicians including David Lloyd George and Winston Churchill saw a ‘dangerous disease’ at work and used the metaphor of a typhoon beyond the control of the statesmen. Of more importance, however, was the argument that Britain “could not, for its own safety and independence, allow France to be crushed as a result of aggressive action by Germany.”[11]

Between the closing years of the 19th century and 1914 there was a widely held view that the German Reich intended to mount some kind of challenge to British power, so British historians should not ridicule Saki for being a scaremonger and propagandist for the conscription campaign.

Despite this, Niall Ferguson was uneasy about the notion of a preordained war between Britain and Germany, if only because at the time Virtual History was written “eighty years on, the cost of war looms so much larger than its benefits.” If all the sacrifices of the ‘Great War’ were supposed to prevent German hegemony in Europe, this achievement was short-lived. Only twenty years after the war, a far more serious German threat to Britain, and indeed the world, emerged. “It is therefore tempting to ask whether the four years of slaughter in the trenches were indeed as futile as they seemed to the poet Wilfred Owen and others.” (pp 234-235)

The neglect of the ‘counterfactual’ case for neutrality reflects the persuasiveness of the postwar apologias of Asquith, Grey and other leaders, who could hardly admit that Britain had actually had a choice to stay out, lest they be discredited for having played an inglorious part. “While it seems undeniable that a continental war between Austria, Germany, Russia and France was bound to break out in 1914, there was in truth nothing inevitable about the British decision to enter that war. Only by attempting to understand what would have happened had Britain stood aside can we be sure the right decision was made.” (p 237)

Anglo-German cooperation was actually quite widespread around the turn of the 20th century. There were numerous overseas areas where German and British interests potentially coincided. In 1898 and 1900 Joseph Chamberlain argued for such cooperation against Russia in China. There was serious though inconclusive discussion of an Anglo-German-Japanese “triplice” in 1901, and other examples of cooperation between Britain and Germany. Ferguson says, “It is simply untrue to say that the fundamental priorities of policy of each country were mutually exclusive.” (p 239)

This rapprochement between Germany and Britain nevertheless waned in the years that followed, mainly because Britain’s superior naval power led it to see Germany as less of a threat. The German chancellor Theobald von Bethmann-Hollweg proposed a deal under which Germany would accept British naval supremacy in return for British neutrality on the continent, but Grey rejected it because Britain could have the former without giving way on the latter. (p 252)

The crucial issue in the counterfactual history was Germany’s war aims in 1914. It was alleged after the war that these aims were the annexation of French, Belgian and possibly Russian territory, but Churchill had already demonstrated the weakness of Germany’s position before the war. Chancellor Bethmann-Hollweg’s view then was that war would strengthen the political left and weaken the Reich internally. “No war in Europe can bring us much,” he said. (p 259)

In Niall Ferguson’s counterfactual history, the British Expeditionary Force (BEF) that actually began the British commitment is never sent to Europe, and four years of confrontation and devastation for Britain, its allies and its foes are averted because the European war ends years earlier, without Britain being involved.

The dispatch of the BEF in 1914, then, was not a foregone conclusion. The counterfactual history is a war without the BEF. “If the BEF had never been sent, there is no question that the Germans would have won the war.” (p 276)

What then? The Germans would have overwhelmed the French defence quickly, and any British intervention that could be imagined would have been very different. The BEF “would have been rendered obsolete by the French defeat; had it been sent, a Dunkirk-type evacuation would probably have been necessary.” (p 277)

Niall Ferguson sums up his counterfactual history (pp 278-279):

“A fresh assessment of Germany’s pre-war war aims reveals that, had Britain stood aside — even for a matter of weeks — continental Europe would have been transformed into something not unlike the European Union we know today — but without the massive contraction in British overseas power entailed by the fighting of two world wars. Perhaps too the complete collapse of Russia into the horrors of civil war and Bolshevism might have been averted: though there would still have been formidable problems of rural and urban unrest, a properly constitutional monarchy (following Nicholas II’s abdication) or a parliamentary republic would have stood more chance of success after a shorter war. And there certainly would not have been that great incursion of American financial and military power into European affairs which effectively marked the end of British financial predominance in the world. True, there might still have been fascism in Europe in the 1920s; but it would have been in France rather than Germany that radical nationalists would have sounded most persuasive. It may even be that, in the absence of a world war’s stresses and strains, the inflations and deflations of the early 1920s and early 1930s would have not been so severe. With the Kaiser triumphant, Hitler could have lived out his life as a failed artist and a fulfilled soldier in a German-dominated Central Europe about which he could have found little to complain.”

“By fighting Germany in 1914, Asquith, Grey and their colleagues helped ensure that, when Germany did finally achieve predominance on the continent, Britain was no longer strong enough to provide a check to it.” (p 280)

An Afterword on Counterfactual History

I remember from my student days in the 1950s reading textbooks by the ‘American Keynes’, Harvard Economics Professor Alvin Hansen (1887-1975), including his 1941 book Fiscal Policy and Business Cycles. He assumed as a matter of course, reflecting what appeared to be a widely held view, that the United States would stay out of World War II. The attack on Pearl Harbor came at the end of 1941, a few months after the book was published, and prompted America to declare war the following day. Why did Japan attack, and was its action inevitable? Would it have done so if it had correctly expected that American domestic support for non-interventionism, which had been strong, would disappear virtually overnight? Now there is stuff for another counterfactual history, as dramatic as the ‘what if’ story about Britain staying out of the European war in 1914!

Not all the counterfactual history examples in Virtual History were seen to change the world as much as Ferguson’s World War I scenario, or the likely consequences if America had not been provoked to enter World War II. American historian Diane Kunz, in her ‘Camelot Continued: What if John F. Kennedy had lived?’,[12] argues that had Kennedy served two terms and stood down, with Nixon gaining office in 1968, his presidency would have been less than shining. There would have been no early withdrawal from Vietnam and no ‘great society’, as under Lyndon Johnson’s actual presidency. Kennedy’s personal inclination, according to this story, was to be a foreign policy president: his lack of success in realising a domestic agenda made it indispensable for him to succeed internationally. Kunz concludes (p 391): “The former Communist world has lost its idols. It is now time for Americans to relinquish one of theirs.”

Like all good virtual histories and all good scenarios, this is plausible. Plausibility is everything for both subjects: without plausibility, no credibility. But in virtual history and in scenarios for the future, other possible versions could be written. Kennedy might have developed, or been influenced by external events to develop, concerns similar to those that drove Johnson. Bill Clinton turned the tide to favour his own administration in his second term (with possible longer-term consequences for his wife Hillary’s presidential ambitions). Barack Obama may still have time to do likewise, though the political tide keeps running against his administration as I write.

Chaos Theory and the Natural and Social Sciences

Niall Ferguson, in Virtual History, argues that the new science of chaos theory marks the end of scientific determinism.

“A great many … philosophers of history who have argued in this [20th] century about whether history was a ‘science’ seem not to have grasped that their notion of science was an out-of-date relic of the nineteenth century. What is more, if they had paid closer attention to what their scientific colleagues were actually doing, they would have been surprised — perhaps even pleased — to find that they were asking the wrong question.” (p 72)

Indeed, Niall Ferguson finds that many modern developments in the natural sciences have been fundamentally historical in character in that they have been concerned with changes over time. Chaos theory demonstrates that the natural world is unpredictable enough to make the task of accurate prediction virtually impossible. The same applies to the social sciences: “For economists, chaos theory helps to explain why predictions and forecasts based on the linear equations which are the basis for most economic models are so often wrong.” (p 78) Ferguson refers to an article by John Kay, ‘Cracks in the Crystal Ball’, which first appeared in the Financial Times, 29 September 1995. It starts as follows:

“It is a conventional joke that economic forecasters always disagree, and that there are as many different opinions about the future of the economy as there are economists. The truth is quite the opposite. Economic forecasters do not speak with discordant voices; they all say more or less the same thing at the same time. And what they say is almost always wrong. The differences between forecasts are trivial relative to the differences between all forecasts and what happens.”

Three comments follow: two on John Kay’s article, and one linking forecasting to scenario planning.

Writing in the 1990s, which were generally growth years, Kay also found that consensus forecasts fell short of the actual outcome. I could place a reasonable bet (not further explored) that current consensus forecasts, six or seven years after the Global Financial Crisis, tend to be optimistic, unless economists as a group have been gripped by a feeling that the economy has entered a fundamentally new phase, which I doubt.

I spent the 1960s as an economic forecaster, covering mainly short-term forecasts of the Australian economy and its building and motor vehicle sectors, 12 to 18 months ahead. In retrospect, the decade was exceptional for its stability and predictability, so my forecasts were reasonably accurate. But that didn’t last. Social upheavals piled up in the late 1960s, followed by events starting with the 1973 oil crisis. By that time I had (fortunately) left formal statistical forecasting behind to develop other professional skills.

Even today, formal Australian scenarios have been misused to produce forecasts of particular indicators such as employment of music professionals relative to gardeners and greenkeepers and other “arts and recreation occupations”. See Global Risk Factors and Music in Australia, commenting on the 2012 scenarios by the Australian Workforce and Productivity Agency (AWPA). It might make sense to produce such derived forecasts to fit alternative scenarios dealing with an interconnected entity like the music sector (and maybe even the arts as a whole), but it is difficult to see the connection between the employment of musicians and other largely unrelated “arts and recreation” occupations.

How Chaos is Changing Perceptions

Modern chaos theory started to emerge in the early 1960s, as a wide range of different sciences ran into unexpected and puzzling snags: sciences including astronomy, biology, chemistry, climate, ecology, economics, health and physics — not to mention the mathematics which underlies the theory. Weather forecasting was the first area that was found to defy “the Newtonian promise that the world unfolded along a deterministic path, rule-bound like the planets, predictable like eclipses and tides.”[13] The stumbling block was nonlinearity: while linear systems are mathematically solvable, that is not the case when more complex behaviour enters the system. Linear systems can be captured with a straight line on a graph; nonlinear systems cannot. Here is a simple example:

”Without friction, a simple linear equation expresses the amount of energy you need to accelerate a hockey puck [disc]. With friction the relationship gets complicated, because the amount of energy changes depending on how fast the puck is already moving. Nonlinearity means that the act of playing the game has a way of changing the rules. You cannot assign a constant importance to friction, because its importance depends on speed. Speed, in turn, depends on friction. That twisted changeability makes nonlinearity hard to calculate, but it also creates rich kinds of behaviour that never occur in linear systems.” (Gleick 2008, p 35)
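Gleick’s puck example can be made concrete in a few lines of code. The sketch below (with purely illustrative numbers, not real hockey physics) integrates two decay rules for speed: one with a constant drag coefficient, which is linear and agrees with a closed-form exponential solution, and one where the drag coefficient itself depends on the current speed, so that, in Gleick’s phrase, playing the game changes the rules.

```python
import math

def simulate(v0, drag, dt=0.01, steps=1000):
    """Euler-integrate dv/dt = -drag(v) * v and return the final speed."""
    v = v0
    for _ in range(steps):
        v -= drag(v) * v * dt
    return v

# Linear system: a constant drag coefficient gives exponential decay,
# with the exact closed-form solution v(t) = v0 * exp(-k * t).
linear = simulate(10.0, lambda v: 0.5)
closed_form = 10.0 * math.exp(-0.5 * 0.01 * 1000)

# Nonlinear system: drag grows with speed, so the rule governing the
# motion changes as the state itself changes.
nonlinear = simulate(10.0, lambda v: 0.1 * v)
```

The linear run can be checked against its exact formula; the nonlinear rule has no such general recipe, which is the analytical difficulty the passage describes (this particular equation happens to be one of the few nonlinear ones still solvable by hand, but that is the exception, not the rule).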

A whole new terminology has been introduced to deal with the impact of nonlinearity, including “fractal geometry”, “strange attractors”, “self-similarity”, “bifurcations”, “intermittencies” and “periodicities” — not to mention “smooth noodle maps”.[14] The central concept binding them together is fractals, which Wikipedia defines as a natural phenomenon or a mathematical set that exhibits a repeating pattern that displays at every scale — arguably chaos’s most striking feature.[15]

Scientists gradually came to realise that the textbooks were wrong in showing only solvable, linear systems, or relatively rare nonlinear systems which could be subjected to linear approximations or some other uncertain backdoor approach. “Nonlinear systems with real chaos were rarely taught and rarely learned. When people stumbled across such things — and people did — all their training argued for dismissing them as aberrations. Only a few were able to remember that the solvable, orderly, linear systems were the aberrations. Only a few, that is, understood how nonlinear nature is in its soul. [Italian Nobel Prize winning physicist] Enrico Fermi once exclaimed, “It does not say in the Bible that all laws of nature are expressible linearly!”[16]

As a result, a new generation of scientists has come along, armed with a more robust set of assumptions about how nature works. They know that a complex dynamical system can get freaky, prompting one mathematician, described as ‘revelling in chaos’, to remark: “The freaky stuff is turning out to be the mathematics of the natural world.” (p 395)

Different aspects of chaos have been taken up by modern management theorists on one hand, and postmodern literary theorists on the other. (p 395) Chaos is getting closer to becoming universally used and accepted.

The so-called butterfly effect in chaos theory has become almost a pop cliché: the image of a butterfly flapping its wings in, say, Beijing causing a tornado in, say, Texas a week later — sensitive dependence on small differences in initial conditions, to quote Wikipedia. The butterfly effect conjures up an image of predictability giving way to pure randomness, but the ‘chaos pioneers’ saw more than randomness embedded in their models. They saw, to quote one of them (the research meteorologist and mathematician Edward Lorenz, credited with being the first to recognise chaos in 1961), a fine geometrical structure, ‘an order masquerading as randomness’. (p 32)
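Sensitive dependence is easy to demonstrate numerically. The sketch below uses the logistic map, a standard textbook chaotic system (chosen here purely for illustration; it is not one of the systems discussed in this article), to follow two trajectories whose starting points differ by one part in a billion: they track each other closely at first, then diverge until they bear no resemblance to one another.

```python
def trajectory(x0, r=4.0, steps=60):
    """Iterate the logistic map x -> r * x * (1 - x) and return the whole path."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two starting points differing by one part in a billion:
# the numerical "flap of a butterfly's wings".
a = trajectory(0.400000000)
b = trajectory(0.400000001)

# Early on the paths are indistinguishable; a few dozen iterations later
# the tiny initial difference has been amplified to macroscopic size.
early_gap = abs(a[5] - b[5])
late_gap = max(abs(x - y) for x, y in zip(a[40:], b[40:]))
```

The amplification is roughly a doubling per step for this map, which is Lorenz’s point: the system is fully deterministic, yet any error in measuring the starting state, however small, eventually swamps the forecast.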

The butterfly effect has its place in folklore:

For want of a nail, the shoe was lost;
For want of a shoe, the horse was lost;
For want of a horse, the rider was lost;
For want of a rider, the battle was lost;
For want of a battle, the kingdom was lost.
And all for the want of a horseshoe nail.[17]

In conclusion, chaos theory is here to stay. "In the heady early days, researchers described chaos as the century's third revolution in the physical sciences, after relativity and quantum mechanics. What has become clear now is that chaos is inextricable from relativity and quantum mechanics. There is only one physics." (p 399).

It is tempting to add that so many other sciences have become involved that the influence of chaos and nonlinearity goes way beyond physics, not only into other sciences as Gleick's book demonstrates but — at least potentially — into the way we look at culture and the arts and other vital parts of modern society as well.

Chaos, History and Scenarios

Reversing the sequence of the narrative, chaos is intimately linked to nonlinearity, which implies non-predictability. All or most sciences and many commercial activities have benefited from the insights that chaos theory has supplied over the past 50 years. What about music and the other arts? There seems to be no way to put these into a mechanistic straitjacket. They are nonlinear by their very nature.

My discovery of virtual or counterfactual history came from browsing a bookshop in Santa Monica, California, in August 2014. I was aware of chaos from the paper version of James Gleick’s book from 1987. One of the books I bought was Virtual History with Niall Ferguson’s essay demonstrating that it was legitimate to speculate what if such and such hadn’t happened, as in the World War I and Kennedy examples quoted from the book. Ferguson dismissed the tradition that history should be treated as if it was determined, precluding any ‘what if?’ question. His justification was based on chaos theory: “Chaos … means unpredictable outcomes even when successive events are causally linked.” (p 79) His only general condition was that the counterfactual histories should be plausible.

Turning to scenarios versus history, the common factor is again that the stories must be plausible. In Virtual History, past events are remodelled to come up with a counterfactual past history (only one alternative is offered, though there seems no logical reason why more could not be included, as in scenario planning). In contrast to virtual history, which sets one alternative course from a past set of circumstances, scenarios take their origin in the present and compare three or four plausible and equally likely futures. Defining these futures provides a basis for planning to avoid the worst outcomes and achieve the better ones, given the societal, technological, ecological, economic and political realities postulated in each of the scenarios.

New or refreshed insights like these, and the combination of chaos, history and scenario theory in promoting them, should make the Music Trust scenario-planning project even better equipped to create plausible and realistic futures for the Australian music sector, across the spectrum from favourable to less so. Of course, the draft scenarios (as in all scenario planning) will still need feedback from members of the sector, but they can be drafted at a more advanced level than if the research summarised in this article had not been done.

References

Richard Bawden, Learning from the Future: Of Systems, Scenarios and Strategies, Michigan State University, written in about 2001. Before his appointment as a Visiting Distinguished Professor at Michigan State University in 1999, Richard Bawden AM was a professor at the University of Western Sydney. He is also a foundation director of Global Business Network Australia. GBN remains the best-known international scenario-planning firm.

Niall Ferguson (ed.), ‘Towards a “chaotic” theory of the past’, Virtual History, Basic Books, New York 1999, 1-90. Original edition published by Picador, London 1997. The page references following are from the American edition.

Chaos theory is discussed in the final sections and is an essential part of how the issues gained relevance for history and scenario research.

James Gleick, Chaos: Making a New Science, Iconic EBooks 2011, p 22 (downloaded from iBooks). I recommend this authoritative, up-to-date and clear account of chaos theory, which was concluded in 2008 as an update of the original book version, published in 1987. It gives a fascinating 480-page picture of the scientific ‘heroes’ who encountered and tackled seemingly intractable issues. The page references refer to the portrait-format presentation in the digital version (it can also be read in landscape format on a tablet computer such as an iPad).

The term refers to the surface pattern of a human brain, but the reference here is to an album of the same name released in 1990 by the American new wave band Devo, which contains the line “snake through the chaos with a smooth noodle map”.

The realisation that chaos theory is important for scenario planning is not new but its central role took years to emerge. Kees van der Heijden wrote in 1996 (pp 34-35): "Lately chaos theory has impressed on the world the view that many phenomena taking place in nature are unpredictable not just because we lack the analytical knowledge and capacity, but are unpredictable in principle. This is related to complexity and non-linear characteristics of systems, which can be shown to result in behaviour which is intrinsically unknowable in its detail. Terms such as "the butterfly effect" (a flutter of the butterfly's wings here causing a storm on another continent next week) are stock in trade."

Gleick 2008, p 86. Fermi made his remark long before chaos theory emerged — he died aged only 53 in 1954.

Gleick 2008, p 34. The rhyme is said to refer to the death of Richard III of England at the Battle of Bosworth in 1485, but the Wikipedia entry on “for want of a nail” shows that a rather similar earlier version exists. Shakespeare probably strengthened the reference to Richard by having him shout, A horse! A horse! My kingdom for a horse! after he was unhorsed in the battle (Richard III, Act V, Scene 4).