Perspectives on capital markets and personal finance

Complexity, Chaos and Chance

Regular readers of this site have seen this from me before and can read it on the masthead above: Information is cheap but meaning is expensive. But that wasn’t always the case.

For most of its existence, the investment business was driven by information. Those who had it hoarded and guarded it. They tended to dominate the markets too, because good information could be so very hard to come by. But the inexorable march of time as well as the growth and development of information technology has changed the nature of the investment “game” dramatically. Success in the markets today is driven by analysis — in other words, meaning.

As Clay Shirky has pointed out (based upon Dan Sperber’s Explaining Culture), culture is “the residue of the epidemic spread of ideas.” These ideas are, themselves, overlapping sets of interpersonal interactions and transactions. Thus culture is predicated upon a network consisting of the externalization of ideas (A tells B that “successful investing is…”) and the internalization of expressions (B decides that “successful investing is…”). This network allows for ideas to be tested, adapted and adopted, expanded and developed, grown and killed off based upon how things play out and turn out. Some ideas receive broad acceptance and application. Others are only used within a small subculture. Some fail utterly. All tend to ebb and flow.

That explanation of culture seems directly applicable to the markets, especially since culture itself is best seen as a series of transactions – a market of sorts. If I’m right about this, understanding the markets is really a network analysis project and is (or should be) driven by empirical questions about how widespread, detailed, and coherent the specific ideas – the analysis and application of information (“meaning”) – ultimately turn out to be.

Because there is no overarching “container” for culture (or, in my interpretation, the markets), creating (or even discovering) anything like “rules” of universal application is not to be expected generally, especially where human behavior is part of the game. That would explain why actual reductionism – lots of effects from few causes – is so rare in real life. Thus seemingly inconsistent ideas like the correlation of risk and reward and the success of low volatility investing can coexist successfully. Perhaps more and better information and analysis will suggest an explanation, but perhaps not, and we’ll never be able to explain it all.

“Einstein said he could never understand it all.”

As information increases – within given markets and sub-markets or within the market as a whole – so will various efficiencies. That means that various approaches and advantages, like the ideas that generate them, will appear and disappear, ebb and flow, over time. Trades get crowded. What works today may not continue to work. As the speed of information increases, so will the speed of market change.

Markets – like culture – are thus “asynchronous network[s] of replication, ideas turning into expressions which turn into other, related ideas.” Some persist over long periods of time while others are fleeting. But how long they persist remains always open to new information and better analysis.

In today’s world, with more and more information available faster and faster, it’s easy to postulate that market advantages will be harder to come by and more difficult to maintain. But an alternate (and I think better if not altogether inconsistent) idea is that the ongoing growth and glut of data will make the useful interpretation of that data more difficult and more valuable.

The ultimate arbiter of investment success, of course, is largely empirical. Obviously, we should gravitate toward what works, no matter our pre-conceived notions. No matter how elegant or intuitive the proposed idea, only results should matter. In our information rich world, those who can extract and apply meaning from all the reams of available information will be in ever-increasing demand. Those who cannot will and should fall by the wayside.

Typical thinking — thinking that should be cast to the dustbin of history — fails to grasp the complexity and dynamic nature of financial markets. Which brings me to my topic today.

Earlier this week, my friend Josh Brown quoted (from Forbes) hedge fund billionaire Paul Singer on the difficulties in predicting market behavior. “The important turning points in markets are never identified with precision in advance by ‘experts’ and policymakers. This lack of foresight is not surprising, because markets and the course of the economy are not model-able scientific phenomena but rather are examples of mass human behavior, which are never predictable with anything like precision,” says Singer. “But what is surprising is that even the most sophisticated investors, traders and commentators continue to rely on predictions issued by those who have no record of success at such forecasts.”

In fact, Shiller has offered graphical evidence that the market is irrational: a chart of the real Standard and Poor’s Composite Stock Price Index (solid line p) plotted against the ex post rational price (dotted line p*) from 1871 to 1979, both detrended by dividing by a long-run exponential growth factor. The variable p* is the present value of actual subsequent real detrended dividends, subject to an assumption about the present value in 1979 of dividends thereafter.

Anyone with even an ounce of self-awareness knows we’re anything but rational far too often and that the markets reflect as much. We are driven by emotions such as fear, greed and ego and beset by myriad cognitive flaws that make it monumentally hard for us to make good decisions, especially about money. Anyone who lived through the dotcom bubble (Pets.com!) can only laugh at the silly idea that we’re cool rationalists. On our best days, when wearing the right sort of spectacles, squinting and tilting our heads just so, we can be observant, efficient, loyal, assertive truth-tellers. However, on most days, all too much of the time, we’re delusional, lazy, partisan, arrogant confabulators. It’s an unfortunate reality, but reality nonetheless.

Yet as compelling as the behavioral story is – and it is really compelling, one I tell often and passionately – it’s not the whole story. Not by a long shot. We must also consider the world as we experience it and have come to understand it. Despite our best efforts to make it predictable and manageable, that world is too immensely complex, chaotic and chance-ridden for us to do so.

Complexity

“Too large a proportion of recent ‘mathematical’ economics are mere concoctions, as imprecise as the initial assumptions they rest on, which allow the author to lose sight of the complexities and interdependencies of the real world in a maze of pretentious and unhelpful symbols.”

The world we live in is infinitely complex. As NASA’s Gavin Schmidt explains, “[w]hether you are interested in the functioning of a cell, the ecosystem in Amazonia, the climate of the Earth or the solar dynamo, almost all of the systems and their impacts on our lives are complex and multi-faceted.” Similarly, the interworkings and interrelationships of the markets are really complex. It is natural for us to ask simple questions about complex things, and many of our greatest insights have come from the profound examination of such simple questions. However, as Schmidt emphasizes, “the answers that have come back are never as simple. The answer in the real world is never ‘42’.”

The great Russian novelist Leo Tolstoy gets to the heart of the matter when he asks, in the opening paragraphs of Book Nine of War and Peace: “When an apple has ripened and falls, why does it fall? Because of its attraction to the earth, because its stalk withers, because it is dried by the sun, because it grows heavier, because the wind shakes it, or because the boy standing below wants to eat it?” With almost no additional effort, today’s scientists could expand this list almost infinitely.

A system is deemed complex when it is composed of many parts that interconnect in intricate ways. A system presents dynamic complexity when cause and effect are subtle and play out over time. Thus complex systems may exhibit dramatically different effects in the short-run and the long-run, and dramatically different effects locally as compared with other parts of the system. Obvious interventions in such systems may produce non-obvious – indeed wildly surprising – consequences.

In other words, a system is complex when it is composed of a group of related units (sub-systems), for which the degree and nature of the relationships is imperfectly known. The system’s resulting emergent behavior, in the aggregate, is difficult – perhaps impossible – to predict, even when sub-system behavior is readily predictable.

Complexity theory, therefore, attempts to reconcile the unpredictability of non-linear dynamic systems with our sense (and in most cases the reality) of underlying order and structure. Its implications include the impossibility of long-range planning despite a pattern of short-term predictability, together with highly dramatic and unexpected change. Contrary to classical economic theory, complexity theory demonstrates that the economy is not a system in equilibrium, but rather a system in motion, perpetually constructing itself anew.

Sound familiar?

The financial markets are simply too complex and too adaptive to be readily predicted.1

Just as the Emperor thought (albeit wrongly) about Mozart’s music containing “too many notes” in Amadeus (see above), there are too many variables to predict market behavior with any degree of detail, consistency or competence (the explanation for the shuttering of Nevsky Capital, a previously well-regarded hedge fund, provides easy evidence for this idea). Unless you’re Seth Klarman or somebody like him (none of whom is accepting capital from the likes of us), your crystal ball almost certainly does not work any better than anyone else’s.

But the problem extends further up and farther in. As CalTech systems scientist John C. Doyle has established, a wide variety of systems, both natural and man-made, are robust in the face of large changes in environment and system components, and yet they are still potentially fragile to even small perturbations. Such “robust yet fragile” networks are ubiquitous in our world. They are “robust” in that small shocks do not typically spread very far in the system. However, since they are “fragile,” a tiny adverse event can bring down the entire system.

Such systems are efficiently fine-tuned and thus appear almost boringly robust despite the potential for major perturbations and fluctuations. As a consequence, systemic complexity and fragility are largely hidden, often revealed only by rare catastrophic failures. Modern institutions and technologies facilitate robustness and efficiency, but they also enable catastrophes on a scale unimaginable without them — from network and market crashes to war, epidemics, and climate change.

While there are great benefits to complexity as it empowers globalization, interconnectedness and technological advance, there are unforeseen and sometimes unforeseeable yet potentially catastrophic consequences too. Higher and higher levels of complexity mean that we live in an age of inherent and, according to the science, increasing surprise and disruption. The rare (but growing less rare) high-impact, low-frequency disruptions are simply part of systems that are increasingly fragile and susceptible to sudden, spectacular collapse. John Casti’s X-Events even argues that today’s highly advanced and overly complex systems and societies have grown highly vulnerable to extreme events that may ultimately result in the collapse of our civilization. Examples could include a global internet or technological collapse, transnational economic meltdown or even robot uprisings.

We are thus almost literally (modifying Andrew Zolli’s telling phrase slightly) tap dancing in a minefield. Not only is the future opaque to us, we also don’t quite know when our next step is going to result in a monumental explosion.

Over 50 years ago, Edward Lorenz created an algorithmic computer weather model at MIT to try to provide accurate weather forecasts. During the winter of 1961, as recounted by James Gleick in Chaos: Making a New Science, Professor Lorenz was running a series of weather simulations using his computer model when he decided to repeat one of them over a longer time period. To save time (the computer and the model were primitive by today’s standards) he started the new run in the middle, typing in numbers from the first run for the initial conditions, assuming that the model would provide the same results as the prior run and then go on from there. Instead, the two weather trajectories diverged on completely separate paths.

After ruling out computer error, Lorenz realized that he had not entered the initial conditions for the second run exactly. His computer stored numbers to an accuracy of six decimal places but printed the results to three decimal places to save space. Lorenz had entered the rounded-off numbers for his second run starting point. Astonishingly, this tiny discrepancy altered the end results enormously.

This finding (even using a highly simplified model and confirmed by further testing) allowed Lorenz to make the otherwise counterintuitive leap to the conclusion that highly complex systems are not ultimately predictable. This phenomenon (“sensitive dependence”) came to be called the “butterfly effect” in that a butterfly flapping its wings in Brazil might set off a tornado in Texas. Lorenz built upon the work of late 19th century mathematician Henri Poincaré, who demonstrated that the movements of as few as three heavenly bodies are hopelessly complex to calculate, even though the underlying equations of motion seem simple.
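Lorenz’s rounding accident is easy to reproduce in miniature. The sketch below is a hypothetical illustration (it uses the chaotic logistic map rather than Lorenz’s weather equations): two runs start from values that agree to three decimal places – echoing Lorenz’s truncated printout, which turned 0.506127 into 0.506 – and within a few dozen steps they bear no resemblance to each other.

```python
# Sensitive dependence on initial conditions, in miniature.
# The logistic map x -> 4x(1-x) is fully chaotic; we run it from two
# starting values that agree to three decimal places and watch the
# trajectories diverge.

def logistic_trajectory(x0, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(4.0 * xs[-1] * (1.0 - xs[-1]))
    return xs

full = logistic_trajectory(0.506127, 50)   # "full precision" start
rounded = logistic_trajectory(0.506, 50)   # same start, rounded off

for step in (0, 10, 20, 30, 40, 50):
    gap = abs(full[step] - rounded[step])
    print(f"step {step:2d}: gap = {gap:.6f}")
```

The gap roughly doubles at each step until it saturates at the full size of the system – exactly the “sensitive dependence” Lorenz identified.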

Accordingly and at best, complex systems – from the weather to the markets – allow only for probabilistic forecasts with significant margins for error and often seemingly outlandish and hugely divergent potential outcomes. We can generally accept mild outcome differences in our models, forecasts and expectations. Whether a particular stoplight is red or green at a given time doesn’t generally change our expectation for how long our commute to work is going to take. But the outcomes Lorenz discovered are another matter altogether. Traditional economics has generally failed to grasp the inherent complexity and dynamic nature of the financial markets, and that utterly chaotic reality goes a long way toward explaining the 2008-2009 real estate meltdown and financial crisis that seem inevitable in retrospect but were predicted by almost nobody.

Financial markets exhibit the kinds of behaviors that might be predicted by chaos theory (and the related catastrophe theory). They are dynamic, non-linear, and highly sensitive to initial conditions. As shown above, even tiny differences in initial conditions or infinitesimal changes to current, seemingly stable conditions, can result in monumentally different outcomes. Thus markets respond like systems ordered along the lines of self-organizing criticality – unstable, fragile and largely unpredictable – at the border of stability and chaos.

One practical outworking of this idea is that markets don’t need any clear and obvious catalyst in order to cascade downward. Think back to Black Monday, October 19, 1987, for a terrifying example. Coverage by The Wall Street Journal of that day began simply and powerfully. “The stock market crashed yesterday.”

On that fateful day, after five days of intensifying stock market declines, the Dow Jones Industrial Average lost an astonishing 22.6 percent of its value (for its part, the S&P 500 dropped 20.4 percent), plummeting 508.32 points after a 5-year run from 776 in August 1982 to a high of 2,722.42 in August 1987. Black Monday’s losses far exceeded the 12.8 percent decline of October 28, 1929, the start of the Great Depression. It was “the worst market I’ve ever seen,” said John J. Phelan, Chairman of the New York Stock Exchange, and “as close to financial meltdown as I’d ever want to see.”

But a closer look at the coverage and its aftermath is revealing, especially for what is missing. There is no “smoking gun.” According to the Big Board’s Chairman, at least five factors contributed to the record decline: the market’s having gone five years without a major correction; general inflation fears; rising interest rates; conflict with Iran; and the volatility caused by “derivative instruments” such as stock-index options and futures. Although it became a major part of the later narrative, Phelan specifically declined to blame the crash on program trading alone.

In other words, the market collapse had no definitive (or even clear) trigger. The market dropped by almost a quarter for no obvious reason. And while it’s counterintuitive, that observation is wholly consistent with catastrophes of various sorts in the natural world and in society. Wildfires, fragile power grids, mismanaged telecommunication systems, global terrorist movements, migrating viruses, volatile markets and the weather are all self-organizing, complex systems that evolve to states of criticality. Upon reaching a critical state, these systems then become subject to cascades, rapid down-turns in complexity from which they may recover but which may be experienced again repeatedly.

This phenomenon was discovered largely on account of the analysis of sandpiles. Scientists began examining sandpiles and discovered that each tiny grain of sand added to the pile increased the overall risk of avalanche, but which grain of sand would make the difference, and when the big avalanches would occur, remained unknown and unknowable. It’s eerie – really bizarre even – but nonetheless true. If you doubt the science, take a look at one simple example using magnets, in which a small perturbation (a magnetic field disruption) causes a chain reaction.
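The sandpile result is easy to see in a toy model. The sketch below implements a minimal version of the Bak–Tang–Wiesenfeld sandpile (an illustrative assumption, not the exact experiments described above): grains drop onto random sites of a grid, any site reaching four grains topples and passes one grain to each neighbor, and that toppling can cascade. Every rule is simple and deterministic, yet avalanche sizes vary wildly and unpredictably.

```python
import random

random.seed(1)

SIZE = 20            # 20x20 grid
THRESHOLD = 4        # a site holding 4+ grains topples

grid = [[0] * SIZE for _ in range(SIZE)]

def drop_grain(grid):
    """Drop one grain at a random site; return the avalanche size (topples)."""
    r, c = random.randrange(SIZE), random.randrange(SIZE)
    grid[r][c] += 1
    topples = 0
    unstable = [(r, c)] if grid[r][c] >= THRESHOLD else []
    while unstable:
        r, c = unstable.pop()
        if grid[r][c] < THRESHOLD:
            continue
        grid[r][c] -= THRESHOLD
        topples += 1
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < SIZE and 0 <= nc < SIZE:   # edge grains fall off the pile
                grid[nr][nc] += 1
                if grid[nr][nc] >= THRESHOLD:
                    unstable.append((nr, nc))
    return topples

avalanches = [drop_grain(grid) for _ in range(20000)]
print("largest avalanche:", max(avalanches))
print("median avalanche:", sorted(avalanches)[len(avalanches) // 2])
```

Most drops cause no avalanche at all, while a rare few topple hundreds of sites. That is the “critical” signature described above: you can be sure big avalanches will happen without ever knowing which grain will cause one.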

Chaos – which is inherently unpredictable and which thus impedes our inherent desire for order and certainty – is typically deemed a great enemy to be shunned and is thus routinely seen as a malevolent force, as the Joker demonstrated in The Dark Knight (in the embedded clip above). “I try to show the schemers how pathetic their attempts to control things really are.”

I merely wish to suggest that chaos isn’t necessarily evil, nor does it always provide opportunity. It’s just reality.

“Life finds a way.”

If we are to succeed in investing and in life generally, we need to recognize that chaos is an inherent part of life. Sometimes it is a pit to be avoided, sometimes it presents opportunity (Warren Buffett: “Be fearful when others are greedy and greedy when others are fearful”), but it is always there. We need to embrace uncertainty and plan for it as best we can all the while recognizing that we can’t plan for every contingency – not even close. The markets (like the world) are too complex, chaotic and chance-ridden to be manipulated in that way and to that extent.

If you can’t follow Bob Dylan’s lead and make chaos a friend, at least accept it as a constant companion, because “it exists and that’s all there is to it.”

For markets, that means that we should not expect ever to be able to identify the trigger of any correction or crash in advance or to be able to predict such an event with any degree of specificity. Even though it probably won’t be — markets generally go up, for very good fundamental reasons — tomorrow could be the day that the market tanks in a big way…but you shouldn’t expect to see it coming.

Chance

“I returned, and saw under the sun, that the race is not to the swift, nor the battle to the strong, neither yet bread to the wise, nor yet riches to men of understanding, nor yet favour to men of skill; but time and chance happeneth to them all.”

Tom Stoppard’s Rosencrantz and Guildenstern are Dead presents Shakespeare’s Hamlet from the bewildered point of view of two of the Bard’s bit players, the comically indistinguishable nobodies who become headliners in Stoppard’s play. The play opens before our erstwhile heroes have even joined the action in Shakespeare’s epic. They have been “sent for” and are marking time by flipping coins and getting heads each time (the opening clip from the movie version is shown above). Guildenstern keeps tossing coins and Rosencrantz keeps pocketing them. Significantly, Guildenstern is less concerned with his losses than with puzzling out what the defiance of the odds says about chance and fate. “A weaker man might be moved to re-examine his faith, if in nothing else at least in the law of probability.”

The coin tossing streak depicted provides us with a chance to consider these probabilities. Guildenstern offers among other explanations the one mathematicians and investors should favor — “a spectacular vindication of the principle that each individual coin spin individually is as likely to come down heads as tails and therefore should cause no surprise each individual time it does.” In other words, past performance is not indicative of future results.

Even so, how unlikely is a streak of this length?

The probability that a fair coin, when flipped, will turn up heads is 50 percent (and the probability of two independent events both happening is the product of their individual probabilities). Thus the probability of heads turning up twice in a row is 25 percent (½ x ½), the probability of it turning up three times in a row is 12.5 percent (½ x ½ x ½) and so on. Accordingly, if we flip a coin 10 times (one “set” of ten), we would expect a set to end up with 10 heads in a row only once in every 1,024 sets, since (½)^10 = 1/1,024.

Rosencrantz and Guildenstern got heads more than 100 consecutive times. The chance of that happening is (½)^100 ≈ 7.9 x 10^-31. In words, we should expect it to happen about once in every 1.3 million million million million million (that’s 13 with 29 zeros after it) sets. By comparison, the universe is about 13.8 billion years old, in which time only about 4 x 10^17 seconds have elapsed. Looked at another way, if every person who ever lived (around 110 billion of us) had flipped a 100-coin set simultaneously every second since the beginning of the universe, we should still expect never to have seen a set of 100 heads – the expected number of such streaks works out to only about 0.04.
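Those numbers can be checked exactly with integer arithmetic. In the sketch below, the population and universe-age figures are rough estimates, and the expected number of 100-heads streaks across all of human history works out to a small fraction of one:

```python
from fractions import Fraction

# Exact probability of 100 heads in a row with a fair coin.
p_streak = Fraction(1, 2) ** 100
print(float(p_streak))                 # ≈ 7.9e-31

# Rough estimates of the same scale used in the text.
people_ever = 110e9                    # ~110 billion humans have ever lived
universe_seconds = 13.8e9 * 3.156e7    # ~13.8 billion years, in seconds

# One 100-coin set per person per second since the Big Bang.
trials = people_ever * universe_seconds
expected_streaks = trials * float(p_streak)
print(f"expected all-heads sets: {expected_streaks:.3f}")   # ≈ 0.04
```

Even granting every person who ever lived a coin-flipping session every second of cosmic history, the expected count of 100-heads streaks is roughly one in twenty-five: it would almost certainly never have happened.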

If anything like that had happened to you (especially on a bet), you’d agree with Nassim Taleb that the probabilities favor a loaded coin. But then again, while 100 straight heads is less probable than 99, which is less probable than 98, and so on, any exact sequence of tosses is precisely as likely (actually, unlikely) as 100 heads in a row: (½)^100. We notice the unlikelihood of 100 in a row because of the pattern, and we are pattern-seeking creatures. More “normal” combinations look random and thus expected. We don’t see them as noteworthy. Looked at another way, if one “winner” is to be selected from a stadium of 100,000 people, each person has a 1 in 100,000 chance of winning. But we aren’t at all surprised when someone does win, even though the individual winner is shocked.

The obvious conclusion: The highly improbable happens all the time. In fact, much of what happens is highly improbable. This math explains why we shouldn’t be surprised when the market remains “irrational” far longer than seems possible. But we are. The world is far more random than we’d like to think. Indeed, the world is far more random than we can imagine.

Even worse, our economic and market models typically assume a “mild randomness” of market fluctuations. In reality, what visionary mathematician Benoît Mandelbrot calls “wild randomness” prevails: risk is concentrated in a few rare, extreme and hard (perhaps impossible) to predict market events. The fractal mathematics that he invented allow us a glimpse at the hidden sources of apparent disorder, the order behind monstrous chaos. However, as Mandelbrot is careful to emphasize, it is empty hubris to think that we can somehow master market volatility. When one looks closely at financial-market data, seemingly unexplained accidents routinely appear (as outlined above).

The financial markets are inherently dangerous places to be, Mandelbrot stresses. “By drawing your attention to the dangers, I will not make you rich, but I could help you avoid bankruptcy. I am a doomsday prophet — I promise more blood and tears than windfall profits.” Yet despite those warnings, we continue to search (largely in vain) for methods to the madness.

In what is now a ubiquitous concept, a “black swan” is an extreme event – good or bad – that lies beyond the realm of our normal expectations and which has enormous consequences (e.g., Donald Rumsfeld’s “unknown unknowns”). It is by definition an outlier. Examples include the rise of Hitler, winning the lottery, the fall of the Berlin Wall and the ultimate demise of the Soviet bloc, the development of Viagra (which was originally designed to treat hypertension before a surprising side effect was discovered) and of course the 9/11 atrocities.

As Nassim Taleb famously pointed out in his terrific book outlining the idea, most people (at least in the northern hemisphere) expect all swans to be white because that is consistent with their personal experience. Thus a black swan (native to Australia) is necessarily a surprise. Black swans also have extreme effects, both positive and negative. Even though I think that Taleb overstates their overall significance somewhat, just a few explain a surprising amount of our history, from the success of some ideas and discoveries to events in our personal lives. Moreover, their influence seems to have grown beginning in the 20th century (on account of globalization and growing interconnectedness), while ordinary events — the ones we typically study, discuss and learn about in history books or from the news — seem increasingly inconsequential.

For the purposes considered here, the key element of black swan events is their unpredictability. Yet we still try to predict them. We often insist upon it.

I suspect that our insistence on trying to do what cannot be done is the natural result of our constant search for meaning in an environment where noise is everywhere and signal vanishingly hard to detect. Randomness is dismayingly difficult for us to deal with too. We are meaning-makers at every level and in nearly every situation. Even so, as ever, information is cheap and meaning is expensive.

Of course, having knowledge of the obstacles we face doesn’t always help. We humans don’t want accuracy nearly so much as we want reassurance. As reported by Jason Zweig (and a story told by the late, great Peter Bernstein) in The Wall Street Journal, the Nobel laureate Kenneth Arrow did a tour of duty as a weather forecaster for the U.S. Air Force during World War II. Ordered to evaluate mathematical models for predicting the weather one month ahead, he found that they were worthless. When he reported his findings, his superiors sent back another order: “The Commanding General is well aware that the forecasts are no good. However, he needs them for planning purposes.”

Luck (randomness) is a huge factor in investment returns, irrespective of manager. But we like to think that intelligence and effort can readily overcome the vagaries of the markets. That’s largely because we are addicted to certainty. That explains, for example, the popularity of the color-by-numbers television procedural, wherein the crime is neatly and definitively solved and all issues related thereto resolved in an hour (nearly a quarter of which is populated with commercials). As television critic Andy Greenwald explains, these crime dramas are “selling the gruesomeness of crime and the subsequent, reassuring tidiness of its implausibly quick resolution.”

Even after 100 heads in a row, the odds of the next toss being heads remain one-in-two (the “gambler’s fallacy” is committed when one assumes that a departure from what occurs on average or in the long-term will be corrected in the short-term). Still, we look for patterns (“shiny objects”) to convince ourselves that we have found a “secret sauce” that justifies our making big bets on less likely outcomes. In this regard, we are dumber than rats – literally.

In numerous studies (most prominently those by Edwards and Estes, as reported by Philip Tetlock in Expert Political Judgment), the stated task was predicting which side of a “T-maze” holds food for the subject rat. Unbeknownst both to observers and the rat, the maze was rigged such that the food was randomly placed (without a pattern), but 60 percent of the time on one side and 40 percent of the time on the other.

The rat quickly “gets it,” waits at the “60 percent side” every time, and is thus correct 60 percent of the time. Meanwhile, the human observers keep looking for patterns and choose sides in rough proportion to recent results (recency bias). As a consequence, the humans are right only 52 percent of the time – they are (as we are, in this respect at least) much dumber than rats. Overall, we insist on rejecting probabilistic strategies that accept the inevitability of randomness and error. Yet our conclusions need to be consistent with and supported by the data, no matter how bizarre the numbers or how long the streak.
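The gap between the rat and the humans is pure arithmetic: always choosing the 60 percent side wins 60 percent of the time, while “matching” – guessing each side in proportion to its frequency – wins only 0.6 x 0.6 + 0.4 x 0.4 = 52 percent of the time. The simulation below (a hypothetical sketch, not the actual Edwards/Estes protocol) bears this out.

```python
import random

random.seed(42)
TRIALS = 100_000

# Food appears on the left 60% of the time, at random (no pattern).
food_left = [random.random() < 0.6 for _ in range(TRIALS)]

# Rat's strategy: always pick the more frequent side.
rat_correct = sum(food_left) / TRIALS

# Human "probability matching": guess left 60% of the time, right 40%.
human_correct = sum(
    (random.random() < 0.6) == outcome for outcome in food_left
) / TRIALS

print(f"rat (always left):  {rat_correct:.1%}")
print(f"human (matching):   {human_correct:.1%}")
```

The rat’s simple strategy converges on 60 percent while the pattern-hunting human converges on 52 percent, no matter how many trials you run.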

Even 100 in a row.

__________

1 Scientists at the RAND Corporation have developed a set of practical principles for coping with complexity and uncertainty — to give us a chance to learn, to adapt, and to shape the future to our liking. Simplicity and flexibility are key. Complexity is not.