Complexity economics as an early-warning system: a proposal

Abstract

Economics has largely failed to predict global financial crises. In an increasingly globalized world, the consequences of that failure are likely to become more severe. The present article attributes much of economics’ malaise to the traditional and continuing use of equilibrium models. The unifying assumption of the equilibrium viewpoint has been that the components of an economy will persist in a state of balance unless perturbed by external factors. This article challenges that assumption, proposing instead that economies are typically out of balance, and that equilibrium is transitory. That single changed assumption, central to complexity economics, entails additional changed assumptions regarding policymaking. Global economic growth will be most effectively managed, and crises most effectively avoided, when economists deploy dynamic models which address the immediate future. As a programmatic example, Boolean Network (BN) modeling of global economic criticality utilizing the Lempel-Ziv (LZ) complexity metric is proposed as an Early Warning System (EWS) useful in decision-making and forecasting economic crises. The new viewpoint, if adopted, would likely present political challenges, which are evaluated at some length.

Posted for comments on 11 Jan 2016, 2:38 pm.

Comments (4)

I find myself in the odd situation of strongly supporting Wallace’s sentiments in this paper, but scratching my head as to the particulars of his execution.

Essentially, Wallace makes three points: 1) that status-quo ‘equilibrium economics’ has been an unsatisfactory modelling framework to explain, or predict, transient events such as the recent global financial crisis (or worse, as Wallace suggests, applying the status-quo when economic systems may actually never leave transient trajectories); 2) that tuned criticality (not Bak’s self-organised criticality, SOC) is actually a blessing as it represents the best balance between stasis and chaos; and 3) that Boolean network (BN) models, coupled to the Lempel-Ziv (LZ) complexity measure, will provide a strong candidate for a so-called ‘early-warning system’ (EWS) (that is, one which can provide measures of distance to ‘economic criticality’). However, point 1 is not new: other authors have made this point forcefully for a couple of decades (at least), and in Wallace’s hands the point is rather over-argued to my eyes; point 2 is highly questionable; and point 3 seems a weak candidate when compared to other tools on the table, such as agent-based modelling (ABM), to which Wallace gives only passing reference. Taken together, I would value having Wallace’s interesting and forceful prose speak strongly into this area, but would urge a more mindful exposition of the lay of the “melt-down modelling” landscape (to borrow a title from Buchanan).

I. “Beyond Equilibrium”

Whilst Wallace gives a good treatment of the potential problems of the dominant equilibrium paradigm in so much mainstream economic thought, others have tilled these soils for years. Wallace would do well to reference a little more of W Brian Arthur. For instance, Arthur’s excellent look back at the role of the Santa Fe Institute broadly, and the ‘Economy as an Evolving Complex System’ research program specifically (the first of SFI’s research programs, incidentally), founded in the ‘Summer of 1988’ (p.150), in “Complexity, the Santa Fe Approach, and non-Equilibrium Economics” (Arthur, 2010) will make for instructive reading. Therein, Arthur details three main features of the ‘Santa Fe approach’ to “complexity economics” (a term coined by Arthur, as mentioned in a footnote in the paper just mentioned — Wallace would do well to cite this as he introduces the term): I. Beyond equilibrium; II. Perpetual novelty; and III. Expectational indeterminacy and inductive behaviour. Wallace will find in section I of Arthur’s work many voices, quotes and insights familiar from his own paper. Or, to bring Wallace’s critique of non-equilibrium modelling specifically to the financial system, he would do well to reference two excellent synthesis pieces published in Nature: Mark Buchanan’s “Meltdown Modelling” (Buchanan, 2009) and J Doyne Farmer & Duncan Foley’s “The economy needs agent-based modelling” (Farmer & Foley, 2009). Both of these pieces, written as the rolling mega-cell of the crisis itself lumbered out of view, bring sharply into focus the critique that Wallace wishes to make in his first main point. Admittedly, Wallace, like Arthur, brings the force of some excellent historical literature to bear on the question, but my point is that, as a contribution to academic thought in its present form, Wallace’s paper does not, to my mind, bring anything distinctive to the table.

One point of style before moving on: I found that Wallace (unlike Arthur) has a distinctly uncharitable view of mainstream economics. Wallace calls (rational, neo-classical) economics ‘a largely unsuccessful framework’ (section 4), claims that ‘equilibrium models .. must be at last abandoned’ (section 5), and asks whether we should ‘continue adhering to [the] antiquated and dangerous fiction’ [of current economics]. Whilst I can agree that the assumption of equilibrium, together with rational optimising behaviour, has taken too strong a place in our modelling toolsets and mental models, especially when handling potentially long-transient systems such as characterise entire economies, Wallace’s language leads the reader to believe that nothing profitable has come from the entire history of ‘standard’ economic thought. I disagree, for two reasons. First, the statement is plainly untrue: in many, many (many) situations, ‘standard’, neo-classical, rational, optimising-agent assumptions make tremendous headway in interpreting human economic behaviour (and designing welfare-improving institutions). (To mention just one — Alvin E Roth’s 2012 Nobel Memorial Prize in Economic Sciences was awarded for profound insights gleaned with completely mainstream economic assumptions.) Not only this, the mathematical framework within which these assumptions are prosecuted gives such models their gold-standard intuition and interpretability, a point I return to below in reference to BNs. In short, I feel that Wallace’s language regarding mainstream economics could be tempered. Second, this kind of dismissive language is extremely unhelpful in bringing those same ‘mainstream economics’ practitioners to the discussion around a broadening (not replacement) of the fundamental modelling paradigms of the discipline. As Wallace, I’m sure, understands, the field of economics is not without its own political-economy machinations.
For myself, as someone who is both a proponent of complexity science, evolutionary and non-equilibrium models, computational techniques and all the rest, and who yet sits alongside the talented, insightful, variously brilliant ‘mainstream economists’ of my Department, I consider that charitable conversation, not unconditional dismissal, is the profitable path to the eventual acceptance of complexity economics. I urge Wallace to adopt a similar strategy.

II. “Economic criticality”

Wallace makes much of the importance of ‘Economic criticality’, which he summarises as “a poised system situated between the opposed perils of stalled and overheated economies” (section 4), and which he “designate[s] as an optimised state” (section 4). Though Wallace does not explicitly mention it, these ideas will sound very familiar to any student of complexity science acquainted with Stuart Kauffman’s work and the so-called ‘edge of chaos’ (see, e.g., Kauffman & Johnsen, 1991). The idea here is that there is a region in phase-space, between order (either stasis or periodicity) and chaos (deterministic randomness), which appears highly useful to adaptive systems as it allows the system effectively to get the best of both worlds by small departures into adjacent phases: periods of chaos providing uproar, innovation, novelty, experimentation; and following periods of stasis allowing for calm, consolidation, niche-formation and so on. By switching back and forth, hovering at a so-called ‘edge’, a complex adaptive system is perceived to have maximal flexibility, pattern-generating power, and computational grunt. This promising in-between state was called ‘the edge of chaos’ and was famously identified as ‘class IV’ behaviour in Wolfram’s cellular automata experiments (e.g. Wolfram, 1984). However, the idea is not without challenges. First, the reader should be aware that computational notions of ‘the edge’ are potentially an artefact of the system under study (see Mitchell et al., 1993). This is not to say that interesting things never happen at the phase transition in dynamical systems (far from it!), but it is a word of caution to bear in mind when parsing this notion.

Second, and more pressing for me, is that Wallace explains little of why a critical system, such as the nuclear fission reaction, requires such technical, constant and immensely careful oversight. By definition, criticality (self-organised or not) implies that perturbations to the system can have outcomes on any scale. Whilst this may be a wonderful thing for evolutionary biology, where perhaps time-scales of thousands or millions of years are available for niche-filling search (and for absorbing the mess created by some genetic-avalanche-causing mutation), I doubt that any central banker, government, or populace at large would like to think that their economy should be ‘tuned’ towards this position. In effect, this suggests that the economy is ‘optimised’ when a small fluctuation yesterday, which led to little more than a $10 parcel of shares being sold, will today, by an identical fluctuation, cause 15% of the market’s value to be wiped out, triggering the kind of global catastrophe we saw prevail in 2007-2008. Is this an economy that feels ‘optimised’? The central paper Wallace uses to link this optimal critical state of affairs to economics is Torres-Sosa et al. (2012), which is entirely biological in nature, applying to gene-regulatory networks. Granted, as Alfred Marshall noted, ‘the Mecca of the economist lies in Economic Biology’, and I am fully invested in bringing biological concepts to economics; however, it rests with Wallace to convince us further that gene-regulatory networks are sufficiently ‘like’ economic networks to warrant bringing such a controversial concept as ‘optimal’ economic criticality across domains.

More of an issue is that Wallace’s proposal supposes that ‘policymakers or agents’ (section 2), carrying out ‘multilateral tuning .. on an increasingly frequent basis’, perhaps through something called ‘fast diplomacy’, would be capable of ‘sustaining criticality in global economic networks’ (section 4). Again, this is a strange move. Wallace is firm that he is /not/ proposing a self-organised-style criticality (despite the central theme of the key paper (Torres-Sosa et al., 2012) being precisely that the criticality there was ‘emergent’); rather, he now introduces the central controllers who will ‘tune’ the economic system to be ‘poised’ at economic criticality. Not only do I seriously wonder if criticality is anything like ‘optimised’ for the ‘global economic system’, I am thoroughly unconvinced that any agent, or agents, will ever have the kind of power over the system needed to ‘sustain criticality’. Wallace appears to miss the full force of phase-transitions — they are inherently unstable. The tendency is to run away (fast) into one of the two or three surrounding phases. Tuning such a system, like the nuclear fission reaction, to preserve the critical state is surely one of the high water marks of mankind’s technological powers. And there, every variable, every core, every reaction is measured with exacting precision. We have absolutely nothing like this level of information on the vast, messy, organic ecosystem we call the global economic system. Wallace’s grand notion of control departs, to my mind, from the fundamental Hayekian roots of complexity economics: no intelligence, no computer, not even a machine-augmented future human will be able to stand in for the supreme algorithmic power of the disaggregated, autonomous market system (Hayek, 1974). Even if we grant the optimised state of economic criticality, ‘tuning’ the system to sustain this state is something I consider, like Hayek, impossible.
Taken together, I look forward to Wallace expanding this section to convince the reader of: a) the motivational connection from emergent criticality in gene-regulatory networks; and b) the feasibility of ‘tuning’ the economic system.

III. Why not ABMs?

Finally, the main course of Wallace’s work is to propose a novel way of measuring the departure of the economic system from this state of economic criticality. His proposal involves first creating an ensemble of abstract BNs with varying characteristics along a {stasis — chaos} interval and then producing probability distributions of their statewise LZ complexity, as a baseline. Then, a further set of BNs would be tuned to real, ‘binarised’, economic, cultural, and political data, and the corresponding probability distributions of LZ complexity collected. Finally, the latter distributions would be compared, with a suitable distance metric, to the stasis, edge, and chaos ensemble distributions to identify whether the global economic system is close to, near, or far (or very far?) from this optimised state of economic criticality. I won’t get into the details of BNs, LZ complexity and so on — the approach seems to me /technically/ feasible (though my mind goes quickly to the mundane scientific concerns of calibration, validation and so on …).
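For readers unfamiliar with the machinery, the baseline-ensemble step is easy to prototype in miniature. The sketch below is my own construction, not anything from Wallace's article: it simulates Kauffman-style random Boolean networks at different connectivities K (in the classic NK picture, K=1 is ordered, K=2 near-critical, K=4 chaotic), serialises one node's trajectory as a binary string, and scores it with an LZ78-style phrase count as a simple stand-in for the Lempel-Ziv complexity measure (the original 1976 LZ76 parse differs in detail). All function names and parameter values here are illustrative assumptions.

```python
import random
from itertools import product

def lz_phrase_count(s):
    """Count distinct phrases in an LZ78-style parse of a binary string.
    A simple proxy for Lempel-Ziv complexity: periodic strings score
    low, incompressible strings score high."""
    phrases, w = set(), ""
    for ch in s:
        w += ch
        if w not in phrases:
            phrases.add(w)
            w = ""
    return len(phrases) + (1 if w else 0)

def random_bn(n, k, rng):
    """Kauffman NK random Boolean network: each of n nodes reads k
    randomly chosen inputs through a random Boolean function."""
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [{bits: rng.randint(0, 1) for bits in product((0, 1), repeat=k)}
              for _ in range(n)]
    def step(state):
        return tuple(tables[i][tuple(state[j] for j in inputs[i])]
                     for i in range(n))
    return step

def trajectory_complexity(n, k, steps, rng):
    """Run one random BN from a random initial state and LZ-score
    the binary trajectory of node 0."""
    step = random_bn(n, k, rng)
    state = tuple(rng.randint(0, 1) for _ in range(n))
    bits = []
    for _ in range(steps):
        state = step(state)
        bits.append(str(state[0]))
    return lz_phrase_count("".join(bits))

if __name__ == "__main__":
    rng = random.Random(42)
    for k in (1, 2, 4):  # ordered, near-critical, chaotic (in NK folklore)
        scores = [trajectory_complexity(20, k, 200, rng) for _ in range(30)]
        print(k, sum(scores) / len(scores))  # mean LZ score rises with k
```

An EWS along Wallace's lines would then compare the distribution of such scores for BNs fitted to 'binarised' real-world data against baseline ensembles like these.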

However, I wonder why Wallace makes only passing reference to agent-based modelling (ABM). In the text, Wallace suggests that ABMs are potentially useful at a granular level, but that higher-order modelling such as with BNs is an appropriate step forward. No doubt, scaling considerations are important, and different approaches at different granularities are going to yield complementary benefits to the scientific community. However, BNs are black-box models — one cannot scrutinise a BN of the kind Wallace will need to mimic global economic dynamics and draw any kind of intuition from it. This is the well-known compromise anyone working with ‘machine learning’ takes on board in exchange for potentially vast predictive power. By contrast, ABMs of a certain kind seem well suited, with development, to the kind of ‘early warning system’ Wallace hopes for. For example, the models of Giovanni Dosi and co-workers (Dosi et al., 2010 & 2013) are prime examples of ABMs which deliver non-equilibrium, realistic economic properties, but include scrutinisable industrial, government and other modules. This is white-box modelling with considerable promise. Closer to home for Wallace’s interest, John Geanakoplos and co-workers (2012) show off the usefulness of ABM analysis when applied to the US housing bubble of ’97–’07. Or alternatively, over in the world of advanced climate-change economics modelling, ABMs have been proposed as the state-of-the-art candidate for taking the field beyond computable general equilibrium (CGE) style models, yet retaining insight (Farmer et al., 2015).

In any case, Wallace’s stumping for black-box BNs as his predictive device boils down to suggesting little more than a kind of ‘doomsday clock’ for economic armageddon. I’d thus like to see Wallace work much harder at the motivation for his suggested approach, especially why contemporary methods such as ABMs or even CGE models (and their derivatives) will not work.

To return to where I started, lest the reader misunderstand, I’m in steady agreement with much of Wallace’s contribution. Indeed, it is because of this that I want to see the work developed, refined and deepened along the lines I suggest. I am not here to disagree with Wallace, but rather, to see the thrust of Wallace’s ideas succeed.

I would like to begin by thanking Simon Angus for a detailed and thoughtful critique of my submitted article “Complexity economics as an early-warning system: a proposal.”

Having carefully read—and re-read—Angus’ commentary, I find myself in agreement with all of his methodological recommendations, and will revise the article accordingly. Indeed, the only shortcoming I find in his outstanding review is his evident reluctance to evaluate the admittedly formidable policy implications of a computationally-based Early Warning System (EWS) with the same degree of scrutiny that he has applied to the modeling issues. While he does in fact note that the EWS I have outlined—which utilizes Boolean networks (BNs) and the Lempel-Ziv (LZ) complexity metric—would be confronted with a challenging level of algorithmic complexity when applied to the “vast, messy, organic ecosystem” of the global economy, he does not consider the political dimensions of attempting to deploy this—or any—computational approach (a topic discussed at some length in the article): e.g., the accelerating pace of economic diplomacy, and the problem of transparency. Clearly, even with Agent-Based Modeling (ABM), these issues will not go away.

Regarding Angus’ other points, I am in complete agreement, so these can be considered summarily. Thus, he has noted—and I plead guilty—my “uncharitable view of mainstream economics” which has been expressed in “dismissive language”. Although Joseph Stiglitz (2010) has, in my view, been just as abrasive, accusing neoclassical economists of believing that there “[is] no such thing as unemployment”, this is no excuse. Following Angus’ counsel, I will moderate the tone of the article. His methodological recommendations include placing greater emphasis on the Santa Fe Approach, and in particular, the contributions of W. Brian Arthur. Angus notes that Arthur’s writings would be especially valuable in describing the basic conceptual framework of complexity economics. I agree, and will make the change. Finally, I concur with his view that the use of ABM approaches is preferable to BNs linked to the LZ metric. He makes a good point that the former have already been used in studies of the US housing bubble, and the economics of climate change, while the BN-LZ method has thus far been confined to molecular cell biology.

In short, I deeply appreciate the time and energy that Simon Angus has devoted to evaluating my work. I agree with his recommended changes. Finally, I am pleased that he is “in steady agreement” with the article’s general outlook, and wishes to “see the thrust of [my] ideas succeed.” In the event that this proves to be the case, it will be due in no small measure to the excellent analysis which he has provided.

References

Stiglitz, J. (2010). Freefall: America, free markets, and the sinking of the world economy. New York: W.W. Norton.

I enjoyed reading this paper and I think it is a very useful review of a potential method for reducing the incidence of nasty economic surprises. I would happily add the paper’s suggestions to the approaches that Neil Kay and I considered in a paper we wrote over thirty years ago on how not to end up being nihilistic if one takes the potential for surprise seriously (Earl and Kay, 1985). The techniques discussed in the paper will be somewhat familiar to many evolutionary economists. However, since many heterodox economists tend to stick within their own tribal areas, what happens in evolutionary economics, where complex systems thinking is a key ingredient, is not as widely known as it should be amongst those open to alternative approaches. The review that the paper offers is thus valuable.

I was surprised that, when talking about different ways of viewing the notion of equilibrium, the paper did not try to introduce readers to the ‘punctuated equilibria’ perspective and relevant sources in the literature. Gersick (1991) would be a useful starting point: I originally came across this when viewing the evolution of firms as if they operated rather like scientific disciplines that sometimes went through revolutionary upheavals along the lines suggested by Thomas Kuhn, but during periods of normality had sets of business routines (different sets either side of a revolution) akin to Lakatosian scientific research programmes. The punctuated equilibrium perspective came to my mind more recently when reading Rajan’s (2010) book on the Global Financial Crisis. Rajan was one of the few who did anticipate the crisis. The title of his book on the crisis’s origins, and on why he expects further trouble, points towards an analogy that the paper might use in getting across its message, namely how pressures build up along established fault lines and abruptly go critical. The paper would benefit from taking account of Rajan’s book and from linking the punctuated equilibrium perspective and earthquake analogy to the notion of criticality. If we’re thinking of ‘trouble brewing’, another useful analogy is that of how the state of liquid in a pot changes when it boils, but we get early warning of this as bubbles start to appear.

I suspect that my openness to the idea of Boolean networks may come from having read Kornai’s book Anti-Equilibrium nearly forty years ago. It was there that I was introduced to evolutionary and systems thinking and to the idea of thresholds of response as an alternative to the mainstream economics view of continuous marginal adjustments. The BN framework, in which something is either off or on, meshes very neatly with this; it also lines up well with Herbert Simon’s satisficing/aspiration-levels-based view of choice. But we can also emphasize how key institutional devices have on/off binary aspects. Most worthy of mention in relation to the financial crisis is the notion of bankruptcy versus solvency, which are very different states to be in with respect to what the rules of the game allow one to do. I can’t remember the exact quotation, but it was clear to me that the leading general equilibrium theorist Frank Hahn recognized this when he commented to the effect that the possibility of bankrupt debtors is not conducive to stable behavioural functions. Hahn’s comment looks all the more significant if we take, as Minsky (1975) did, a systems view of the financial system and see it as a network of balance sheets that can become increasingly fragile, with one person’s/organization’s bankruptcy potentially triggering a cascade of others. Those of us who like Minsky’s work were not surprised by the Global Financial Crisis happening, even if we could not predict when it would happen, and many reserve banks have prudential supervision departments that should be trying, if they are not already, to devise early warning systems that take account of the balance sheet network structure.

I noticed that the paper does use the word ‘cascade’ when talking about financial crises. This is another useful analogy that might be worth spelling out more thoroughly, for economists are not used to thinking of the economic system as having buffer components that are akin to reservoirs behind dam walls, where nothing much may change for observers until the top of the dam wall is reached or the dam wall breaks, but beyond that point the system functions very differently if the input into the catchment continues. Within modern behavioural economics we get a variation on this idea in Roy Baumeister’s work on ego depletion, i.e., the reserves of self-control that run down as we take a succession of decisions or try to remain patient. Once we are mentally exhausted, in the sense of having our reservoirs of self-control depleted, we’re in a state which is not conducive to thinking carefully and avoiding impulsive, emotion-driven choices. Whilst I have seen Baumeister illustrate his arguments in relation to how firms such as Audi go about strategically trying to exhaust their customers to make them susceptible to expensive option packs for their cars, it is of potentially much wider significance in relation to how critical negotiation processes unfold (e.g. over foreign debt bailouts).

I suppose that a simplified early warning system with binary underpinnings is a checklist of indicators that may signal criticality: with ego depletion, tell-tale signs might be, say, sweat on the brow or small cracks in composure. Bankers are used to making sovereign loans and smaller loans via checklists, just as prudential supervision bodies may use checklists as early warning tools. It may be useful to comment in the paper on how one might go from the BN analysis to such checklists, and on the limitations of checklists as seen from a systems perspective that recognizes that it is the conjunction of conditions that matters for the system to go critical.

Finally, I was surprised that the notion of ‘modularity’, or Herbert Simon’s (1962) notion of ‘decomposability’, did not figure in the paper. This was one of the areas that Kay and I focused on in our 1985 paper. The vulnerability of a network will depend on the structure of its connections, so policymakers who are getting worried about it going critical may wish to consider steps to make the system’s architecture more modular: i.e., rather than policymakers focusing on getting together in a concerted effort to coordinate tweaking of policy settings to stop the system going into meltdown, should they not be taking the network perspective in mind at an earlier stage and seeing their role as including that of system architects?

I thank Peter Earl for a careful and valuable review of my submitted article “Complexity economics as an early-warning system: A proposal.” I am pleased that he found the review useful, and that he would “happily add the paper’s suggestions” to the approaches that he and N.M. Kay presented back in 1985. I will briefly address what I believe are the two major aspects of Peter Earl’s review—the historical anticipations of Boolean-network (BN) approaches, and the relevance of Punctuated Equilibrium Theory (PET) for economic modeling—and then examine the possibility of a rapprochement between Peter Earl’s viewpoint and the quite different response presented earlier by Simon Angus.

Much of the early systems thinking which prefigured BNs, as well as Agent-Based Modeling (ABM) and related approaches, is familiar to me as a biological anthropologist. As Earl correctly notes, systems thinking has long been incorporated into evolutionary biology and related disciplines, and is now exerting an influence on heterodox economics. Accordingly, I concur with his suggestion to acknowledge, in my revision, Herbert Simon’s early contributions; in the bargain, I will also pay homage to Ludwig von Bertalanffy and Magoroh Maruyama, both of whom developed early departures—described in those days as a “second cybernetics”—from the homeostatic frameworks of Norbert Wiener and others.

Regarding Stephen Jay Gould’s PET (Gould 1977), however, I would require more convincing that the model is nearly as valuable as Earl evidently believes it to be, especially in relation to the present global economy and the path it seems likely to follow. I believe that we are experiencing a kind of historical disjuncture between an economic trajectory comprised (over the last 400 years) of long periods of stasis interrupted by turbulent episodes—to which PET would apply—and a strikingly different pattern in which booms and crashes will be the norm, and dramatically different approaches in theory and policy will be required. No one can prove the future, but—to borrow from Raghuram Rajan (cited in Earl, above)—some of the “fault lines” are clear enough. In most of the developed nations, financialization—the “new world wealth machine” (Edmunds 1996)—is playing an increasingly significant role in deepening socioeconomic inequality and exacerbating volatility disproportionate to its small sector size in relation to total GDP. (For a valuable demonstration of financialization’s negative effects on inequality in OECD countries, 1995-2007, see Kus, 2012). These negative effects, in a rapidly growing number of countries, appear to be amplified by high-frequency trading (HFT)—perhaps the most powerful recent instance of the technologization of capital—in which totally electronic platforms driven by competing algorithms execute trades in microseconds; by contrast, computational methods deployed by the NYSE and other national and global exchanges occur in milliseconds (three orders of magnitude slower), or roughly at the speed of an eye-blink (Adrian 2016). Why does this matter? Perhaps events will show that it doesn’t (and indeed there are more than a few who view HFT benignly). But in a provocative recent study of HFT in relation to economic instability, Johnson et al. 
(2013) tracked a millisecond-resolution price stream for multiple stock exchanges from 2006 to 2011; their analysis revealed 18,520 “ultrafast extreme events” which, importantly, were “entwined across timescales from milliseconds to months” with “smaller global instabilities”. As the platform becomes more sophisticated, and HFT approaches 70 percent of the speed of light, it seems likely that the disruptive coupling between economic time-scales may become more frequent and severe. In addition, as HFT is increasingly adopted by developing countries—it is already in use in the BRICS nations, and being adopted by Turkey and Mexico—the prospects of volatility, contagion, and inequality appear even more realistic (Plummer 2012). Returning, therefore, to PET, and by implication, similar models incorporating protracted stability, PET’s continued economic relevance seems to me, at the least, unlikely. As Johnson et al. (2013) have noted, new approaches will soon be needed to explore ultrafast dynamics before and after economic crises.

Turning, thus, to the matter of models, and contrasting Peter Earl’s comment with that of Simon Angus, their divergence of views is remarkable. Peter Earl is sympathetic with my policy suggestion that the Lempel-Ziv (LZ) criticality metric (Lempel and Ziv 1976)—which originated in computer science, and has since been widely applied in molecular cell biology—could be used in concert with BN economic modeling as an Early Warning System (EWS), potentially “reducing the incidence of nasty economic surprises”. By contrast, Simon Angus believes that such a policy would be futile (and, thus, its theoretical rationale sterile) because artificial criticality regimes, such as those of nuclear reactors, are vulnerable to slight perturbations and are hence difficult to sustain; implicitly but significantly, Angus draws a larger contrast here between natural and artificial systems—or more exactly, between those systems which were fashioned incrementally by natural selection over millions of years, and those designed by human ingenuity within historical time-frames—and suggests that we must be cautious when importing economic models from evolutionary biology. Methodologically, Angus also differs markedly from Earl, preferring the descriptive flexibility of Agent-Based Models (ABMs) to the Spartan simplicity of BNs.

Evaluating the two responses, I believe there is a rapprochement, theoretically and methodologically. Perhaps it is true, as Angus maintains, that we cannot sustain criticality. Nonetheless, in view of the threats to the global economy described above, we must at least be able to identify it—to recognize when the economy is on the brink of collapse—as a basis for policy decisions, rather than continuing the costly tradition of retrospectives on “nasty economic surprises” (Earl). Richard Bookstaber (2012) has captured the issue nicely: designing an EWS, he states, is a task “fraught with uncertainty” because of unprecedented actions (recall, as a single example, the credit default swaps that preceded the 2008 Great Recession) which are difficult to anticipate. However, as opposed to identifying a crisis well in advance, it should be possible to develop novel methods that “trace the path a shock follows as it propagates through the financial system”. Importantly, economic shocks could also be artificially induced as a strategy of policy testing, and as an inductive basis for theory. How is this to be achieved? One possibility is a “best-of-both-worlds” approach utilizing an ABM-BN hybrid. In this architecture, each period of an ABM is output to a BN, incrementally changing the latter’s structure and yielding a portrait of propagating disorder. In addition, BN “snapshots” could be expressed as time series, thus permitting the deployment of the LZ complexity metric (Lempel and Ziv 1976). Through the use of this measure, it would be possible to quantitatively express the extent to which the economy is approaching criticality, in essence mapping a fault line shortly before it becomes a collapse.
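To make the final comparison step concrete, here is a minimal sketch, entirely my own illustration rather than anything specified in the article: given baseline distributions of LZ scores for ordered, critical, and chaotic BN ensembles, an observed sample of LZ scores is labelled with the nearest regime using a Kolmogorov-Smirnov-style distance between empirical distributions. The baseline numbers are invented for illustration only.

```python
def ecdf_distance(a, b):
    """Kolmogorov-Smirnov-style statistic: the largest gap between
    the empirical CDFs of two samples of LZ scores."""
    xs = sorted(set(a) | set(b))
    def cdf(sample, x):
        return sum(v <= x for v in sample) / len(sample)
    return max(abs(cdf(a, x) - cdf(b, x)) for x in xs)

def nearest_regime(observed, baselines):
    """Label the observed LZ-score sample with the closest baseline regime."""
    return min(baselines, key=lambda name: ecdf_distance(observed, baselines[name]))

# Invented baseline LZ-score distributions for three BN ensembles.
baselines = {
    "ordered":  [18, 19, 20, 20, 21],
    "critical": [24, 25, 26, 26, 27],
    "chaotic":  [33, 34, 35, 36, 36],
}
print(nearest_regime([25, 26, 24, 27, 25], baselines))  # prints "critical"
```

A real EWS would of course need calibrated baselines and a statistically principled distance, but the "distance to criticality" readout proposed here reduces to exactly this kind of comparison.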