WHEN WE CANNOT PREDICT

About a year ago, on Wednesday, April 14th, I was on my way from JFK to London when the pilot announced a slight delay into Heathrow in order to avoid the ash cloud from the Icelandic volcanic eruption. This was the first time I had paid any attention to the subject. But once in London, it was the only subject anybody talked about for a week.

"Something is going on here that requires serious thinking," I wrote on these pages. "We've had earthquakes before, and we've had plane stoppages, but nothing like the continuing effects of the ash cloud." The result was an Edge Special event on "The Ash Cloud". I asked the following question:

"What do the psychologists have to say about the way the decision-makers have acted? What have the behavioral economists learned from this? I am interested in hearing from the earth and atmospheric scientists, the aeronautical engineers, the physicists. What can science bring to the table?"

It's already clear that the earthquake and tsunami that hit northern Japan are the latest tragic example of our inability to predict when it matters most.

What can the Edge community bring to the table?

"Risks are always interesting," writes George Dyson, "especially in this case where you have such a mix of probabilities — the earthquake/tsunami that most agree was unpredictable, if inevitable, and the nuclear power plant that some people think was entirely safe, and some people believe was entirely unsafe. So you need to frame this in terms of risk, without getting bogged down in the debate about nuclear power, which may go on forever, certainly long enough to drive people away from Edge."

"The question of preference for different kinds of fate — death by drowning vs death by radiation; death by enemy fire vs friendly fire, etc; tolerance for automobile fatalities because they are "accidents" — is at the heart of this, and you have a lot of people at hand with something to say about that."

To start things off, Edge asked Bruce Parker, Former Chief Scientist of the National Ocean Service in NOAA, and author of The Power of the Sea, to write the lede essay on risk in light of the northern Japan earthquake and tsunami.

BRUCE PARKER
Physical Oceanographer, Stevens Institute of Technology; Former Chief Scientist of the National Ocean Service in NOAA and Director of the Coast Survey Development Laboratory; Author, The Power of the Sea: Tsunamis, Storm Surges, Rogue Waves, and Our Quest to Predict Disasters

WHEN WE CANNOT PREDICT

Prediction is the very essence of science. We judge the correctness of a scientific theory by its ability to predict specific events. And from a more real-world, practical point of view, the primary purpose of science itself is to achieve a prediction capability that will give us some control over our lives and some protection from the environment around us. To avoid the dangers of the world we must be able to predict where and especially when they will happen.

While the scientific method may lead us to a reasonably thorough understanding of some phenomenon, unfortunately that does not always translate into an accurate practical prediction capability that, for example, might help us avoid being killed by a natural disaster. When that is the case, we then find ourselves talking about risk, the likelihood that some dangerous event will take place, even though we do not know when. Risk assessment is necessitated by an inability to predict. That inability to predict may come from some deficiency in our knowledge, or it may be the result of a great complexity inherent in the phenomenon (for example, we may not have high-enough-resolution data to represent it, or the process may have a chaotic component that keeps us from determining exactly when it will occur). We are then left only with probabilities.

Along the way to understanding a natural phenomenon we, of course, develop and employ various types of technology. Such technology is typically used to measure the phenomenon and thus provide the data that will stimulate the analytical human mind to develop appropriate scientific theories. More data are then used to test those theories. Ultimately technology will also (hopefully) take the form of a warning system, a computer model (representing an accepted scientific theory) that uses real-time data. In the meantime, other technology will improve the methods of protection against such disasters.

The tsunami that struck northern Japan (where the death toll will likely surpass 25,000) is the latest tragic example of our inability to predict when it matters most. The tsunami's arrival at coasts more distant than Japan was accurately predicted by hydrodynamic computer models, once the location of the submarine earthquake was determined and the generation of a tsunami was confirmed by real-time data from DART buoys and tide gauges. (Such confirmation is required because most submarine earthquakes do not produce tsunamis, and the numerous false alarms that would result from warnings based only on the occurrence of a submarine earthquake would make the warnings useless.) But when the epicenter is so close to the coast that the tsunami arrives only 30 minutes after the earthquake, the only possible warning is a receding ocean prior to the tsunami or the earthquake itself (when a coast shakes for a long time, one is wise to play it safe and act as if a tsunami will be coming very soon). The Japanese are the most tsunami-aware people on Earth, and they did immediately run to rooftops and inland. But 30 minutes is not a very long time. (It was even worse in northwest Sumatra in 2004, when an even larger tsunami struck only 15 minutes after the initial earthquake.)

The only way that a more advanced tsunami warning could have been given is if the earthquake itself could have been predicted. But we cannot predict when an earthquake will strike, not the day, or the month, or the year, or even the decade. All we can do is assign a risk to particular regions. Japan, with its numerous tectonic plates butting up against each other, is known to be a high risk area; many earthquakes and tsunamis have occurred there before. As a result, some sea walls had been built and some buildings had been made stronger. Technology contributed to those defenses. But they were not enough, and in fact, could never be enough, without huge sums of money being spent to build 40-foot sea walls along almost the entire Japanese coastline and to make all buildings capable of surviving the very rare 9.0 earthquake.

More effective would be to pour a small fraction of that cost into additional earthquake prediction research (an earthquake prediction capability would have saved even more lives in Haiti). Given the great complexity of the worldwide tectonic environment, understanding what makes two tectonic plates suddenly release each other, much less being able to predict when earthquakes will occur using a detailed geophysical model, is still very far off. But using technology to continuously measure the various signals that the solid Earth provides, until we find signals that come only in advance of an earthquake, may be possible a lot sooner. Past accomplishments justify optimism that human intellect will someday find a way to predict when an earthquake will happen. But we need to speed up that process, because in the future more lives will be at stake. Whatever additional funds are required to make that happen would certainly be money well spent.

The magnitude of the earthquake that hit North-East Japan was 9.0, the fourth largest on record in modern times. The earthquake and tsunami risk was on the level of once in a thousand years for this region: the last comparable earthquake there occurred in 869, according to historical and archeological evidence. In a shocking and totally unpredictable way, this disaster exhibited the hidden everyday culture of Japanese society. The dignified attitudes of survivors and the efficiency of their mutual help networks indicate the high level of civility ingrained in the North-East of Japan. This tangible cultural capital will be an enormously positive asset for the social and economic recovery of the region.

In stark contrast, the very serious problems with the nuclear power plant in Fukushima have unearthed another, more problematic aspect of Japanese culture: its remarkable capacity for political denial in the face of scientific facts. I am gravely concerned that this disaster reveals the long-term practice of a hidden organizational culture surrounding the use of nuclear technology, with a completely dysfunctional regulatory system. It is truly alarming to see the government projecting to the public an imaginary third political reality, in which the first reality of the scientific radiation measurements is covered up in order to highlight the second reality of the social resilience of the Japanese public.

Risk-taking is an integral part of our everyday life. Constructing and operating huge public projects involves the issue of appropriate levels of risk and budget considerations. But when combined with an organizational culture of structural secrecy and organizational conformity, it has grave consequences, in spite of all the courage of the individual engineers, workers, and firefighters now toiling around the clock in life-threatening circumstances at Fukushima Daiichi.

I am writing a longer essay with Junichiro Makino, professor at Tokyo Institute of Technology, who has been expressing his concerns on this issue in his Japanese blog since March 12th. His main point is that the situation is much closer to the Chernobyl disaster than the official, understated governmental interpretation would have us believe. After two weeks, the ongoing problems at the Fukushima reactors are in line with Prof. Makino's dire predictions. Here are some salient points.

Already on March 17th, six days after the earthquake, following explosions at Fukushima's Daiichi nuclear power plant, the radiation level in the surroundings was comparable to that of Chernobyl. In one small village, Namie-machi, 30 km from the plant, the radiation level was measured at 170 microsievert/hour. This value corresponds to an amount of I-131 of 1.3x10^14 Bq/km^2. This probably corresponds to an amount of Cs-137 of 3x10^12 Bq/km^2, roughly the same value as what was observed 30 km from Chernobyl. This measurement was taken by a team reportedly led by Vice Minister Kan Suzuki of the Ministry of Education (MEXT).

Thus, already six days after the accident, it should have been clear that the amount of radioactivity released was comparable to that of Chernobyl. Even so, as of March 25th, representatives of Tokyo Electric and NISA (the Nuclear and Industrial Safety Agency) still stick to their estimate that the accident is at INES level 5, corresponding to 10,000 times less release of radioactivity than what was actually measured. This is inexcusable.

On March 22nd, the DoE released data recorded from its Aerial Monitoring System, which showed that the heavily polluted area, with more than 125 microsievert/hour, extended more than 30 km to the north-west of Fukushima Daiichi. On March 23rd, MEXT released results of their analysis of soil samples taken 40 km north-west of the reactor, which, not surprisingly, were extremely high in both I-131 and Cs-137. According to the MEXT measurements, the amount of Cs-137 is 10-20% of that of I-131. So the level of pollution by Cs-137 might be as high as 2x10^13 Bq/km^2 at 30 km distance.
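The arithmetic behind that last estimate can be made explicit. The sketch below uses only the two figures quoted in the text (the inferred I-131 deposition of 1.3x10^14 Bq/km^2 at 30 km, and MEXT's 10-20% Cs-137/I-131 activity ratio); it is a back-of-the-envelope consistency check, not a radiological model.

```python
# Back-of-the-envelope check of the deposition figures quoted above.
# Inputs are taken directly from the text; the rest is simple arithmetic.

i131_deposition = 1.3e14  # Bq/km^2, inferred at ~30 km from the plant

# MEXT soil samples put Cs-137 at roughly 10-20% of the I-131 activity.
cs137_low = i131_deposition * 0.10
cs137_high = i131_deposition * 0.20

print(f"Cs-137 deposition: {cs137_low:.1e} to {cs137_high:.1e} Bq/km^2")
```

The upper end of the range comes out at about 2.6x10^13 Bq/km^2, consistent in order of magnitude with the "as high as 2x10^13" figure quoted above.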

In conclusion, the Japanese public, and the world at large, have been confronted with different realities: MEXT has been reporting a very high level of radioactivity since March 17th, comparable to or even higher than that of Chernobyl, and yet NISA and Tokyo Electric apparently refuse to accept what is going on. The one reality is based on scientific measurements. What can we say about the other "reality"?

It might look very strange to outside observers that there can be this large a discrepancy between what is so obvious and the official view. Unfortunately, in Japanese bureaucratic systems, such a situation is quite usual. For big projects like nuclear plants, there is of course an evaluation committee consisting of both specialists and scientists from wider fields. However, scientists critical of a project are quickly replaced by others who are willing to accept a more cozy relationship with the powers that be, with the result that such committees can easily wind up with a vast majority of dangerously uncritical members. In this unfortunate symbiosis of supportive scientists and bureaucrats, companies tend to lose an objective view of reality. In the case of the Fukushima disaster, it is hard for us to know whether they have really lost their grip on reality, or whether they only pretend to have done so. In the end, though, that does not matter much: the practical result in terms of clinging to an imaginary political reality is the same.

The only surprising thing is that even at this stage they are still behaving as though they have completely lost touch with reality. They continue the illusion of a relatively small accident that can be fixed in "just a few days", totally ignoring the reality of an ongoing Chernobyl-type situation. Yes, it is true that Chernobyl started with one huge explosion that caused the main contamination. But now that the Fukushima plant is producing a similar amount of contamination over a period of two weeks, with no end in sight, it is altogether possible that over the period of March and April the total contamination will significantly exceed that of Chernobyl.

"Grande profundum est ipse homo," said Augustine of Hippo. "A human being is a vast oceanic depth" — and just as fearfully and mightily unpredictable. Augustine lived on the Mediterranean's southern shore and feared its rages, while he and his contemporaries knew that worse still lay beyond the "pillars of Hercules" at Gibraltar, out on the trackless ocean they had only heard of but never seen.

The movements of people and peoples are full of surprises. Fifty years ago, the "Middle East" was a colonial backwater to many, docile with incidental outbreaks of bad temper. To hear our media voices tell it, the region is now a landscape of religious fervor and simmering, unpredictable violence. The journalism of these last weeks has had immense difficulty seeing what really is the case, for all that we think we know already. So in ten years, will it be Islamic fervor or Jeffersonian democracy that takes the lead? Can we tell any story forward with anything like confidence?

The material future offers more certainty. Energy business and energy politics will remain powerful forces; climate change moves on a scale of its own; information networks subvert powers that have flourished by controlling what people can know. But the underlying future narrative of human and social change is the oceanic mystery — so far.

What we cannot know is how far we have yet to go in understanding, predicting, and manipulating human behavior on the large scale. Imagine us all two hundred years into the future, on a planet much hotter and more crowded than the one we have now. Will we really still be surprised by sudden movements of large populations, great shocks inflicted by minor actors, and widely fluctuating electoral flip-flops?

For now, what we can't predict is ourselves. Augustine, the great churchman, told his God that even he couldn't tell which temptation he would give in to next. That uncertainty is a large part of what it has been to be human. Will it always be that way?

McKinsey Professor, Santa Fe Institute; Co-founder and former Co-president of Prediction Company

Viewing the Nuclear Accident in Japan Through the Lens of Systemic Risk

Predicting risk might sound like an oxymoron, but it isn't: We do it every day. Everyone knows, for example, that the risk of a dangerous fall on a steep mountain trail is higher than it is on level ground. Prediction of risks is more difficult, however, when they are systemic. Systemic risks occur when individual components of a system interact and collectively generate new modes of behavior that would never occur for a single component in isolation, amplifying or generating new risks.

The recent financial crisis provides a good example. Banks normally manage risk under the assumption that the financial system will behave in the future more or less as it has in the past. Such estimates are based on historical losses. This is fine under normal circumstances. But in the recent financial crisis a small drop in housing prices triggered a chain reaction that suddenly made the financial system behave completely differently, and extrapolations of risk based on historical losses became irrelevant.

Systemic risks are hard to predict. They are inherently complex phenomena, typically involving nonlinear feedback that couples together the behavior of many individual components. Systemic risks frequently occur in systems where there are neither good models nor good measurements, where theory or simulation is impossible. They often involve modes of interaction that have not been seen before, making past experience of little value. The amplitude of the resulting problem is often far larger than previously imagined possible.

How can we anticipate and minimize systemic risk? The key general principle is stability. Systemic risks occur when bad behaviors feed back on one another, so that small problems are amplified into big ones. When things go topsy-turvy, do the problem behaviors damp out, or are they amplified? In the recent financial crisis, for example, the key problem was leverage, which amplifies both gains and losses. Leverage is good during good times, but during bad times it makes the financial system unstable.
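The amplifying effect of leverage can be seen in a few lines of arithmetic. The sketch below uses made-up but representative numbers (a 30x leverage ratio was not unusual for banks before 2008, though that specific figure is my illustrative assumption, not from the essay); funding costs and other details are deliberately ignored.

```python
# Minimal illustration (with illustrative numbers) of how leverage
# amplifies both gains and losses on a bank's own equity.

def equity_return(asset_return, leverage):
    """Return on equity for a given asset return and leverage ratio
    (leverage = assets / equity). Funding costs ignored for simplicity."""
    return asset_return * leverage

leverage = 30  # assets are 30x equity (an assumed, illustrative ratio)

for asset_move in (0.01, -0.01, -0.04):
    roe = equity_return(asset_move, leverage)
    print(f"assets move {asset_move:+.0%} -> equity moves {roe:+.0%}")

# A mere 4% drop in asset values produces a -120% return on equity:
# the bank is insolvent, and its forced sales push prices down further,
# which is exactly the destabilizing feedback described above.
```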

The recent Japanese earthquake/tsunami provides another example of how a normal risk can turn into a systemic risk. For Japan, given the history of the region, an earthquake in tandem with a tsunami might be called a normal risk. But no one realized in advance that a tsunami could destroy both the main power and the backup power of a nuclear power plant, while an earthquake could also create cracks causing a loss of coolant. The resulting nuclear catastrophe came on top of all the other damage to the infrastructure, making the nuclear crisis even harder to solve than it would have been otherwise, and the radiation leakage has made it even harder to get the infrastructure functioning again. The risks of both have been amplified.

With hindsight the consequences of a large earthquake and tsunami seem obvious, so why didn't the engineers plan for them properly? This is the usual story with systemic risk: In hindsight the problems are obvious, but somehow no one thinks them through beforehand.

As already explained, from a complex systems engineering perspective, the key principle is stability. Nuclear power generation is intrinsically unstable. If you walk away from a wind generator or a solar cell when a crisis occurs, not much happens. If you walk away from a nuclear reactor under the wrong circumstances, it can melt down. To cope with the systemic risk one needs to think through all possible scenarios. The experts might be able to plan for all the known failure modes, but it is much harder to anticipate the unknown ones.

The prognosis for nuclear accidents based on simple historical extrapolation is disturbing. After roughly 14,000 cumulative years of nuclear plant operation, we have now had three major accidents. If we ramp up nuclear power by a factor of ten, which is necessary for it to make a significant contribution to mitigating global warming, we will increase from the 442 reactors we currently have to about 5,000. Historical extrapolation predicts that we should then expect an accident of the magnitude of the current Japan disaster about once a year.
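That extrapolation is easy to reproduce. The sketch below uses only the figures stated in the essay (14,000 reactor-years, three major accidents, 442 current reactors, a ramp-up to about 5,000); it is a naive frequency estimate, not an actuarial model, which is exactly the limitation discussed next.

```python
# The historical extrapolation made explicit. All inputs come from the
# essay; this is a naive frequency estimate, not an actuarial model.

reactor_years = 14_000    # cumulative worldwide operating experience
major_accidents = 3       # major accidents observed to date
current_reactors = 442
scaled_reactors = 5_000   # ~10x ramp-up considered in the text

rate_per_reactor_year = major_accidents / reactor_years
expected_accidents_per_year = rate_per_reactor_year * scaled_reactors

print(f"{rate_per_reactor_year:.2e} major accidents per reactor-year")
print(f"~{expected_accidents_per_year:.1f} major accidents per year "
      f"at {scaled_reactors} reactors")
```

With these inputs the expected frequency comes out at just over one major accident per year, matching the "about once a year" figure above.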

But I don't trust the historical method of estimating. Three events are unlikely to properly characterize the tails of the distribution. My personal choice for a really nasty nuclear scenario goes as follows: Assume the developed world decides to ramp up nuclear power. The developing world will then demand energy independence and follow suit. For independence you need both reactors and fuel concentrators. There will be a lot of debate, but in the end the countries with stable governments will get them. With a fuel concentrator the waste products of the reactor can be used to make weapons-grade fuel, and from there making a bomb is fairly easy. Thus, if we go down the path of nuclear expansion, we should probably assume that every country in the world will eventually have the bomb. The Chernobyl disaster killed on the order of ten thousand people: a nuclear explosion could easily kill a million. So all it will take is for a "stable government" to be taken over by the wrong dictator, and we could have a nuclear disaster.

I'm not an actuary, so you shouldn't trust my estimates. To bring the actuaries into the picture, anyone who seriously advocates nuclear power should lobby to repeal the Price-Anderson Act, which requires U.S. taxpayers to shoulder the costs of a really serious accident. The fact that the industry demanded such an act suggests that they do not have confidence in their own product. If the act were repealed, we would have an idea what nuclear power really costs. As it stands, all we know is that the quoted costs are much too low.

Danger is not the only property that makes nuclear power exceptional. Even neglecting the boost in cost that would be caused by repeal of the Price-Anderson Act, the cost curve for nuclear power is remarkable. My group at the Santa Fe Institute has collected data on the cost and production of more than 100 technologies as a function of time. In contrast to all other technologies, the cost of nuclear power has roughly remained constant for 50 years, despite heavy subsidies. This cannot be blamed entirely on the cost of safety and regulation, and after Japan, is anyone really willing to say we shouldn't pay for safety? In contrast, during the same period solar power has dropped by a factor of roughly a hundred, making its current cost roughly equal to nuclear. Wind power is now significantly cheaper than nuclear. Solar will almost certainly be significantly cheaper than nuclear within a decade, roughly the time it takes to build a nuclear plant.

To properly assess systemic risks, the devil is in the details. We can't debate the risks of a technology without wading through them. But from a complex systems engineering point of view, one should beware of anything that amplifies risk. Systemic risks are difficult to predict, and the precautionary principle dictates that one should take care when faced with uncertainty.

Professor Emerita, George Mason University; Visiting Scholar, Sloan Center on Aging & Work, Boston College; Author, Composing a Further Life

There is a reason to be particularly interested in events that we know with some certainty will occur, but for which we cannot set a date. Other kinds of events, beyond natural disasters, fit this definition. Yes, "the big one" will hit California, although we do not know when. But this is true of human upheavals as well. It was clear in the early 70s that the Pahlavi regime in Iran would eventually be overthrown, but there was no way to predict when that would happen, so no one was ready.

This is not a new kind of problem — on the contrary, it is a problem that has characterized the human condition since we began to recognize sequence and to make predictions: the knowledge that each individual has that he or she will die, and the inability to know when. Except for a few eras, such as when Egyptian pharaohs prepared their tombs long before their deaths, or during the Middle Ages when memento mori was constantly reiterated, we have been skilled at avoidance and denial. Our ability to avoid thinking about predictable events that have no predictable date is based on millennia of practice.

In order to improve prediction we need to see such events as recurrent or cyclical, rather than seeing them as unique (for each person sees his or her own death as in some sense unique). How many humanitarian disasters on average occur per year (and is global warming accelerating their incidence)? What rate of nuclear incidents is tolerable? Has anyone noticed how often we go to war? It is no longer an unusual event. How often do revolutions spread from country to country, as they did in the nineteenth century and recently in the Middle East? These questions allow relief agencies to stockpile supplies, but asking them depends on moving disasters, even the greatest ones, out of the category of the unique — which in turn may make them more tolerable. We are close to becoming inured to disaster, which may be the cost of prediction.
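The shift from "unique" to "recurrent" framing can be made concrete with a standard occurrence model. The sketch below treats a rare disaster as a Poisson process; the 1,000-year recurrence interval echoes the earthquake discussed earlier in this feature, while the 50-year planning horizon is my own illustrative choice, not a figure from the essays.

```python
import math

# Treat a rare disaster as a Poisson process: events arrive at a
# constant average rate, and the chance of at least one event in a
# planning horizon is 1 - exp(-rate * horizon). A minimal sketch,
# assuming independence between events.

def prob_at_least_one(recurrence_years, horizon_years):
    rate = 1.0 / recurrence_years  # expected events per year
    return 1.0 - math.exp(-rate * horizon_years)

# A "once in a thousand years" event over a 50-year horizon:
p = prob_at_least_one(1000, 50)
print(f"P(at least one event in 50 years) = {p:.1%}")
```

The answer is about 4.9%: small in any one lifetime, which is precisely why the recurrent framing is so easy to avoid, yet large enough that agencies planning over many regions and decades should expect such events routinely.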

Professor of Psychology, Arizona State University; Author, Sex, Murder, and the Meaning of Life; Editor, Evolution and Social Psychology

In college, I had a friend who, despite his 158 IQ, had an affinity for taking foolish risks. He rode a Harley motorcycle, and got a great kick out of roaring down the Long Island Expressway at 120 mph. He liked to brag that his enjoyment of that experience was amplified by a few stiff shots of 151 rum. His defense: "It's important to live life to the fullest, because you never know what will get you. Some people get killed crossing at an intersection."

My neurotic's reaction was: While you may be taken out by random unforeseen events, you have some control over probabilities. Some intersections are a lot more dangerous to cross, and crossing them drunk on a speeding motorcycle gives random unforeseen events a bit more of a chance to strike.

Those who study the psychology of risk like to compare per-mile probabilities of dying in airplane crashes with probabilities of dying in car wrecks. When the tsunami struck Japan, I was visiting Australia, where one encounters frequent reminders that the odds of being eaten by a shark are substantially lower than the odds of being killed in your car on the way to the beach.

Despite this knowledge, I stayed out of the ocean, finding it difficult to forget a vivid story of a colleague who was actually attacked by a shark during her tenure as an assistant professor in the land down under. She survived, so what was I so worried about?

The psychology of risk is a fascinating and complex issue, but I'll offer a defense of defensiveness. It boils down to psychological economics: What is the enjoyment you'll get from a given activity, and is it worth the risk? To a risk-averse person like me, the slight risk of death is enough to keep me away from some otherwise enjoyable recreational activities (scuba diving in any body of water with any frequency of sharks; skydiving in any body of air over four feet above the earth's surface). If you're worrying about death, that adds a large cost to an otherwise beneficial experience. I had lots of fun hiking around in the hills, where I had visual access to any kangaroo who might want to challenge me to a boxing match. On the same reasoning about economic psychology and risk, I also mostly avoid unnecessary airplane travel (though I do attend occasional conventions, including the one that brought me to Australia).

What does this imply for tsunamis and nuclear power plants? I'm with the Black Swan avoidance camp here. As a member of Neurotics Anonymous, I'd say: let's not compare nuclear deaths (so far) to coal deaths (so far); let's instead invest in solar and tidal power and use the ample existing technologies for energy conservation. Of course, one way to conserve energy is to stay off airplanes (even for pleasant vacations in Australia), which has the incidental benefit of a (very slight) reduction in your odds of being eaten by a shark.

It's too easy to discount intuition. Pattern recognition. The almost literary sensibility through which we make sense of our world.

The narrative implicit in the nuclear plant disaster in Japan is just too striking for most humans to ignore: the nation that suffered an atomic bombing is now enduring a nuclear crisis. A particular kind of scientific orthodoxy refuses to even entertain such parallels except as evidence of psychological or cultural biases clouding what should be our reliance on the data.

But, as black swan events like this prove, our reliance on the data continually fails us. We just can't get enough data about our decidedly non-linear world to make accurate predictions. There are just too many remote high leverage points in the chaotic systems constructing our reality for us to take all of them into account. Things that seemed not to matter — or that we didn't even notice — iterate enough until they end up mattering a lot. We're better off looking at a fractal and intuiting its relevant patterns than relying on its various pieces to tell us its unfolding story. Science too often divides to understand, incapable of even acknowledging there might be a science in divining to the same ends.

The coincidence of the nuclear crisis in Japan with our inability to predict the events that precipitated it forces another kind of predictive apparatus into play. No, it's not one we like to engage — particularly in rational circles — but one we repress at our own peril. Science is free to promote humanity's liberation from superstition or even God, but not from humanity itself. We still have something in common with all those animals who somehow, seemingly magically, know when an earthquake or tsunami is coming and move to higher ground.

And our access to that long lost sense lies in something closer to story than metrics. A winter bookended by BP's underwater gusher and Japan's radioactive groundwater may be trying to speak to us in ways we are still human enough to hear.

Prediction is one of the main cognitive processes. Children regularly make predictions.

1. Children predict the actions of the people that they interact with, but they do not necessarily realize that they do this.

2. Children predict the reactions of objects and actions in the physical world but they do not necessarily realize that they do this either.

3. Children predict their own feelings and mental states. They do things that they think will make them happy, but they don't necessarily realize that they do this either.

These three worlds, the social, the physical, and the mental, are at the center of what adults continue to learn to make predictions about. We predict the speed of an oncoming car and decide whether we can cross the street safely. We predict events that will make us happy or sad, such as taking a nice vacation, playing a game, enjoying a good meal, or establishing a relationship with another person.

To learn to predict well, one needs to be educated about how to predict, and one needs to make predictions and examine what went wrong when those predictions fail. Curiously, schools teach none of this.

How do children learn to predict? They learn as events happen randomly in their lives. If they are lucky enough to have someone helpful to talk with about their experiences they may, in fact, become good at analyzing how the world works and making their predictions conscious. Getting better at prediction is the cornerstone of living one's life in a satisfying way. One can, of course, get better at prediction by simply thinking about it; this is how most people do it today. But not everyone is capable of doing that, and, clearly, most adults are not all that good at making important predictions in their own lives. (This is one reason that there are bad marriages, financial counselors, clinical psychologists, and prisons.)

The idea that kids can make predictions is not a really radical point. My point is that prediction has to be the curriculum, not ancillary to the curriculum. If we want adults to predict well, we need to help children do it well. As it stands now, they are on their own. As adults who have not been taught to predict well, they will make poor life decisions, wrongly predicting how the people and things in their lives (bosses, spouses, children, co-workers, nuclear reactors, etc.) will behave toward them after they take certain actions.

There are three aspects of prediction: learning a script; functioning without a script because one isn't known; and predicting when there is no script at all.

How do we teach prediction when there is no script and there are no seemingly relevant prior cases? In some sense you can't. You can teach people how to go about trying to make predictions. This is actually what science is about.

Scientists create theories that make predictions, which they then try to verify with evidence. This process — hypotheses verified by evidence — can be taught, in the sense that it is a way of thinking that can be practiced in various venues, and it should be practiced beginning in first grade. It is reasonable to start teaching children to think in this way about the world around them. We need to teach children to do scientific reasoning, not to memorize facts about science.

The avian flu outbreaks in 2007, the spike in oil prices in the summer of 2008, the financial system meltdown later that year, the volcanic eruptions of 2009, the oil spill in 2010, and now the earthquake/tsunami/nuclear emergency of 2011, to say nothing of the political upheavals in the Middle East, all serve to remind us that we are very poor at making predictions. And we are apparently very poor at assessing risks. But perhaps we can make some observations.

All of these events, one way or another, have shown us how fragile our global supply chain and transportation network really are. There are already reports of how the recent earthquake, never mind the nuclear issues, has slowed down repairs to New York City subway stations, halted production of some GM automobiles in the US (an industry still recovering from the financial meltdown), and even rippled into my own hobby project of building a credible 19th-century digital computer.

Our quest to squeeze every efficiency out of our systems of production and supply has led us to fragility rather than robustness. We have gotten short-term margins at the cost of long-term stability. Our quest for every last basis point in our financial results has led us to build a system with countless single points of failure. We are vulnerable to natural disasters, unforeseen economic disasters, and clever exploiters of our systems [such as governments cornering rare earth metal supply chains, or just plain opportunistic hedge traders in rather conventional metals (which is why all nickel-based batteries rocketed in price three years ago)].

The drumbeat of continued unexpected failures of nature, technology, or economics will not go away. Perhaps, however, we can take lessons from the disruptions they cause and find a way to monetize stability over maximum possible short-term efficiency, so that our constructed civilization will be more resilient to these events.

There is increasing evidence that earthquakes can in fact be predicted. When rocks grind together under pressure, they give rise to a range of electromagnetic phenomena, including so-called "earthquake lights" (EQLs), that can be regarded as precursor phenomena of a forthcoming quake. Empirical data exist showing that such luminous displays often occur before earthquakes; indeed they have been reported since ancient times; in the case of the Saguenay quake in Canada (25 November 1988), such lights were reported 25 days before the event. (The meaning of these lights is controversial among seismologists.)

Another sign of an impending quake is a disturbance in the ultralow-frequency radio band, which has been observed weeks before an earthquake. A third precursor signal is a change in the magnetic field in the vicinity of a forthcoming upheaval. All such phenomena — EQLs, radio disturbances, and magnetic field changes — can be and have been observed by satellite, and for this reason the French government has an earthquake-detection satellite in orbit, DEMETER (Detection of Electro-Magnetic Emissions Transmitted from Earthquake Regions), which has observed such precursor phenomena from space.

Some of these phenomena, furthermore, have been reproduced in the laboratory. For example, researchers have performed experiments in which rocks subjected to high pressures produce bursts of electrical discharges consistent with those observed before and during quakes.

Earthquakes are natural, not supernatural, phenomena. They result from known physical forces, stresses, and fractures within the Earth's crust. Further research on various types of precursor phenomena, together with the emplacement of a range of Earth-based and satellite sensors, might one day enable us to predict earthquakes with enough precision to make an early warning system possible and realistic.

In his essay, Bruce Parker says that technology "could never be enough, without huge sums of money being spent to build 40-foot sea walls along almost the entire Japanese coastline…" Well, societies spend huge sums of money routinely, often for projects that have little if any economic or scientific return or tangible benefit, for instance the building of the Great Pyramids at Giza.

But another large and ancient project, had it been deployed along the Japanese coastline, arguably could have prevented some of the destruction wrought by the Sendai tsunami: the Great Wall of China. The Great Wall extends for a distance of 8,851 km (5,500 mi.), and construction began in the 5th century BCE. The coastline of the Japanese island of Honshu is some 5,450 km (3,386 mi.) long, with the Pacific-facing side comprising about half of that distance. This means that the Great Wall is longer than Honshu's entire coastline, and more than three times the length of its Pacific-facing side. If a structure built in ancient times could have provided some protection against a tsunami, it is likely that modern technology is fully up to the task of building a seawall that would have been effective against the tsunami at Sendai. Whether or not to build it rests on factors, such as cost, political vision, and tolerance for risk, that are outside and independent of science and technology proper.
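The length comparison can be checked with simple arithmetic, using only the figures quoted above (the even split between Honshu's Pacific-facing and Sea of Japan-facing sides is a rough assumption):

```python
# Figures quoted in the text above; the 50/50 coastline split is a rough assumption.
great_wall_km = 8851                    # stated length of the Great Wall
honshu_coast_km = 5450                  # stated length of Honshu's coastline
pacific_side_km = honshu_coast_km / 2   # roughly half faces the Pacific

# How many times over could the wall cover each stretch of coast?
print(great_wall_km / honshu_coast_km)  # ≈ 1.62: longer than the whole coastline
print(great_wall_km / pacific_side_km)  # ≈ 3.25: over 3x the Pacific-facing side
```

So, on these numbers, the ancient structure exceeds the tsunami-exposed coastline several times over; the limiting factors are economic and political, not engineering.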

The intellectual problem with our planet is that it was not intelligently designed. That leads to the physical problem — because Earth is an entirely natural accident it is a badly constructed death trap.

On the chronic level there is the horrific wastage of youth, with — until the last couple of centuries — about half of children dying as kids, to the tune of about 50 billion premature deaths, largely from the diseases that the wild west of dumb-ass evolution cooked up. On the periodic level there are your plate tectonics. In one sense moving plates are a good thing for people: if not for them constantly building up mountains, the continents would never have formed, much less still be around. But the system is fatally flawed. Had the plates been designed with any common sense, their edges would glide past one another as smooth as silk. No one aside from geologists would notice them. They would not generate earthquakes. But the plate boundaries we have to put up with tend to lock up, build up strain, and snap with bad results. When quakes occur on land, it is bad architecture that causes most casualties. But underwater quakes can spawn tsunamis that, when big enough, overwhelm even well-prepared coastal peoples. Underwater landslides and big meteorites hitting oceans make big waves too.

The problem with calculating the big periodic risks is that they are not really calculable. Nuclear plants are generally built to resist the maximum quakes and waves at significant risk of happening during their decades-long operating spans, according to known geological and historical evidence. But some quakes don't even bother to occur at plate boundaries. The New Madrid superquake hit the Mississippi Valley; it may have been a once-in-a-millennium result of shifting continental buoyancy after the melting of the last Pleistocene ice sheet. Who knows what faults are out there? Or underwater slopes ready to slide? Build a lot of nuclear plants to resist normal local events, and it is inevitable that some of them will be hit by the locally rare and unpredicted superquake or superwave. If that happens to coal-fired or solar plants, it is just a regional power loss. But fission plants are hyper-intense concentrations of radiation and heat just waiting to burst out with lots of nasty items that can profoundly disrupt entire regions. To better ensure safety would require that all nukes be built to survive very rare, unpredictable events, driving up their already high costs.
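The "inevitability" claim is just compound probability: an event vanishingly rare at any single site becomes near certain across enough sites and years. A minimal sketch, with hypothetical numbers (the annual exceedance probability, plant lifetime, and plant count below are illustrative assumptions, not actual reactor statistics):

```python
# Illustrative only: hypothetical figures, assuming independent sites and
# a constant annual probability of a beyond-design-basis event per site.
def prob_at_least_one_exceedance(annual_prob, years, n_plants):
    """Probability that at least one of n_plants experiences an event
    exceeding its design basis during its operating span."""
    p_site = 1 - (1 - annual_prob) ** years      # per-site lifetime probability
    return 1 - (1 - p_site) ** n_plants          # across the whole fleet

# A "once in a millennium" local event (annual probability 0.001),
# a 40-year operating life, and a fleet of 400 reactors:
print(prob_at_least_one_exceedance(0.001, 40, 1))    # ≈ 0.039 for one plant
print(prob_at_least_one_exceedance(0.001, 40, 400))  # ≈ 0.9999999 fleet-wide
```

Under these toy assumptions, any single plant is quite safe (about a 4% lifetime chance), yet somewhere in the fleet a locally "impossible" event is a near certainty, which is the author's point.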

But as bad as quakes and tsunamis have been, they and other short-term disasters are not major people-killers. It is estimated that temblors have dispatched around 10 million people in the last thousand years, a drop in the general premature-mortality bucket. Expanding this a few fold will cover all the losses over human existence due to quakes, waves, volcanoes, fires, storms, floods, slides, and the like. It is disease that has taken tens of billions, with famines making a major contribution.