Natural Hazards and Risk Reduction in the Modern World [Greg Laden's Blog]

Great disasters are great stories, great moments in time, great tests of technology, humanity, society, government, and luck. Fifty years ago it was probably true to say that our understanding of great disasters was thin: not well developed, because of the relative infrequency of the events, and not very useful, in that it was not knowledge we could use to reduce the risks from such events.

This is no longer true. The last several decades have seen climate science accumulate much more climatic data, thanks to decades of careful instrumental data collection, while earlier periods have been added to our understanding of the long term trends. We can now track, in detail, global surface temperatures well back into the 19th century, and we have a very good idea of the change over time in, and variability of, global temperatures at a century-level scale. There is a slightly less finely observed record covering hundreds of thousands of years, and an increasingly refined, if still vague, idea of global surface temperature for the entire history of the planet.

This is true as well with earthquakes, volcanic eruptions, and tsunamis. Most of the larger versions of these events leave a mark. Sometimes that mark is an historical record that needs to be found, verified, critiqued for veracity, and eventually added to the mix. Sometimes the mark is geological, like when the coastline of the Pacific Northwest drops a few meters all at once, creating fossilized coastal wetlands that can be dated. Those events are associated with a particular kind of earthquake that happens on average every several hundred years, and now we have a multi-thousand year record of those events, allowing an estimate of major earthquake hazard in the region.

And so on.

The theory has also developed, and yes, there is a theory, or really several theories, related to disasters. For example, we distinguish between hazard (the chance of a particular disaster happening at a certain level in a certain area) and risk (the probability of a particular bad thing happening to you as a result). If you live and work in Los Angeles, your earthquake hazard is high. You will experience earthquakes. But your risk of, say, getting killed in an earthquake is actually remarkably low considering how many there are. Why? Partly because really big ones are rare and fairly localized, and partly because you live in a house and work in a building and drive on roads that meet specifications set out to reduce risk in the case of an earthquake. Also, you “know” (supposedly) what to do if an earthquake happens. If, on the other hand, you live in an old building in San Francisco, you may still be at risk if the zoning laws have not caught up with the science. If you live near sea level in the Pacific Northwest, your earthquake hazard is really low, but if one of those giant earthquakes happens, you have bigly risk. Doomed, even.
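The hazard-versus-risk distinction can be sketched as simple arithmetic: risk is roughly the chance the event happens times the chance it harms you if it does. The numbers below are entirely made up for illustration; they are not from Dixon's book or any hazard survey.

```python
# Illustrative sketch of hazard vs. risk.
# All numbers are hypothetical, chosen only to show the arithmetic.

def annual_risk(hazard, vulnerability):
    """Expected yearly probability of a bad outcome.

    hazard        = chance per year the event occurs where you are
    vulnerability = chance the event harms you, given that it occurs
    """
    return hazard * vulnerability

# Los Angeles: earthquakes are frequent (high hazard), but building
# codes and preparedness keep vulnerability low.
la = annual_risk(hazard=0.5, vulnerability=0.0001)

# Pacific Northwest coast: a great Cascadia-style quake is rare
# (low hazard), but near sea level the vulnerability is high.
pnw = annual_risk(hazard=0.002, vulnerability=0.5)

print(f"LA risk:  {la} per year")
print(f"PNW risk: {pnw} per year")
```

With these (invented) numbers, the low-hazard Pacific Northwest resident carries the higher risk, which is exactly the point of keeping the two concepts separate.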

Since my own research and academic interests have involved climate change, sea level rise, exploding volcanoes, mass death due to disease, and all that (catastrophes are the punctuation marks of the long term archaeological and evolutionary record), I’ve always found books on disasters of interest. And now, I have a new one for you.

Many catastrophe books are written by science-interested or historically inclined writers who are not scientists. They regurgitate the historical record of various disasters, giving you accounts of this or that volcano exploding, or this or that tsunami wiping out a coastal city, and so on. But the better books are written by scientists who are very directly, or nearly directly, engaged in the work of understanding, documenting, and addressing catastrophe.

Timothy H. Dixon is a professor in the School of Geosciences and Director of the Natural Hazards Network at the University of South Florida in Tampa. In his research, he uses satellite geodesy and remote sensing data to study earthquakes and volcanoes, coastal subsidence and flooding, ground water extraction, and glacier motion. He has worked as a commercial pilot and scientific diver, conducted research at NASA’s Jet Propulsion Laboratory in Pasadena, California, and was a professor at the University of Miami, where he co-founded the Center for Southeastern Tropical Advanced Remote Sensing (CSTARS). Dixon was a Distinguished Lecturer for the American Association of Petroleum Geologists (AAPG) in 2006–2007. He is also a fellow of the American Geophysical Union (AGU), the Geological Society of America (GSA), and the American Association for the Advancement of Science (AAAS). He received a GSA Best Paper Award in 2006 and received GSA’s Woollard Award in 2010 for excellence in Geophysics.

This book covers risk theory, the basics of natural disasters, uncertainty, and the vulnerability of humans. Dixon looks specifically at Fukushima and the more general problem of untoward geological events and nuclear power plants, and at other aspects of tsunamis (including the Northwest Coast problem I mention above). He talks about energy and global warming; I found his discussion of what we generally call “clean energy” a bit outdated. He makes the point, correctly, that for various reasons the increase in the price of fossil fuels that would ultimately drive, through market forces, the development of non-fossil fuel sources of electricity and motion is not going to happen for a very long time on its own. Environmentalists who assume there will be a huge increase in fossil fuel costs any time now are almost certainly mistaken. However, Dixon significantly understates the rate at which solar, for example, is becoming economically viable. It is now cheaper to start up a solar electricity plant than it is to start any other kind of plant, and the per-unit cost of solar is very low and rapidly declining.

Dixon is a bit of a free marketeer, which I am not, but a realistic one. He makes valid and important points about science communication, time lags, and long term thinking, and he makes the case that more research can produce important technological advances.