Risks, Large and Small

David Leonhardt, the awesome New York Times economics columnist (not that they all aren’t awesome), had an interesting piece on Sunday about how we deal with tiny risks of large bad outcomes. This is one of my favorite subjects. Leonhardt says (borrowing with credit from Robert N. Stavins, a Harvard economist) that we tend to underestimate risks that are hard to imagine and overestimate risks that are easy to imagine.

It is the first problem that worries Leonhardt. His main example is the Gulf oil spill: nobody involved had ever heard of a rig explosion, and so they tragically underestimated the chance that one would ever happen. If BP executives “could go back and spend the extra money to make Deepwater Horizon safe,” they would surely do so, he says. Well of course. But this alone doesn’t prove that BP was cavalier or even honestly mistaken about the risk of a blowout. (There is other evidence; this is not intended as a defense of BP.) I have insurance against the risk of my car being hit by another car. I don’t have insurance against the risk of my car being hit by an asteroid. If it ever is hit by an asteroid, I will certainly wish that I could go back and buy the insurance. That doesn’t mean my original decision not to buy it was wrong. As long as the risk is above zero, there is some point at which it is worth taking. You can’t insure against everything.

My worry is the opposite of Leonhardt’s. It seems to me that in America, at least, we are much more prone to the second type of error: overestimating small risks of large disasters—and underestimating, by comparison, the smaller, everyday risks of life. Leonhardt himself provides the clearest example: people who, after 9/11, decided to drive rather than fly on long trips. In the next year there was no new terrorist attack on airplanes, but deaths in traffic accidents went up because more people were driving. That was like buying insurance against asteroids. Or, to take a more tendentious example, it’s like nuclear power. For a generation we’ve gone without it to protect ourselves from a catastrophic risk, only to find that this may have been a mistake: in avoiding one catastrophe, we were, without much reflection, increasing the risk of a somewhat smaller one, like the one in the Gulf.

Along with this overestimation of the chance of disaster comes an unseemly celebration of hindsight when disasters inevitably occur. We see this in our crazy tort system. If a plane crashes or a medical operation goes tragically wrong, someone will be found to blame for not doing something that would have prevented it. (Leonhardt recommends making it easier to sue on occasions like this. Oy vey!) We see it in our health care, for example in the approval process for new pharmaceuticals, where we ricochet between demands that pills be perfectly safe and complaints about how hard they are to get approved.

We see it in our politics, and in journalism. One of the several gaffes of Rand Paul, the Republican Senate candidate in Kentucky, was to say in his bloodless and artless libertarian way, after a mining accident that killed 29 people, “We had a mining accident that was very tragic. Then we come in, and it’s always someone’s fault. Maybe sometimes accidents happen.” Once again, this is no defense of the coal company. For all I know, it was criminally negligent. But there would be similar recriminations whether it was or not. That’s why Anderson Cooper gets the big bucks.

Nevertheless, Paul is right. Accidents do happen. And not all risks can be eliminated. Paul and I might disagree about what follows from that. To me, if you’re arguing for us to tolerate greater risks as a society, you need a stronger safety net for individuals, to catch them when they fall.