Tag Archives: risk

Conservatives: Just because Obama is a liberal doesn’t mean none of his administration’s Ebola strategies make sense. If it were President Romney explaining why a flight ban won’t help anything, would you be so up in arms?

And since when is a crisis reason enough to abandon free market principles? Conservatives have always hated the CDC. Now they’re blaming Obama for proposing cuts to CDC’s budget back in 2011. They want CDC to do even more! It reminds me of post-9/11 (not that I remember that period…I’ve only read the history), when conservatives were all for war in Afghanistan, the TSA, the Patriot Act. Once Obama was elected, all those things became evil. This is why I rarely vote for Republicans, and didn’t in 2012. Once they’re in power, true free market principles in America virtually disappear!

I’m a free market guy. I like most conservatives. But using a virus that is killing thousands to score political points and attack Obama is pretty low. The reasons why a flight ban won’t work should, I think, make sense to anyone willing to consider the merits of the decision itself instead of criticizing the people making the decision.

This issue hits home for me. My wife is a nurse, and a suspected Ebola case showed up in her hospital late last week (this has been reported in the news…I’m not disclosing confidential information). If she’d been asked to treat this patient, she’d have done it willingly. Then she’d have come home in the morning, knowing that sensible precautions were taken. Of course, even the best precautions won’t work 100 percent of the time. Thus the Dallas nurse infected last week. But to achieve any success at all, risks have to be taken. It’s vitally important that those risks be understood in light of the facts, not political grudges. If not, we risk putting people like my wife in danger.

For example, consider if a flight ban were mandated. Are hospitals to quit asking if a patient has come from West Africa? They might as well. What patient is going to admit having broken the law if they somehow breached the barricade? This puts Americans in more danger than they are now, nurses most of all.

Then imagine if a patient did admit to evading the flight ban. Mass hysteria would break out. No one with a West African (or probably any African) accent would be trusted. Perceptions of risk would be incalculable, as the number of recent visitors to West Africa in the United States would be unknown. The economy would suffer, and more harm would come to Americans than would have come otherwise.

The choice is between calculable risk (no flight ban) and incalculable risk (flight ban). Use your brain and think about this. Forget that it’s Obama in power. Forget that the CDC Chief is a poor speaker. Consider the facts and think about the administration’s reasoning. Obama’s made tons of mistakes, but this isn’t one of them.

According to the National Safety Council, the odds of dying in a “motor vehicle incident” are one in 112. The odds of dying from “air and space travel incidents” are one in 8,357.

The implication here is that flying is safer than driving.

You’ve probably heard this before. Economists love to cite this fact. It has shock value. It shows why impressions and emotions can be wildly misleading (flying certainly seems more dangerous than driving). What matters is how often people are actually killed or injured, not our feelings about which is more dangerous.

But what the National Safety Council’s graphic doesn’t explain is that these are aggregates. The calculated odds of dying in a car accident do not control for the quality of the driver. If you don’t wear a seat belt and don’t stop at stop signs, I’m sure your odds are much higher. If you are safer than the average driver, then your odds are lower.

Of course, it’s unlikely that anyone is such a great driver (and, for that matter, that everyone driving around him or her is so careful and considerate on the road) that their odds of death by car are lower than their odds of death by plane. One in 112 is a far cry from one in 8,357. But the point of this is to encourage you to take statistics like this with a grain (or several) of salt. Statistics on risk can be highly misleading.
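For what it’s worth, the arithmetic behind the comparison is easy to check. Here’s a minimal Python sketch; the “careful driver” factor is purely hypothetical, just to illustrate how conditioning on driver quality changes the numbers:

```python
# Convert the NSC "one in N" lifetime odds into probabilities.
p_car = 1 / 112    # lifetime odds of dying in a motor vehicle incident
p_air = 1 / 8357   # lifetime odds of dying in an air/space travel incident

# Aggregate car risk is roughly 75x the air risk.
print(round(p_car / p_air, 1))

# Suppose, hypothetically, a careful driver faces a tenth of the average risk.
# The gap narrows substantially, but flying still comes out well ahead.
careful_factor = 0.1
print(round((p_car * careful_factor) / p_air, 1))
```

Even under that generous (made-up) assumption, the careful driver’s odds remain several times worse than the flyer’s, which is the post’s point: the aggregate numbers shift once you condition, but not enough to reverse the conclusion.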

I was reading through Kip Viscusi’s Economics of Regulation and Antitrust last night and came across an interesting graph showing how our perceptions of risk don’t correlate well with reality—especially risk of dying from various causes.

I can’t find the graph online to post here, so a verbal breakdown will have to do.

At first thought, this made sense to me. Deaths by cancer and heart disease aren’t well-publicized, while death tolls from natural disasters are media favorites. I’d only expect people to overestimate the risk of dying from well-publicized events. But that raises the question of why accidents and natural disasters are publicized more often than deaths by disease.

Turns out Viscusi was one step ahead of me. Just after the graph, he points out that events that more often result in relatively early deaths (e.g., natural disasters) are more feared by the public than those resulting in death at older ages (e.g., heart disease). In other words, a longer estimated length of life lost is associated with higher perceived risk of dying.

Additionally, Viscusi cites another study showing that respondents’ answers to perceived risk are influenced by their age—young people overestimate the risk of dying from a tornado or flood and underestimate the risk of dying from “old people” things, like heart disease. Vice versa for older people, I assume.

In other words, “world-revolves-around-me” attitudes extend even to our general estimates regarding risk of dying and give us inaccurate perceptions about risk. Since our perceptions of risk affect our preferences and behavior, perhaps causing us to spend extra for unnecessary safety features or not enough for better ones, being selfish can be harmful to your health!

But it’s not totally irrational to overestimate risk of dying from lower-probability events, like accidents and natural disasters. Viscusi writes:

Even if we are equipped with this knowledge of how people err in their risk beliefs, it is sometimes difficult to tell whether we are overreacting to seemingly small probabilities. In 2002 a sniper in Washington, D.C., created extraordinary public fears that led football teams to cancel outdoor practices and thousands to change their daily routines. The daily risks were seemingly small—surely under a one-in-a-million chance of being killed, based on the sniper’s past record. But what if one day the sniper decided to kill dozens of people, not just one? Then the odds would become much more threatening. The chance that one would be shot by the sniper consequently was not well understood, since we have a small sample of observations that might not reflect accurately the extent of the risk.

I was excited to see this comment, as I was living in DC in 2002 and remember having soccer games and practices cancelled because of the sniper. It wasn’t so much about the odds, but about the unpredictability of the sniper—like Viscusi said, how could we have known he wouldn’t one day shoot dozens of people?

I think the same goes for natural disasters. They can be totally unpredictable and result in unexpected death. It’s entirely possible that tornadoes will kill 500 people tomorrow in Oklahoma, or that an earthquake will destroy large swaths of San Francisco. These events are unlikely, of course, but for no other reason than that they haven’t happened often in the past. Diseases, on the other hand, progress slowly. We can take measures to dramatically reduce our likelihood of getting one. We can predict with more certainty who’s susceptible and why.