As we detailed in a recent piece, language often fails us when we try to describe risk. For example, a report on the failing satellite observed: “While the threat of the debris hitting a human is extremely small...” That description is essentially useless in helping us figure out whether to hide in our basements or carry on with our lives.

After all, if the probability of hitting any human is “extremely small,” then the likelihood of any specific person being hit by space junk is on the order of 1 in 10 billion, or 0.00000001%. It’s fair to say it’s safe to leave your umbrella at home that day.
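For readers who want to sanity-check the arithmetic, a couple of lines will do. (The 1-in-10-billion figure comes from the text above; the helper function here is just illustrative.)

```python
def odds_to_percent(n: int) -> float:
    """Convert '1 in n' odds to a probability expressed as a percent."""
    return 100.0 / n

# 1 in 10 billion as a percentage
print(f"{odds_to_percent(10_000_000_000):.10f}%")  # prints 0.0000000100%
```

The same conversion makes it easy to compare risks stated in different forms, e.g. “1 in 3,200” versus “0.03%.”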

That said, the probability of space junk damaging other satellites is likely meaningful. Given the potential for consequential economic damage from the loss of communications or weather-monitoring satellites, that’s the type of risk I want to understand in an incident like this.

Bottom Line: The first question in risk assessment is not, “What’s the probability of a bad outcome?”, but rather, “What’s the outcome I’m worried about?”