I spent the lead-up to Y2K working tech support for the local telco’s ISP. I was about as far down the food chain as you could get, but I was dimly aware of the frantic testing and checking being done by the real techs to make sure the system would stay running on January 1st.

One of my colleagues had a side-business as a Y2K consultant. To this day, I am not entirely sure what he did, but I know he made a fair bit of cash doing it. I saw him feed a lot of it into video lottery terminals at the bar after work. Saw him drive home drunk too. He assured me he was a “careful driver”. He’s never had an accident.

I wasn’t one of the extra people on shift at the call centre, so I spent New Year’s 2000 first drinking at my house and then wandering downtown with friends to see whatever fireworks (planned or otherwise) might go off. I remember I had a film camera to document events. I looked through the roll the other day: people at a party, some badly exposed shots of fireworks, someone’s lost portable cooler full of smashed bottles.

A friend of mine spent Y2K at the WorldCom offices. Nothing happened there either. They had to have extra staff on hand because they weren’t sure that it wouldn’t. They’d done their best to make sure everything was fixed up and ready to go, but the bug that hits you is the one you don’t know about.

Here at the end of the decade, the Y2K story has been reduced to the status of overblown hoax. Which raises the question: How would we know?

It’s harder to tell when prevention worked. The symptoms of an overblown response and a successful response are exceedingly similar. It’s the tiger repellent on sale in New York City. “See how well it works?” It’s vaccination programmes. “See how well they work?”

A cursory search through articles about Y2K seems to indicate that we have no idea whether Y2K failed to materialize as a crisis because of all the effort put into averting it, or whether the threat was ridiculous from the get-go. It seems like not many people bothered to really ask once the date had passed and the missiles remained in the ground. Was each side worried that the other side’s opinion would get confirmed?

Also in 2000, a friend’s mother was on a hospital committee responsible for flu pandemic preparation. The thinking was that we were overdue for a major outbreak, the last having been in the 60s. They were in charge of assuring preparedness, having systems in place, and answering logistical questions such as whether we had enough body bags.

I wonder if there were budgetary meetings. Did they have to fight to ensure their procurement orders were met? Did some tough-minded bureaucrat say, “Look, why are we spending all this money on an event that hasn’t happened in over 40 years? There are people out there who desperately need treatment today.”

This post has been sitting in draft form for quite some time. When I started it, it was being written in the context of Russia’s decision to attempt to deflect asteroid 99942 Apophis. Since then, there have been ample opportunities to finish the post and remain timely: Haiti, the Gulf of Mexico, Pakistan, Queensland, and now Japan.

Here’s Anatoly N. Perminov, the head of Russia’s space agency, on the asteroid:

We’re talking about people’s lives here. It’s better to spend several million dollars and create this system, which would not allow a collision to happen, than wait for it to happen and kill hundreds of thousands of people.

If you want to add a little astronomical unease to your day, let me recommend the @LowFlyingRocks Twitter feed. It mentions every object that passes within 0.2 AU of Earth.

At this point, I should go back and correct myself. It’s not really the case that we can reliably tell when preparation has fallen short of need, either. It is possible to undertake the right amount of preparation and still end up with a disaster.

Here’s a less planet-threatening example: At the end of Super Bowl XLIV’s first half, the Saints went for a touchdown instead of the safer field goal. They failed, and commentators took the play to be a mistake by coach Sean Payton. At the start of the second half, the Saints took another risk: a surprise onside kick. It succeeded, and so Payton was hailed as ‘gutsy’. If it had failed, commentators would likely have argued that Payton’s overly aggressive calls lost his team the Super Bowl. But even if both gambles had failed, the statistics suggest they were the right calls. Onside kicks succeed about 60% of the time when they’re unexpected. Under the circumstances, that’s a good bet.

Humans are really bad at this. We judge decisions based on their results. This would make sense if results depended only on our decisions, but they don’t. There’s a whole world of other forces that come into play. The important insight is that if you take a bet and lose, that does not retroactively make it a bad bet. It was a good or a bad bet at the moment that you made it, and the outcome does not change this fact. (Correspondingly, if you make a bad bet and win, that doesn’t retroactively make it a good bet.)
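The arithmetic behind that can be made concrete. Here is a minimal sketch in Python: only the 60% success rate comes from the onside-kick figure above; the point values for recovering or losing the kick are hypothetical numbers chosen purely for illustration.

```python
import random

random.seed(42)

P_SUCCESS = 0.60         # success rate for an unexpected onside kick (the figure cited above)

# Hypothetical point swings -- illustrative assumptions, not real NFL data:
GAIN_IF_RECOVERED = 3.0  # points you expect to gain by keeping the ball
LOSS_IF_FAILED = 2.0     # points you expect to concede from the short field

def expected_value(p=P_SUCCESS, gain=GAIN_IF_RECOVERED, loss=LOSS_IF_FAILED):
    """The bet's value at the moment it is made -- independent of any outcome."""
    return p * gain - (1 - p) * loss

def one_world(p=P_SUCCESS):
    """One realized outcome: the single world you end up living in."""
    return GAIN_IF_RECOVERED if random.random() < p else -LOSS_IF_FAILED

ev = expected_value()    # 0.6 * 3 - 0.4 * 2 = +1.0 points: a good bet when made
outcomes = [one_world() for _ in range(100_000)]
avg = sum(outcomes) / len(outcomes)
# The average across many simulated worlds converges to the expected value,
# even though roughly 40% of individual worlds end in a loss.
```

Run enough simulated worlds and the average converges on the expected value, yet any single world may still be a losing one. The loss in that world says nothing about whether the bet was good.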

It is possible for the outcome to shed additional light on your bet, perhaps by highlighting a force or factor that you had failed to account for, but otherwise, outcomes don’t reach backwards in time to change the status of decisions. When you make a risk management decision, you are dealing with many worlds. When you are living with the consequences, you are dealing with only the one you ended up with. The fact that in 60% of the other worlds you are drinking champagne from a trophy is cold comfort.