Finding Clarity in an Ambiguous World

The Y2K Fallacy

The end of the world as we know it. This is how the Y2K bug was billed. Famous computer scientists staked their reputations on the gravity of the problem. Mobilize we did, dusting off the old COBOL books and scouring our software for dates shortened to two digits instead of four. Companies spent billions of dollars repairing their code and making it robust. Most thought that, in spite of all the work, we were still doomed. But doomsday arrived not with a bang but barely a whimper. Turns out it was not the end of the world as we know it. After the fact, people argued that all of our preparations were why so little happened, but even in countries where Y2K spending was limited, little happened. How did we get ourselves into such a frenzy?

Even as the hype about the problem grew, it became somewhat obvious it was being blown out of proportion. It started with a description of the bug: when the year rolled over from 99 to 00, the discontinuity could cause programs to behave unexpectedly. Almost every computer has a clock, and any of its software could be susceptible to this discontinuity. Sometimes the error would cause the program to crash. Sometimes the crash would cause the program to not start up again or would create problems in saved data. Sometimes that would cause the system to stop functioning, or to send bad data to other systems, causing them to stop functioning. Couple this with the fact that there are embedded computers in our power plants, cell phone networks, even cars. How would we even fix those? So we had the possibility of all computing devices ceasing to function or putting out bad data, hence death and destruction the likes of which the world has hardly known.
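To make the first link in that chain concrete, here is a minimal sketch (hypothetical, not taken from any real system) of the classic two-digit-year bug: code that stores only the last two digits of the year gives nonsense answers once the calendar crosses from 99 to 00.

```python
def years_elapsed_buggy(start_yy: int, end_yy: int) -> int:
    """Elapsed years using two-digit years, as much 20th-century code did."""
    return end_yy - start_yy

def years_elapsed_fixed(start_yyyy: int, end_yyyy: int) -> int:
    """Elapsed years using full four-digit years."""
    return end_yyyy - start_yyyy

# An account opened in 1999 and checked in 2000:
print(years_elapsed_buggy(99, 0))       # -99: nonsense that downstream code may choke on
print(years_elapsed_fixed(1999, 2000))  # 1
```

Whether that -99 merely produces a wrong report or actually crashes the program depends entirely on what the surrounding code does with it, which is exactly where the doomsday story starts piling assumption on assumption.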

But this cascade didn’t happen, and it’s easy to see in retrospect why. A small probability of a small probability of a small probability is so close to zero that it doesn’t matter anymore. Yet at each stage, our fear tells us to think, “Yeah, but what if it was that way, then…”. We cascade down until we start seeing it as many people worded it: the Y2K bug will cause many computers to crash and never come back online. This is the Y2K fallacy. It’s the narrative fallacy (we believe stories more than facts) combined with hasty generalization (generalizing from the small to the large without properly taking into account how things change).
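The arithmetic behind “a small probability of a small probability” is worth seeing once. The stage probabilities below are illustrative guesses, not measurements; the point is only how fast the product shrinks when the stages must all happen together.

```python
# Each stage of the doomsday story must succeed for the full cascade to occur.
# With independent stages, the joint probability is the product.
stages = {
    "program misreads the 00 rollover":      0.10,
    "misread causes a crash":                0.10,
    "crash corrupts data or blocks restart": 0.10,
    "failure cascades to other systems":     0.10,
}

p_cascade = 1.0
for stage, p in stages.items():
    p_cascade *= p

print(f"Chance of the full cascade: {p_cascade:.4%}")  # 0.0100%
```

Even granting each stage a generous one-in-ten chance, the full story lands at one in ten thousand, which is why believing each link separately leads us so far astray.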

It’s easy for us to tell ourselves similar stories about the things we are most worried about or most hopeful for — focusing on a small thing and telling ourselves the story of the cascade, believing each stage. When we find ourselves in this situation, we should step back, consider the likelihood of each stage rationally, try telling ourselves the opposite story, and see the situation for what it is. We’re probably not going to die (at least in any way that we see coming), and we probably also won’t land all three of the big whales on our horizon and become multi-billionaires — unless we’re fixing bugs for someone else’s terror. 😀