Uncertainty, Science, and the Edge of the Knife

This is a post that’s been kicking around in my head for a while. As those close to me know, I am prone to thinking in analogies—often of the overextended kind. But I think this one is pretty apt. So bear with me here…

In my opinion, the scientific method, or maybe the scientific attitude, is one of the great things we humans have created. At its root, it is nothing more than the willingness to set aside our biases and preconceptions as best we can, look carefully at what mother nature is showing us, and admit when she shows us that we were wrong. This method and attitude have had some spectacular successes in the past two hundred years. They have allowed us to do things like go to the moon, make plastics, and invent the iPhone. And the fact that they have done all this for us has made many people hope they can inform us on other questions of social importance. How many fish can we take out of the ocean? How much exhaust can we pump into the atmosphere? How much of this chemical will give me cancer? How many people can live a decent life on our small planet?

Unfortunately for anyone expecting an exact answer to any of these questions, science never yields exact answers. The great triumphs of physics and chemistry during the first couple of centuries of scientific inquiry may, in retrospect, have spoiled us by giving answers as precise as they did. Once you start trying to use the scientific method to figure out big, complex, dynamic things like ecosystems and societies, you will no longer have the luxury of ignoring uncertainty. The science is not less true, but it is much less certain, and you will have to learn to live with that uncertainty.

Which brings me to my analogy. I have worked as a line cook in three restaurants, two in New York City, and before that one in Carmel Valley, California. I washed up at the one in CV with very little cooking experience, and learned on the job. One of the first things I learned was knife skills.

Like probably most people out there, I had an understandable respect for knives. Especially high-carbon, folded steel knives hand forged in Japan and sharp enough to shave the hair off the back of your hand. So when told to slice some ungodly number of vegetables as thin as I could (and hurry up, dammit, we need them ten minutes ago!), I did what any reasonable person would do, and kept all my fingers as far away from the knife edge as I could, like so:

Keep doing this and you will eventually cut off a finger.

And I cut myself, more times than was cool. Run to the bathroom, wash it out and wrap it up, and double-glove that hand for the rest of the night. Because the actual way to hold a knife is like this:

Guess who got to eat a handful of fine-dice carrots after writing this post?

Right hand choked up on the handle, so that you’re gripping the blade itself between your thumb and forefinger. Left hand curled up, holding whatever you’re chopping with the fingertips, and the knuckles right up against the flat of the blade. I remember seeing Julia Child do this on PBS when I was a kid, and thinking it was absolutely crazy. It looked like an open invitation to chop off a finger. But by god, it works. That terrifying closeness of your fingers to that blade’s edge means that you know exactly where it is, and have perfect control over it.

Learning to practice science is not unlike learning to use a knife. The scientific method, knife-like, can separate truth from untruth. But there is a dangerous zone of uncertainty in the middle. If you aren’t willing to get cozy with this uncertainty, you will cut yourself.

Scientists, and especially those of us in the truly difficult fields like ecology, have to learn to deal with uncertainty. We live and breathe it. Every single estimate of some quantity in nature—the average wingspan of some bird species, the growth rate of a fish, the number of trees in a forest—will come with an accompanying estimate of that estimate's uncertainty. Every conclusion we draw comes followed by a pair of parentheses containing the probability that we got that conclusion wrong. We spend at least as much effort on quantifying uncertainty as we do on quantifying the actual value we're interested in.
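To make that a little more concrete, here is a toy sketch of what "an estimate plus an estimate of its uncertainty" looks like in practice. The numbers are entirely made up, and the 95% interval uses a simple normal approximation rather than anything a careful ecologist would actually publish:

```python
import math

# Hypothetical wingspan measurements in cm (made-up numbers for illustration).
wingspans = [21.3, 19.8, 22.1, 20.5, 21.9, 20.2, 21.0, 19.5, 22.4, 20.8]

n = len(wingspans)
mean = sum(wingspans) / n

# Sample variance and the standard error of the mean.
variance = sum((x - mean) ** 2 for x in wingspans) / (n - 1)
std_err = math.sqrt(variance / n)

# A rough 95% confidence interval (normal approximation).
low, high = mean - 1.96 * std_err, mean + 1.96 * std_err

print(f"mean wingspan: {mean:.1f} cm (95% CI: {low:.1f} to {high:.1f} cm)")
```

The point is the last line: the honest scientific answer is never just "21 cm," it's "21 cm, give or take this much, with this probability of being wrong."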

This uncertainty—and it is worryingly large on many questions of worrying importance—can be deeply unsettling for those not used to it. These include engineers, politicians, and regular people used to hearing "scientists announced today" on the evening news. It is easy to mistake uncertainty for a lack of understanding, or for incompetence. Why should we trust climate science if it can't say exactly how much the planet is going to warm? Why should we trust a conservation biologist who can't tell us exactly how many endangered animals there are? Why not drill, baby, drill, if we aren't even sure that a platform will ever fail?

The problem with that is the same as holding the knife by the end of the handle. You think you’re avoiding the dangerous part, but it’s still there. You’re just ignoring it, and thus don’t know where it is. This is maybe one of our society’s major misconceptions of science: that it is always exact, and that if it isn’t, there is something wrong with it. We haven’t learned to deal well with the uncertainty inherent in answers to the difficult questions, and as a result we cut ourselves, over and over.

I'm not sure how to change this, besides writing long blog posts on it and explaining it in detail to anyone who gets me talking about it after a few beers. There are some other attempts out there—the UW's excellent weather probcast is one that springs to mind. The "cones of uncertainty" that NOAA puts around its hurricane tracks are another. I guess it makes sense that meteorologists are ahead of the curve on this one, since weather and weather forecasts are things that everyone deals with daily, and that are inherently uncertain. With other problems, it will be harder, I think. If anyone has ideas, I would love to hear them.