Why Do Many Reasonable People Doubt Science?

The scientific method leads us to truths that are less than self-evident, often mind-blowing, and sometimes hard to swallow. In the early 17th century, when Galileo claimed that the Earth spins on its axis and orbits the sun, he wasn’t just rejecting church doctrine. He was asking people to believe something that defied common sense — because it sure looks like the sun’s going around the Earth, and you can’t feel the Earth spinning. Galileo was put on trial and forced to recant. Two centuries later Charles Darwin escaped that fate. But his idea that all life on Earth evolved from a primordial ancestor and that we humans are distant cousins of apes, whales, and even deep-sea mollusks is still a big ask for a lot of people. So is another 19th-century notion: that carbon dioxide, an invisible gas that we all exhale all the time and that makes up less than a tenth of one percent of the atmosphere, could be affecting Earth’s climate.

Even when we intellectually accept these precepts of science, we subconsciously cling to our intuitions — what researchers call our naive beliefs. A recent study by Andrew Shtulman of Occidental College showed that even students with an advanced science education had a hitch in their mental gait when asked to affirm or deny that humans are descended from sea animals or that Earth goes around the sun. Both truths are counterintuitive. The students, even those who correctly marked “true,” were slower to answer those questions than questions about whether humans are descended from tree-dwelling creatures (also true but easier to grasp) or whether the moon goes around the Earth (also true but intuitive). Shtulman’s research indicates that as we become scientifically literate, we repress our naive beliefs but never eliminate them entirely. They lurk in our brains, chirping at us as we try to make sense of the world.

Most of us do that by relying on personal experience and anecdotes, on stories rather than statistics. We might get a prostate-specific antigen test, even though it’s no longer generally recommended, because it caught a close friend’s cancer — and we pay less attention to statistical evidence, painstakingly compiled through multiple studies, showing that the test rarely saves lives but triggers many unnecessary surgeries. Or we hear about a cluster of cancer cases in a town with a hazardous waste dump, and we assume pollution caused the cancers. Yet just because two things happened together doesn’t mean one caused the other, and just because events are clustered doesn’t mean they’re not still random.
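The point that clustered events can still be random is easy to demonstrate. A minimal sketch (the town counts and case numbers are hypothetical, chosen only for illustration): scatter cases uniformly at random across equal-sized towns, and pure chance alone still hands some towns far more cases than the average.

```python
import random

# Hypothetical illustration: place 1,000 "cases" uniformly at random
# across 100 towns of equal population. Even though every town has the
# same underlying risk, chance alone produces apparent clusters.
random.seed(42)
counts = [0] * 100
for _ in range(1000):
    counts[random.randrange(100)] += 1

print("average cases per town:", sum(counts) / len(counts))
print("most cases in one town:", max(counts))
print("fewest cases in one town:", min(counts))
```

The "worst" town looks alarming next to the average, yet nothing but randomness produced it, which is why a raw cluster, on its own, proves nothing about causes.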

We have trouble digesting randomness; our brains crave pattern and meaning. Science warns us, however, that we can deceive ourselves. To be confident there’s a causal connection between the dump and the cancers, you need statistical analysis showing that there are many more cancers than would be expected randomly, evidence that the victims were exposed to chemicals from the dump, and evidence that the chemicals really can cause cancer.
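The first of those requirements, "many more cancers than would be expected randomly," can be made concrete. One standard way to frame it is a Poisson model: given how many cases chance would produce, how likely is a count at least as high as the one observed? A minimal sketch, with purely hypothetical numbers:

```python
import math

def poisson_excess_probability(expected: float, observed: int) -> float:
    """Probability of seeing `observed` or more cases by chance alone,
    assuming cases occur randomly at a known expected rate (Poisson model)."""
    # P(X >= observed) = 1 - P(X < observed)
    below = sum(math.exp(-expected) * expected**k / math.factorial(k)
                for k in range(observed))
    return 1.0 - below

# Hypothetical town: 8 cancers observed where 5 would be expected by chance.
p = poisson_excess_probability(expected=5.0, observed=8)
print(f"chance of 8 or more cases when 5 are expected: {p:.3f}")
```

With these made-up numbers the excess probability comes out around 13 percent: an 8-versus-5 cluster is nowhere near rare enough, by itself, to rule out chance, let alone establish a cause.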