A recent Pew study found that nearly 40% of Americans say they have doubted a medical professional’s opinion or diagnosis because it conflicted with information they’d found online. [UPDATED: The original poll was commissioned by health care consulting firm Envision Solutions, LLC, and published in 2008. You can see an overview of the results here]

Yes, this seemingly innocuous poll in fact represents the culmination of a centuries-long trend toward both the freedom to seek knowledge - and the freedom from knowledge. This poll explains at least part of what's wrong with our health care system, and almost all of what's wrong with our debate over climate change. Why? First, a digression.

Growing up as a Southern Baptist preacher's kid in Oklahoma, where Sen. Jim Inhofe regularly wins 57% of the vote and the State House floats resolutions denouncing evolution, I've always been fascinated by the intersection between science and public opinion. Like all Protestants, we Baptists have a strong tradition of the "priesthood of all believers": every believer's right to access God (or Truth) for him or herself, without the intermediation of a priest (or expert). It's likely that this religious doctrine is at the root of many of our secular (though no less sacred) democratic values; if the peasant's interpretation of scripture is just as valid as the priest's, you're not far from "all Men are created equal."

Many in America seem to take the equality of Man to imply an equality in Men's opinions. In fact, this faith in the wisdom of the common man over expert knowledge pervades American politics - especially among conservatives. William Buckley famously quipped, "I should sooner live in a society governed by the first 2000 names in the Boston telephone directory than in a society governed by the 2000 faculty members of Harvard University." Bill Kristol goes further in a column praising Sarah Palin, entitled "Here the People Rule":

Most of the recent mistakes of American public policy, and most of the contemporary delusions of American public life, haven’t come from an ignorant and excitable public. They’ve been produced by highly educated and sophisticated elites.

Needless to say, the public’s not always right, and public opinion’s not always responsible. But as publics go, the American public has a pretty good track record.

Nobody likes to be made to feel dumb, but to some in America, a claim to privileged knowledge carries connotations of elitism - indeed, violating the very notion that all Men are created equal. As long as the Layman has access to the same raw information as the Expert, he should be able to come to an equal - or superior - conclusion.

And with the advent of the Internet, the Layman now has access to an explosion of information - much of it user-generated. Should this new-found wealth of information be celebrated or feared? Certainly, open access to accurate information can lead to more informed choices. But sourcing insight from the crowd instead of experts also has a dark side. For laypersons without the proper training and background knowledge to interpret raw information, too much of it can be confusing; in the hands of those willing to manipulate it for political gain, it is positively destructive.

And this brings us back to the Pew poll. More Americans are finding health information online; used properly, this information can help patients know when to seek treatment and which questions to ask of their doctors. But what happens when 40% of Americans go further, and stubbornly substitute their own judgment for that of the doctor? What happens when they refuse to believe a diagnosis that there's nothing wrong, or when they demand treatments they happened to read about online? Four things start to go wrong.

First, people make decisions based on misinformation. When people search for information online, they often get it from sources that aren't very credible. According to one study, "although 8 in 10 American adults have searched for healthcare information online, 75% refrain from checking key quality indicators such as the validity of the source and the creation date of medical information." One doctor points out (on WebMD, ironically), "A lot of the stuff on the Internet, especially on health-related bulletin boards, is pure impression and anecdote, and they just don't have a lot of scientific validity."

Second, you get a lot of hypochondriacs imagining symptoms they don't have. The phenomenon is so widespread there's even a term for it: "cyberchondria." Doctors describe the problem in a story on NPR:

The patients come in quoting commercials they've seen on TV, requesting pills or diagnostic tests, describing new treatments for diseases they're convinced they have.

"Five or six times a day, people come in saying, 'I looked this up on the Internet.' Or, 'I saw this and I wonder if I could have this?' " Moore says.

Sometimes her patients are right; more often they're wrong, she says. But Moore isn't judgmental about their self-diagnoses. She views it as a natural response to the ocean of health information that surrounds every modern person, and relates it to her own experience in medical school.

"There's a syndrome in medical school they teach us about called 'medical student syndrome,' " Moore says. When medical students learn about a disease for the first time, it's common for them to become convinced, at least temporarily, that they themselves are afflicted.

"Every time you start reading about this disease, you say, 'Oh my god, I have that!' Then you read about another disease and you say, 'Oh my god, I have that, too!' " Moore says.

"So, the same thing that triggers medical students to worry that they have these diseases is part of what triggers people watching television or surfing the Internet to believe they have these conditions. Continued re-exposure to suggestions of symptoms makes people look for things."

The problem, says Moore, is that it can take a lot of work to convince her patients that their own diagnosis is wrong.

This is backed by statistical evidence. For example, a recent longitudinal study of 515 individuals (White and Horvitz, 2008) found that searching for common symptoms online can lead web users to discover information on rarer, more serious diseases, escalating anxiety that they may have these diseases themselves. The study found this anxiety persists over time and affects medical decisions. The authors put it bluntly:

Benigeri and Pluye [2003] show that exposing people with no medical training to complex terminology and descriptions of medical conditions may put them at risk of harm from self-diagnosis and self-treatment. These factors combine to make the Web a potentially dangerous and expensive place for health seekers.

Third, you get higher health care costs. Around one out of every five patients is considered "difficult" by doctors; these patients "ignore medical advice, complain persistently, insist on unnecessary tests and do not express appropriate respect." It's not just the hypochondriacs demanding tests for diseases they don't have; you also get people asking for treatments they read about online or saw on TV advertisements - and most doctors oblige, lacking the time to explain to these "demanding patients" why they're wrong. NPR continues:

[Dr.] Zebley says that several times a week a patient comes in asking for a test that he is 99.99 percent sure would be a complete waste of time. But Zebley will almost always give the patient the test they request, even though he knows it will cost money and time…

So doctors will order you tests you don't need. And they will write you prescriptions for pills you probably shouldn't take — which is a huge problem with antibiotics, for example.

And, Zebley says, doctors even do operations, like back surgery, that they probably shouldn't do. They do it, he says, because you want it, have become convinced that you need it, and doctors fear that if they don't give it to you, they'll lose you.

It's unclear how much patient demands drive up costs in our health care system. [Dr.] Moore estimates that about 30 percent of the costs in her practice are driven by patient requests.

Last, you get a strained doctor-patient relationship. When patients, supposedly "empowered" by information they found online, don't get the diagnosis or treatment they wanted, they often grow suspicious of the doctor, suspecting the doctor is hiding something. Maybe they suspect financial gain, or some vast doctors' conspiracy - or maybe they perceive arrogance or a lack of concern in the doctor. Whatever it is, it's a real concern:

Barsky and Fallon say hypochondria often breeds suspicion and distrust between a sufferer and his or her physician. Some doctors may be too quick to dismiss the worries of hypochondriacs, and hypochondriacs are likely to ruin relationships with good physicians by second-guessing them from the start.

Hypochondriacs may "get suspicious when their doctor doesn't give them a referral or a test they ask for," says Fallon. "They can feel like they're not being listened to, and so they'll go shopping for another doctor and wind up repeating the process."

In other words, there's a reason we trust doctors' judgment over holistic healers' or our own: not everyone goes to med school, and therefore not everyone can make informed medical decisions. Cyberchondriacs and demanding patients both highlight some of the dangers of making scientific information available en masse to the public. The more information we laypeople have, the more we can feel like "experts," even if we lack the training or perspective to actually interpret the raw information. As a result of trusting our own judgment instead of experts', we as laypeople can become overconfident in our opinions, leading to very bad decision-making - for ourselves and society. Hypochondriacs, at least, are better off protected from information, placing their faith in doctors' educated opinions rather than their own.

Which, finally, brings us to Hackergate/Climategate. At this point, I feel like the parallels between health information and Climategate should be fairly obvious, but I'm going to make them explicit and give the climate conspiracy theorists their own term: "climochondriacs." Just as the web has created a host of cyberchondriacs, it seems also to have created a much more dangerous army of climochondriacs: people who become paranoid that global warming is a scam, based on information they find on the internet, and cannot be convinced by objective reassurance from experts. Given this description of the actual DSM-IV condition of hypochondria, the term is just too perfect:

The small fraction (1-5%) of the general population afflicted with the disorder hypochondria are particularly predisposed to the emergence of unfounded concerns, especially since they are often undiscerning about the source of their medical information [Barsky and Klerman 1983]. Studies have shown that hypochondriacs express doubt and disbelief in their physicians’ diagnosis, report that doctors’ reassurance about an absence of a serious medical condition is unconvincing, and may pay particular attention to diseases with common or ambiguous symptoms (p. 4).

The only question that remains: should scientists protect climochondriacs from raw information on climate change? Should scientists like Phil Jones and Michael Mann be forced to provide open access to their data, knowing climochondriacs will misinterpret it?

That would be easy to answer - if prominent skeptics were willing to show a little healthy respect for the experts by not deliberately misinterpreting their work. Unfortunately, this looks like too much to ask. A good example is the controversy over "adjustments" to raw climate data. For example, denial sites like WUWT have "revealed" that raw temperature data in New Zealand differs from official data. Never mind that the New Zealand scientists have very good reasons for adjusting the data - deniers like Anthony Watts just aren't impressed, and would rather trust some guy on the internet than listen to the experts who actually do climate science. To people who don't understand how climate science is done, "adjustment" sounds suspiciously like "fraud." (For a great example of this, check out this comment at Dot Earth, in which a skeptic who doesn't understand the definition of "temperature anomaly" wastes a climate scientist's time emailing back and forth to accuse him of fraud.)
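To see how easy the "temperature anomaly" confusion is to clear up, here's a minimal sketch of what the term actually means: each reading's deviation from a fixed baseline average, not an absolute temperature. The numbers below are entirely made up for illustration - they're not real station data.

```python
# Illustrative only: hypothetical readings, not real climate data.
# A "temperature anomaly" is a reading's deviation from a baseline
# mean -- it says how much warmer or cooler than "normal," not how warm.

baseline_years = {1961: 14.1, 1962: 13.9, 1963: 14.0}  # deg C, hypothetical
readings       = {2005: 14.6, 2006: 14.8, 2007: 14.7}  # deg C, hypothetical

# The baseline is just the average over some agreed-upon reference period.
baseline_mean = sum(baseline_years.values()) / len(baseline_years)

# Anomaly = reading minus baseline mean.
anomalies = {year: round(temp - baseline_mean, 2)
             for year, temp in readings.items()}

print(anomalies)  # each value is relative to the 1961-63 baseline, not absolute
```

An anomaly of +0.7 °C doesn't mean it was 0.7 degrees outside; it means that year was 0.7 degrees warmer than the reference period. Mistaking one for the other is exactly the kind of error that fuels fraud accusations.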

Another example is Anthony Watts's breathless unveiling of "the deleted data from the 'hide the decline' trick," with the deleted data highlighted in red. And what do you know, the red part trends downward. Of course, to people who actually do paleoclimatology, this is merely indicative of the well-documented "divergence problem" (explained here and here), and perfectly within the bounds of normal science. However, knowing that if this data were publicized, denial blogs would post big red graphs of the "decline" without explaining that it does not represent actual temperatures, the study's authors opted not to give "fodder to the skeptics." How do I know they used those words? Because Watts quotes them in the very same post in which he uses the study as "fodder." Maybe the scientists should have been more forthcoming, but Watts is sure giving them a good reason not to.

Note that I am not advocating blind deference to experts - experts can certainly be wrong too. But it is important that we as laypeople maintain a self-aware respect for the people who spend years of their lives studying complex issues, and recognize the limitations in our own understanding of the issue; this applies both to doctors and to climate scientists. I'm not a scientist, and haven't studied climate science in depth, but I have the self-awareness to give those who do study it the benefit of the doubt. ("Those who do" does not include self-proclaimed "auditors.")

Denialist commentators, on the other hand, often jump to the conclusion that scientists have no good reasons for making the adjustments to data that they do. The irony here is that while deniers call for skepticism and emphasize the uncertainties in climate science, they show such certainty in their own conviction that scientists they have never met are fraudulently falsifying data as part of a vast conspiracy to steal their money and grab power. Given these commentators' lack of training in the intricacies of climate science, they ought to at least give scientists the benefit of the doubt, and raise issues with scientists themselves before making unvetted - and potentially misleading - assertions in public. With the freedom to view proprietary data comes the responsibility to use it honestly.

Ultimately I think data should be as openly available as possible to avoid even the appearance of impropriety - climate scientists have the moral high ground, and should do everything to maintain that position. As long as deniers believe climate science is the product of a conspiracy, they will never trust any amount of evidence produced by climate scientists. The most important point is therefore to ensure deniers have no possible justification for their claims of a cover-up, even if it means giving them fodder for their misinterpretations of objective data. As Romans 12:20 puts it, "If your enemy is hungry, feed him; if he is thirsty, give him something to drink. In doing this, you will heap burning coals on his head."

5 comments:

Another factor is our bias toward favoring confidence over expertise. There is a lot of conflicting stuff on the Internet, but what people usually end up believing is that which is most confidently communicated as correct. I fell for this a couple years back. The same would apply for quack-health info and climate denial. Website says you've got X disease vs. doctor who says you show symptoms that indicate you might have Y or Z. Denial sites are 100% sure AGW is a hoax vs. scientists who say they're 95% confident about AGW, but sensitivity varies from 1.5-4.5 degC, and they're not all that certain about how much sea levels will rise and when.

Great point. And there's a double standard here. The same people who yell "fraud! It's a hoax!" with such certainty also complain the most loudly that scientists aren't acknowledging uncertainty. In other words, explain things accurately and acknowledge the *minor* uncertainties, and people aren't persuaded. Explain things confidently, and they accuse you of being dishonest.

There's also something called the "Dunning-Kruger Effect," which basically is the phenomenon of incompetent people being more confident, simply because they lack the self-awareness to recognize their lack of expertise. A better explanation of climate denial does not exist.

This is why I think the SwiftHack affair will make little difference to the inaction the world is already subconsciously committed to.

*e.g. flawed correspondent inferences: Yamal proxies were excluded by scientists in order to improve accuracy, which denialists interpret as foul play because the conclusion happens to produce a hockey-stick.

I've just been reading through a number of your posts. It's good to hear yet another voice of reason based on strong evidence. I'll certainly keep an eye out for more of your posts on climate change - and the post on Nickelback got me laughing also!

About Me

A Southern Baptist preacher's kid from Oklahoma, I currently do sales & marketing consulting in the DC area. I'd like to think I know something about messaging, and try to apply lessons from the business world to climate change, health care, and the affairs of state.