It is true that repetition of denialist arguments is a strategic error, and that the repetition itself can reinforce those arguments. One has to keep this in mind when dealing with nonsense and debunking it, so as not to fall into the trap of merely fisking it, which can defeat the purpose of your writing – to decrease the amount of BS in the world.

However, some knowledge of the history of denialism is useful in this discussion. Let us take, for example, the tobacco companies. The publication of internal memos from the tobacco companies has proven definitively that the companies were engaged in a strategy of willful deceit. Documents from as early as 1963 (PDF) demonstrate conclusively that the companies were aware that tobacco is addictive, that tobacco causes disease including emphysema and cancer, and that they were in the business of selling an unhealthy drug-delivery device.

Despite these facts, which were well known to them, they engaged in a campaign of disinformation to fight labelling requirements, smoking restrictions, and regulatory control of cigarettes for over 30 years. Now that these documents have come to light, their tactics have been studied in detail by various researchers, and one can see how effective denialism can be when it goes unrecognized. Careful studies have definitively shown that denialist tactics were being used to thwart public health; for instance:

ICOSI began as a conspiracy among seven tobacco company chief executives to promote internationally the fiction of a “controversy” regarding smoking and disease [15]. It quickly developed into a multi-million dollar global organization with a new name, expanding membership, and a broader mandate. Relying on a network of centralized staff, member company senior personnel, consultants, lawyers, and NMAs, ICOSI’s successor, INFOTAB, operated as an anti-WHO. Its mission was to systematically thwart public health by globalizing “doubt” not only about smoking and disease, but also about the economic costs of tobacco, the social costs of smoking, the motivations of tobacco control advocates, the relationship between smoking and advertising, and the need for smoking restrictions. Where it succeeded, INFOTAB unquestionably facilitated the spread of the global tobacco disease epidemic.

In addition, the documents have shown how tobacco companies carefully cherry-picked their research, bought “journalists” to promote their agenda (including Fumento and Milloy), created fake research organizations, and used libertarian denialist think tanks such as Cato, the Heritage Foundation, and the Competitive Enterprise Institute to further disseminate lies about the healthiness of smoking and the risks of environmental tobacco smoke (see here). All the while, the tobacco companies hid their agenda and their financial ties to these journals, organizations, and journalists.


Denialism as a strategy to frustrate public policy and public perception of a problem is demonstrably effective according to the scientific literature. This means it cannot be ignored. Mooney’s choice of the Heartland Institute’s conference as an example of an event that should be ignored is a particularly poor one. The opposition to global warming science by think tanks like the Heartland Institute and CEI (a known offender on tobacco denialism) is part of a similar strategy by interested parties to create disinformation and public distrust of the science of global warming. We know these strategies can work; the tobacco companies have proven it. The key isn’t ignoring their efforts to sow the seeds of disinformation in the popular press and influence political leaders, it’s pointing out the strategy itself. Real science doesn’t progress by press release, cherry-picking, and nit-picking the work of other scientists to death. Nor does it progress by political conferences hosted by denialist organizations under the guise of a scientific meeting. Pointing out the falseness of these strategies, their previous use by other liars seeking to deceive the public, and the involvement of some of the same liars, like Fred Singer (a tobacco/cancer denialist) and Steven Milloy, a proven shill, can be very helpful in illustrating for people the difference between real science and denialism.

So while I agree that careless debunking of nonsense can accidentally help promote it, and this is a scientifically demonstrated phenomenon, the efficacy and organization of denialist campaigns against science are also definitively documented in the literature. They cannot be ignored. What is needed is careful demonstration of the dishonesty of these groups’ tactics, and an awareness of the general use of these techniques by highly organized denialists to gain entry of their anti-science agenda into the popular press and political sphere. PZ, I think, summarizes it pretty well:

We must counter the superficial advantages of the anti-science side by directly countering their claims. Look at it this way: if one side is promising a million dollars for free, and my side is promising an opportunity for hard work, I don’t win by announcing that I don’t have a million dollars, but I do have some tools. What I have to do is show that they don’t have a million dollars — I have to go straight for the dishonest advertising practices of my competition and expose them.

I would emphasize: go straight for the dishonesty and the practices. While debunking silly nonsense is fun, and I’ve certainly indulged, it’s not as helpful as demonstrating the fundamental dishonesty of organized anti-science. Ultimately our goal is educating people about what real science is and, importantly, what real science is not, and how to tell the difference. Merely showing that a given argument is silly or incorrect doesn’t help a great deal. However, showing people how to tell the difference between these lies and the real deal is a far more valuable skill to impart, and I think that’s what the majority of us are doing when we confront nonsense from denialists everywhere.

Comments

I’m not sure that you are really quite agreeing with PZ. Sure, you both are saying to go straight to pointing out the dishonesty of the other side, but he is also advocating directly countering the claims, while you seem to be more skeptical as to how helpful that is.

Ms. Smith at ERV contrasted the Mooney of 2006 and the Mooney of 2008. The question is, what happened in the interim years. The answer is that Mr. Mooney met Matthew Nisbett and the former was brainwashed by the latter. It is unfortunate that Mr. Mooney didn’t move to Los Angeles in 2006.

Ignoring irrational thinking is how we got into this mess in the first place, and the answer is not more of the same. You shouldn’t teach people critical thinking skills without demonstrating how they can be applied to separate what is true from what appears to be true. And you can’t point out the “dishonest” tactics unless you can also show they were spouting **false** claims, for unless the claims are scientifically false, the dishonesty is moot.

JJ, what I said was that debunking done carelessly isn’t helpful and may hurt. I don’t see many examples of that here on the scienceblogs – that is, just weak fiskings of run-of-the-mill crankery.

By carefully I mean that it is important to avoid excess repetition of the myth or topic being debunked, to not pick on tiny little cranks and draw them into prominence (it’s better to wait for news coverage or some broad exposure rather than spawn the coverage yourself), and to demonstrate not only why what they say is wrong but also why the source is fundamentally unscientific and worthy of mockery.

PZ does a good job of this. He points out not only why cranks are wrong but why, fundamentally, what they do isn’t science. It’s an important aspect of debunking. I think the sciencebloggers do a good job in their concerted efforts to demonstrate that people like DI or CEI are not only wrong, but also fundamentally crackpots and scoundrels using the techniques of liars and rogues.

I think you have this basically right here, Mark. One thing I would point out is how often denialist FUD relies on systematic character assassination of experts in the field. Portraying evolutionary biologists as a bunch of atheists who want to brainwash your kids, for example. In addition to going after the tactics, it would be a good idea to make sure that we reinforce an image of those who employ them as industry PR-whores and liars.

“PZ does a good job of this. He points out not only why cranks are wrong but why fundamentally what they do isn’t science. It’s an important aspect of debunking. I think the sciencebloggers do a good job in their concerted efforts to demonstrate not only are people like DI or CEI wrong, but also fundamentally crackpots and scoundrels using the techniques of liars and rogues.”

Well, you already made my point for me…and I didn’t see it. That’s what happens when you keep a tab open too long before commenting.

Mark H: “By carefully I mean that it is important to avoid excess repetition of the myth or topic being debunked, to not pick on tiny little cranks and draw them into prominence (it’s better to wait for news coverage or some broad exposure rather than spawn the coverage yourself)”

This isn’t much different from what Mooney was saying, except that he was dealing not so much with cranks that were tiny to begin with, but with cranks that have become so weak that to answer them would be to draw them back into prominence.

That does not include the HI, then. The AGW cranks have the funds and the organizational skills to get their nonsense into the media without our help. Granted, this conference was largely treated with contempt by most mainstream media sources – progress if I’ve ever seen it. But quite a few reported on it favorably, and due to no action of ours.

I’ve already commented on this over at Chris’ blog, but basically he is right that we shouldn’t debate anti-scientists (though both ERV and PZ have shown that it can be done effectively); still, we need to address their falsehoods. Science is on our side, but that doesn’t help if people are unaware of it.

Humor, through sarcasm and clever analogy, also helps make a mind receptive to facts, and it keeps people from automatically erecting the impenetrable faith shields that go up as soon as someone detects a threat to their most cherished delusions – the things they WANT to believe are true.