Monday, 17 March 2008

As scientists observe the universe and its contents, other academics observe the scientists. Prof. Harry Collins is one of these scientific voyeurs who peer through the crack in the ceiling of knowledge-building and record how it is done. My girlfriend and I went to a little talk of his at the Exeter quays last week.

Having spent a long time with gravitational-wave physicists, he has given man-birth to Gravity's Shadow (2004), the sort of book you could knock an ox out with, running to some 870 pages. During his time with these experts he started thinking about the nature of expertise. This is the subject of his (and Robert Evans') most recent, pithier offering, Rethinking Expertise (2007).

His talk was a summary of one of the book's points. In his forthright, laddish way he sketched a quick and basic history of the philosophy and sociology of science. Without technical labels, which I add back in here, he covered falsification (Popper, 1959, esp. pp. 27–48) and the problems with it (Kuhn, 1977; Lakatos, 1970, 1974; Maxwell, 1972; Meehl, 1978, 1990; Putnam, 1974), including the problem of facts (who is to know if it is the fact or the method used to get to that fact that is wrong?). With this breakdown in philosophical robustness, trust in experts dwindled.

In a nod to Feyerabend's (1975) Dadaistic 'anything goes' idea, the suggestion was that all trust was lost. In a Feyerabendian world, science is no different from other things like Jade Goody's opinions or reading the future by dropping fruit on the floor and examining the patterns. In this scenario, peer-reviewed science is no sounder, methodologically, than personal experience. I strongly agree with Collins that a society like this is "not one you would like to live in"; hell is Feyerabend's world.

His concern is that despite the intellectual repugnance of this scenario, it is gaining some currency under the idea that "ordinary people are wiser than experts in some technical areas". As an example he cited the case of MMR and autism, where a dodgy paper (Wakefield et al., 1998) and the media forged and perpetuated a scientific controversy where none existed. In his words, the link between the two on the available evidence is "zero" (he is right). To this, one woman in the audience shouted "That's not true"; she knew someone whose child had both the MMR vaccine and autism.

The whole force of Collins' point was evident in this little bit of heckling. It is stunningly simple (as basic as common sense gets, some might think), but no less important for that: all else being equal, we should listen to "those who know what they are talking about". In this case the evidence said there was no link; the co-occurrence of the MMR vaccine and autism in the child she knew was irrelevant. If she had retorted, "What, it is just coincidence that the child has autism and had the vaccine?", the answer can only be "Yes, precisely that: a coincidence."

I was reminded of an episode of House in which House lambasts a junior doctor for choosing a treatment option based on personal factors, despite the fact that it saved the patient's life. This is because, statistically, more people would be killed by that particular treatment over time than by the evidence-based option.

Had the junior doctor used the evidence-based (correct) option, the patient would have died. Nevertheless, it would have been the right choice. Experts must be allowed to be experts, even if they are occasionally wrong, because the cost of having no experts is far too great. In the case of MMR, lives are at risk.

Collins and Evans call expertise "the pressing intellectual problem of the age". It's certainly a biggie. And as opinions on technical issues froth up with ever more user-generated content online, I think their point, although simple, is very important. No one is cleverer than everyone. But that does not mean everyone is cleverer than experts.