Wednesday, January 31, 2007

Having recently read all the Sherlock Holmes stories (not for the last time, I suspect), I can assure the reader that Holmes never utters the words, ‘Elementary, my dear Watson’. ‘Elementary’ sometimes, and ‘My dear Watson’ often, but never together. Conan Doyle's son used it in an attempt to prolong Holmes’ career, though not to any enduring effect.

Of course, that should not stop anyone ‘quoting’ Holmes/Conan Doyle to that effect. It is a very good phrase. In John Lennon’s case, it was even embellished - twice.

As Umberto Eco has observed, William of Ockham (c. 1285-1349) was clearly an ancestor of Sherlock Holmes (fl. 1887-1927). When the famous detective asserted that:

How often have I said to you that when you have eliminated the impossible, whatever remains, however improbable, must be the truth? (Arthur Conan Doyle, The Sign of Four)

… he was only reiterating his philosophical forebear’s famous dictum that one should always prefer the hypothesis which calls for the fewest assumptions. Or, in Ockham’s own words:

Entia non sunt multiplicanda praeter necessitatem

- which is to say, entities should not be multiplied unnecessarily. It’s an attractively simple, all-purpose principle, a sort of intellectual Swiss Army knife, and it’s interesting that such otherwise dissimilar figures as Holmes and Ockham should agree on this issue. Not only did they live six centuries apart but, as nominalist philosopher/amateur coiffeur and consulting detective/cocaine addict respectively, their experience could scarcely have been more different.

At first sight it seems that this most striking convergence must surely betoken a profound truth. Surely such a preposterous yet consonant mésalliance can only rest on a deeper meaning? But does their agreement indicate a verity spanning half a millennium and more, or merely a prejudice so commonplace that only the most foolhardy - or the most brilliant - would think to challenge it? Mere endurance is not evidence, one way or the other. After all, slavery was considered the natural order for millennia.

What should we do when ‘impossible’ turns out to be only the name of that part of reality for which our prejudices have no place, to sustain which we therefore have to introduce ever more - and ever more incredible - assumptions? Should we be whipping out that Swiss Army knife again when what the situation really calls for is an intellectual linear accelerator? Is the principle of parsimony merely a principle of false economy?

For example, there is the question of cosmology. Leaving aside the obvious facts that the Earth is flat, stands still, and has the Sun rotating around it once a day (to which either may have subscribed), Holmes and Ockham would both have taken a dim view of Copernicus. In fact, according to Watson, Holmes was unaware that the Copernican revolution had ever taken place, so their intellectual alliance is obviously all the closer. And one cannot help but feel that Ockham would have felt much the same, if not about a heliocentric universe as such then certainly about the rather radical re-jigging of the cosmos Copernicus was proposing.

In assessing the change wrought by Copernicus, it is important to recognise why his theory was so appealing. First and foremost, Copernicus’ solar system did not work any better than Ptolemy’s, and indeed was in most respects an attempt to save the Ptolemaic fundamentals such as a closed space, the circle as the perfect form of celestial motion, the epicycle and the eccentric as basic explanatory tools, and so on. Indeed, Copernicus’ account predicted the positions of the planets rather worse than Ptolemy’s. Nor was it more economical: not only did it call for exactly the same suite of spheres and epicycles, but now there were 48 of the latter instead of just 40.

In fact, the only major difference was that the solar system was now centred on the Sun rather than the Earth. No basic change in the fabric of the heavens, no shift of method. Only a change in the arrangement of things. Of course this was pretty reasonable, given the appalling implications of a truly unbounded space combined with the lack of stellar parallax. In other words, there were almost no additional ‘entities’ or assumptions, yet what was needed was the complete re-conceiving of the universe - something which both Ockham and Holmes would have reviled.

Thus, the entire history of astronomy between Ptolemy’s atlas of the Heavens and Copernicus setting to with a chainsaw consisted precisely of the kind of minimalist thinking Ockham called for - the same structure exactly, only with more or fewer epicycles. The problem is that possible/impossible is not the only dimension of knowledge. It might be so, if we already knew all the dimensions of reality, and could therefore derive every possible truth from first principles. But we don’t, so we must respect Holmes’ and Ockham’s right to believe what they please, but concur with Aristotle that:

A likely impossibility is always preferable to an unconvincing possibility. (Poetics, 24, 1460).

For, as an only marginally less prominent philosophical figure has explained:

The impossible often has a kind of integrity to it which the merely improbable lacks. How often have you been presented with an apparently rational explanation of something which works in all respects other than one, which is just that it is hopelessly improbable? Your instinct is to say, ‘Yes, but he or she simply wouldn’t do that.’... The [impossible] merely supposes that there is something we don’t know about, and God knows there are enough of those. The [improbable], however, runs contrary to something fundamental and human which we do know about. We should therefore be very suspicious of it and its specious rationality... If it could not possibly be done, then obviously it had been done impossibly. The question is how? (Douglas Adams, The Long Dark Tea Time of the Soul, p. 132)

As usual, fiction is so much more truthful than mere fact (leaving aside the debatable conclusion...). ‘The facts’ are so often little more than a rag-bag, picked over now and then as circumstances dictate; great fiction, on the other hand, is deduced with geometric precision from a central truth, an overriding vision of the inexorable logic which underpins the merely correct.

And from time to time a revolution occurs that reminds us that, beneath the seeming self-evidence of the possible and the impossible, there is a more or less hidden framework of concepts which determines what can even exist for us while thinking and reasoning in that mode. And there is no guarantee that the Universe is confined by the concepts which currently confine our minds. Thus, Copernicus’ universe was not merely unknown to Ptolemy; he might well have found it inconceivable. Likewise the relationship between Newton and Einstein and, I suspect, Darwin and quite a few eminent neo-Darwinians.

In short, intellectual history is the history not of intellectual accumulation but of intellectual struggle. And that struggle is not only a struggle to wrest truth from the Universe, but also a struggle to prise the beams from our eyes. Hence the transmutation not only of one scientific method into another, but also of base common sense into science - and, perhaps, science into some form of reason for which science and morality, politics and art are all one. The same goes for cognitive development.

Now what has all this to do with management? It really comes down to the all but universal management mantra that ‘if you can’t measure it, you can’t manage it’. I suspect that this really originates from Lord Kelvin, who once wrote that:

When you can measure what you are talking about and can express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind.

Ah, Kelvin. The man who used the latest thinking in physics to prove that Darwinism was impossible. But what is wrong with his emphasis on measurement? Well, as John Maynard Keynes put it:

Am I right in thinking that ... the statistical method ... essentially depends on ... having furnished, not merely a list of the significant causes, which is correct so far as it goes, but a complete list?

For example, suppose three factors are taken into account, it is not enough that these should be in fact verae causae; there must be no other significant factor. If there is a further factor, not taken account of, then the method is not able to discover the relative quantitative importance of the first three.

If so, this means that the method is only applicable where [one] is able to provide beforehand a correct and indubitably complete analysis of the significant factors. The method is neither one of discovery nor of criticism.
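Keynes’ objection - that the statistical method presupposes a complete list of the significant causes - can be made concrete with a toy regression. The sketch below (Python with NumPy; the variables and coefficients are invented for illustration) fits the same data twice: once with both genuine factors included, once with one of them omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Two genuine causes of y; suppose we can only 'measure' x1.
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)        # a second cause, correlated with the first
y = 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

# Correct model: the complete list of significant factors.
X_full = np.column_stack([x1, x2])
b_full, *_ = np.linalg.lstsq(X_full, y, rcond=None)

# Misspecified model: x2 left out of the analysis.
X_part = x1.reshape(-1, 1)
b_part, *_ = np.linalg.lstsq(X_part, y, rcond=None)

print(b_full)   # close to the true values [2, 3]
print(b_part)   # x1's coefficient silently absorbs x2's effect: roughly 2 + 3*0.8 = 4.4
```

The misspecified fit does not fail; it simply attributes the omitted factor’s influence to the factor that was measured, and nothing in the output warns you that this has happened. That is exactly Keynes’ point: the method cannot discover which factors are missing, so it is neither one of discovery nor of criticism.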

In other words, Kelvin is assuming what he should be proving, namely that he knows what he is measuring, rather than using measurement to decide what he is looking at. And that is exactly the fallacy of ‘if you can’t measure it, you can’t manage it’ - the view that we already know how things work, and measurement will only add to the precision of our knowledge. And indeed it will - but it won’t add anything to its accuracy unless we understand the things we are measuring. Which is precisely what we think we are using measurement to do...