Introduction - first draft for comments please

Here's most of the introduction to the new book on Intellectual Black Holes. Comments please.

INTRODUCTION

Intellectual black holes

Wacky and ridiculous belief systems abound. One cult promises members a ride to heaven on board a UFO. Another insists the Earth is ruled by lizard-like aliens. Even mainstream religions have people believing absurdities. Preachers have promised 72 heavenly virgins to suicide bombers. Others insist the entire universe is just 6,000 years old (extraordinarily, polls consistently indicate this belief is currently held by about 45% of US citizens – that’s around 130 million individuals). And of course it’s not only cults and religions that promote bizarre beliefs. Significant numbers of people believe in astrology, the amazing powers of TV psychics, crystal divination, the healing powers of magnets, the prophecies of Nostradamus, and that the World Trade Centre was brought down by the US Government. There are even a handful who continue to believe that the Earth is flat.

How do such ridiculous views succeed in entrenching themselves in people’s minds? How are such wacky belief systems able to take sane, intelligent, college-educated people and turn them into the willing slaves of claptrap? How, in particular, do the true believers manage to convince themselves that they are the rational, reasonable ones and that everyone else is deluded?

This book identifies eight key mechanisms that can transform a set of ideas into a psychological fly trap – a bubble of belief that, while seductively easy to enter, can be almost impossible to reason your way out of again.

Cosmologists talk about black-holes, objects so gravitationally powerful that nothing, not even light, can escape from them. Unwary space travellers passing too close to a black-hole will find themselves inexorably sucked in. Ever more powerful engines are required to resist its pull, until eventually one passes the “event horizon” – the point of no return – and escape is impossible. My suggestion is that our contemporary cultural landscape contains, if you like, numerous intellectual black-holes – belief systems constructed in such a way that unwary passers-by can similarly find themselves drawn in, often never to escape. While those of us lacking robust intellectual and other psychological defences will be most easily trapped, even the most intelligent and educated of us are potentially vulnerable. Some of the world’s greatest thinkers have fallen victim. If you find yourself encountering a belief system in which several of these eight mechanisms feature prominently, you should be wary. Alarm bells should be going off and warning lights flashing. For you are now approaching the intellectual equivalent of a black-hole.

As the centuries roll by, such self-sealing bubbles of belief appear and disappear. Sometimes, many may fizz into existence in one place, because the conditions are just right (an age of superstition). Occasionally, one of these little bubbles may grow huge, perhaps encompassing an entire civilization, before dividing or deflating or popping or being subsumed by another bubble. The greatest threat such bubbles of irrational belief face, perhaps, is the flourishing of a rigorous, sceptical culture that encourages free thought and in which all beliefs are subjected to close critical scrutiny - an "Enlightened" society. However, some bubbles are able to flourish even within such a society.

Aim of this book

The central aim of this book is to help immunize readers against the wiles of cultists, political zealots and other purveyors of intellectual snake oil by, as it were, clearly setting out the tricks of the trade by which such self-sealing bubbles of belief are created and maintained. It reveals how an intellectually impregnable fortress can be constructed around even a set of patently ridiculous beliefs, providing them with a veneer of “reasonableness” and rendering them immune to rational criticism.

Most of us will have at some point experienced the frustration of trying to change the convictions of someone powerfully committed to a ridiculous belief, and will have come up against many of these strategies. My aim here is to provide an overview of eight key strategies, which I call:

In each case I (i) explain the strategy, (ii) diagnose exactly what is wrong with it, and (iii) provide illustrations of how it is applied.

It is worth clarifying seven things at the outset:

1. This book focuses particularly, though by no means exclusively, on religious examples of intellectual black holes. Why, given there are many non-religious examples from which to choose? My main reason for doing so is that while many belief-systems (e.g. political philosophies such as Marxism, New Age philosophies, belief systems involving dubious or bogus medical treatments, and belief systems centred on grand political conspiracies such as those involving 9/11) also employ various combinations of these eight mechanisms to ensnare minds, religions typically employ a wider range. Historically, the established religions have had a great deal of time and huge intellectual and other resources to deploy in refining their own particular versions of these strategies. They have, as a result, produced some of the most powerful and seductive intellectual black holes. They therefore provide some of the best illustrations.

2. I also want to stress that this book certainly does not argue that all religious belief-systems are essentially irrational. Several recent books have done that, of course. The aim of this book is different. It is not the content of religious belief systems that is attacked here, but the manner in which they are often bolstered and defended. It’s important to realize that any belief-system, including perfectly sensible belief-systems, can be bolstered and defended by means of the same eight mechanisms. To point out that a belief-system is both propped up and defended against intellectual threats by means of bullshit strategies is not yet to show that the content of that belief-system is itself bullshit. It’s worth remembering that many of the same strategies can be, and have been, employed to defend atheistic belief-systems (I’m thinking, in particular, of certain totalitarian atheist regimes). I am not, here, suggesting that atheism is intrinsically any more or less sensible than theism. However, if a belief system is fairly rational, its proponents won’t need to rely on the kind of dubious strategies outlined here in order to bolster and defend it. The fact that many religious people rely pretty heavily on many – in some cases all – of these eight strategies in order to generate the impression that their particular belief system is, at the very least, not unreasonable, would of course be neatly explained by the fact that their particular religious system of belief is, in fact, pretty unreasonable.

3. Third, not only are some atheists guilty of using such strategies to bolster and defend their atheism, some religious people are largely innocent. My aim is not to tar all religious people with the same brush. To say that religion may have produced many of the most dramatic and powerful intellectual black holes is one thing. To insist that every religious person is a victim is quite another. I’m certainly not suggesting that.

4. Fourth, we should acknowledge that those who fall victim to intellectual black-holes need be neither dim nor foolish. The sophistication of some of the strategies examined in this book demonstrates that those who develop and use them are often highly intelligent. They are, in many cases, clever strategies – sometimes very clever indeed. Nor need those who fall foul of intellectual black holes be generally gullible. Victims may, in other areas of their lives, be models of cautious acceptance, subjecting claims to close critical scrutiny, weighing evidence scrupulously, and generally tailoring their beliefs according to robust rational standards. If, after reading this book, you begin to suspect that you may yourself have fallen victim to an intellectual black-hole, there’s no need to feel particularly foolish. People far wiser and cleverer than either you or me have also become trapped. Neither need those who create or work to sustain intellectual black holes be particularly bad or deliberately deceitful people. Those who work hardest to sustain intellectual black holes are typically victims themselves. Yes, some intellectual black holes are deliberately fashioned by frauds and con artists. But in most cases, such bubbles of belief are a product of the ingenuity of honest and sincere people genuinely committed to the belief system at its core.

5. Fifth, notice that I am not suggesting that every intellectual black hole will exhibit all eight of the mechanisms outlined in this book. Some exhibit some, and others others. In chapter XX, I illustrate how different belief systems employ different combinations of the eight mechanisms, or place different emphasis on them. Belief in a dubious alternative medicine, for example, can be turned into something approaching an intellectual black hole by heavy reliance on just two mechanisms in particular: APRA and playing the mystery card. On the other hand, many religious belief systems employ many of the eight mechanisms – in some cases, all of them. Also note that intellectual black holes tend to be dynamic – the mechanisms used to sustain them are likely to shift and develop in response to new and differing rational threats to the belief system at their cores.

6. It is worth emphasizing that intellectual black holes lie at one end of a sliding scale. The fact is, almost all of us engage in these eight strategies to some extent, particularly when beliefs to which we are strongly committed are faced with a rational threat. And in fact, under certain circumstances, there is little wrong in using at least some of them in moderation (as I will explain). But that is not to say that every belief system is, then, an intellectual black-hole (in fact, defending a belief system against the charge that it is an intellectual black hole by maintaining that all belief systems are intellectual black holes, and thus no less reasonable/unreasonable, is itself a warning sign that one is dealing with an intellectual black hole – it is an example of the strategy I call “Going Nuclear”). What transforms a belief system into an intellectual black hole is the extent to which such mechanisms are relied upon in dealing with rational threats and generating an appearance of “reasonableness”. The more we start to rely on these kinds of strategy to prop up and defend our belief system, the more black-hole-like that belief system becomes, until a black hole is clearly what we have got. However, even if we have not fallen victim to an intellectual black hole, some of our belief systems may still exhibit an unhealthy reliance on the same strategies.

7. Lastly, like any analogy, the black hole analogy breaks down if pushed too far. There are differences between physical black holes and their intellectual equivalents. Here’s an obvious illustration: a physical black hole is something from which you can never escape, but people can and do sometimes escape from intellectual black-holes. Individuals do occasionally find the resources to think themselves clear of even some of the most seductive and powerful examples.

Other explanations for why we believe

This book examines eight key mechanisms by which belief systems can be transformed into intellectual black holes. It doesn’t attempt to explain why we are drawn to particular belief systems in the first place. Why, for example, is belief in a god or gods, and in other supernatural beings, such as ghosts, angels, dead ancestors, and so on, so widespread? These kinds of belief appear to be universal, and there is some evidence that a propensity or disposition towards beliefs of this kind may actually be innate – part of our natural, evolutionary heritage. The psychologist Justin Barrett (REF XX), for example, has suggested that the prevalence of beliefs of this kind may in part be explained by our possessing a Hyper-Active Agent Detection Device, or H.A.D.D.

The H.A.D.D. Hypothesis

Human beings explain features of the world around them in two very different ways. For example, we sometimes appeal to natural causes or laws in order to account for an event. Why did that apple fall from the tree? Because the wind blew and shook the branch, causing the apple to fall. Why did the water freeze in the pipes last night? Because the temperature of the water fell below zero, and it is a law that water freezes below zero.

However, we also explain by appealing to agents – beings who act on the basis of their beliefs and desires in a more or less rational way. Why did the apple fall from the tree? Because Ted wanted to eat it, believed that shaking the tree would make it fall, and so shook the tree. Why are Mary’s car keys on the mantelpiece? Because Mary wanted to remind herself not to forget them, so put them where she thought she would spot them.

Barrett suggests that we have evolved to be overly sensitive to agency. We evolved in an environment containing many agents – family members, friends, rivals, predators, prey, and so on. Spotting and understanding other agents helps us survive and reproduce. So we evolved to be very sensitive to them – overly sensitive in fact. Hear a rustle in the bushes behind you and you instinctively spin round, looking for an agent. Most times, there’s no agent there – just the wind in the leaves. But, in the environment in which we evolved, on those few occasions when there was an agent present, detecting it may well have saved your life. Far better to avoid several imaginary predators than be eaten by a real one. Thus evolution will select for an inheritable tendency to not just detect – but over-detect – agency. We evolved to have (or, perhaps more plausibly, to be) hyper-active agency detectors.

If we do have an H.A.D.D., that would at least partly explain the human tendency to feel there is “someone there” even when no one is observed, and so may at least partly explain our tendency to believe in the existence of invisible agents – in spirits, ghosts, angels or gods.

Now I am not here endorsing this particular explanation for widespread belief in such invisible agents (though I suspect there is some truth to it). The fact is that, even if we do possess an H.A.D.D., that would at best only explain the attractiveness of the content of some of the belief systems we will be examining. Many wacky belief systems, such as crystal healing or palmistry or numerology, involve no hidden agents at all. I mention the H.A.D.D. hypothesis only to illustrate the point that the eight mechanisms identified in this book for turning a belief system into an intellectual black hole are not intended to rival such psychological and evolutionary explanations for why we believe what we do. My claim is that once we find ourselves drawn to a belief system, for whatever reason, then these eight mechanisms may come into play to bolster and defend it.

Note that the H.A.D.D. hypothesis does not say that there are no invisible agents. Perhaps at least some of the invisible agents people suppose exist are real. Perhaps there really are ghosts, or spirits, or gods. However, if the H.A.D.D. hypothesis does correctly explain why we suppose that such invisible agents exist, then the fact that large numbers of us believe in the existence of such invisible agents supplies no evidence that any such agents exist. It will no longer do to say “Surely not all these people can be so very deluded? Surely there must be some truth to these beliefs, otherwise they would not be so widespread?” The fact is, if the H.A.D.D. hypothesis is correct, we are likely to believe in the existence of such invisible agents anyway, whether or not they exist. So the fact that such beliefs are widespread is no evidence that such agents are real. Of course, there was already good reason to reject such appeals when it comes to beliefs of a religious, supernatural or paranormal character… we know already that {{NOT FINISHED THIS BIT YET: After all, 130 million citizens of one of the richest and best-educated populations on the planet believe the entire universe is six thousand years old. If the H.A.D.D. hypothesis is correct, then it adds a further nail to the coffin of that kind of justification for belief in invisible agents.}}

Theory of Cognitive Dissonance

Another psychological theory that may play some role in explaining why we are drawn to the kind of strategies described in this book is the theory of cognitive dissonance. Cognitive dissonance is the psychological discomfort we feel when we hold beliefs or attitudes that conflict. The theory of cognitive dissonance says that we are motivated to reduce such dissonance by either adjusting our beliefs and attitudes or rationalizing them.

Aesop’s story of The Fox and The Grapes is often used as an illustration. The fox desires those juicy-looking grapes, but then, when he realizes he will never attain them, he adjusts his belief accordingly to make himself feel better – he supposes the grapes are sour.

How might the theory of cognitive dissonance play a role in explaining why we are drawn to using the kind of belief-immunizing strategies described in this book? Here’s an example. Suppose, for the sake of argument, that our evolutionary history has made us innately predisposed both towards a belief in supernatural agents and towards forming beliefs that are, broadly speaking, rational, or at the very least not downright irrational. That might put us in a psychological bind. On the one hand, we may find ourselves unwilling or even unable to give up our belief in certain invisible agents. On the other hand, we may find ourselves confronted by overwhelming evidence that what we believe is pretty silly. Under these circumstances, strategies promising to disarm rational threats to that belief and give it at least the illusion of reasonableness are likely to seem increasingly attractive. Such strategies can provide us with a way of dealing with the intellectual tension and discomfort such innate tendencies might otherwise produce. They allow true believers to reassure themselves that they are not being nearly as irrational as reason might otherwise suggest - to convince themselves and others that their belief in ghosts or spirits or whatever is, even if not well-confirmed, at the very least, not contrary to reason.

So we can speculate about why certain belief systems are attractive, and why such strategies are employed to immunize them against rational criticism and give them a veneer of “reasonableness”. Both the H.A.D.D. hypothesis and the theory of cognitive dissonance may have a role to play. The extent to which they do have a role to play would be the subject of a rather different sort of book.

Two threats to the rationality of theism

Our discussion will include several examples of how our eight mechanisms are used to deal with intellectual challenges to theism – to belief in God. I will focus on two challenges in particular: (i) the evidential problem of evil, and (ii) the problem of non-temporal agency. Because it is easy to underestimate the power of these objections, it’s worth clarifying them at the outset.

The focus on mechanism is a powerful one, but one of the commonest objections I receive when trying to talk people out of their absurd beliefs is that I'm somehow spoiling it all for them. "After all," they say, "what harm does it do someone to believe things that make them excited or happy? If someone shows me how a conjuring trick is done, doesn't that destroy the sense of elation I get from the original trick? Aren't you destroying our fun, spoilsport?!"

I'm not sure whether you're going to address this point, but I think the introduction would be a good place for it because in my experience that's going to be a lot of people's immediate reaction - as a reaction to feeling a fool as much as anything.

Perhaps mention *why* you want to alert people to these signs of an approaching event horizon: what it will mean to their lives to be constantly sucked into an intellectual black hole.

I use the river metaphor quite a lot: if you're constantly washed down a river, and into every stagnant side-pool or sucking whirlpool, your character will remain undeveloped, and you'll be victim to all sorts of scams. Learn to spot the mechanisms, and you can steer yourself in the river, going where you want. Even getting out, if that's not stretching the metaphor too far...

I haven't read the entire chapter, but I second those who call for a correction of the number of virgins (houris). While the number of 72 might be contested by some contemporary Islamic scholars, 46 is simply erroneous.

As mentioned, I think quite a few modern Islamic apologists realise that this concept is quite problematic. They will therefore try to downplay / reinterpret it, etc.

It is therefore difficult to find a unifying scholarly account on the issue. A fairly good condensed approach can be found here: http://www.wikiislam.com/wiki/72_Virgins

It does what an intro should do: tell you what's to come in a way that makes you want to read what is to come.

Minor drafting things:

1. "So the fact that they do exist is no evidence that they are real." Reads very oddly. Is the first "they" different from the second, or are you saying that the fact that some exist supports any old agent you think of?

2. You've duplicated astrology in the first paragraph.

3. I agree with Anonymous that "bullshit" is perhaps not the best word. I think it jars, not because it's "lightweight", but because it sits badly with Frankfurt's use of the term and Frankfurt's analysis of bullshit is so wonderful it'd be a shame not to adopt it!

I do see what you are trying to say with bullshit. That some of the things protected by Intellectual Black Holes are so knuckle-draggingly, droolingly, headbangingly shit that it takes a black hole to preserve them. And I do think you're right, to analyse the pathology of the beliefs you need to establish that these beliefs are pathologically shit. (DM's post suggests a real pathology). Not that I can think of a pithy expletive to use instead of bullshit.

"defending a belief system against the charge that it is an intellectual black hole by maintaining that all belief systems are intellectual black holes, and thus no less reasonable/unreasonable, is itself a warning sign that one is dealing with an intellectual black hole"

If "going nuclear" is sufficient for an intellectual black hole wouldn't it be a supermassive one? One that includes:

1. Any sociology department that pursues the Strong program
2. Anybody to whom both the predicates "intellectual" and "French" apply.
3. Any department with "studies" in it ("media studies", "international studies" etc.)
4. Critical Theory
5. Anyone who uses the word "paradigm" in a non-scientific discipline
6. To be frank, anyone who uses the word "paradigm"

Just to disagree with Cassanders slightly - the article is worded so that it simply claims that *some preachers* offer the reward of virgins. It doesn't say that the 'correct interpretation' is 72 virgins, or that it's an undisputed piece of Islamic theology - only that the claim is made and needs evaluation.

So although the number itself needs correcting, imo the general context seems fine.

I love "intellectual black holes" as an analytic framework for the close examination of bullshit. All I could think of while reading was how much evidence I'd seen of multiple black hole strategies put to use by Hegelians, Heideggerians, and postmodernists of all stripes.

I've had a chance to look through this more closely - sounds promising. I found a couple of typos, plus one 'serious' point:

Small point, but you have points (i), (ii) and (ii), just after you list the 8 arguments.

>The fact that many religious people rely pretty heavily on many – in some cases all – of these eight strategies in order to generate the impression that their particular belief system is, at the very least, not unreasonable, would of course be neatly explained by the fact that their particular religious system of belief is, in fact, pretty unreasonable.

Obviously I haven't read the book, but from the titles it sounds like you might be engaging in Argument 2 - "but it fits!" there! It may simply be that these strategies are more effective than telling the unvarnished truth - and effective memes spread.

I really like what you've written here, Steven. The metaphor of black holes is vivid.

Errata:

"The central aim of this book...a veneer of “reasonableness” and rendering them immune to rational criticism."

This sentence should be two.

"two mechanisms in particular: APRA and playing the mystery card"

Should be APPRA.

"a Hyper-Active Agent Detection Device, or H.A.D.D."

Should be H.A.A.D.. (I like this: "you've been H.A.A.D!").

"I mention the H.A.D.D. hypothesis only to illustrate the point that the eight mechanisms identified in this book for turning a belief system into an intellectual black hole are not intended to rival such psychological and evolutionary explanations for why we believe what we do."

Maybe you can leave out "for turning a belief system into an intellectual black hole" here, it's unnecessary.

"So the fact that they do exist is no evidence that they are real."

Perhaps "So the fact that such beliefs are commonplace is no evidence that they are real".

I'm unclear on what the "Other explanations for why we believe" section is for. The H.A.A.D. hypothesis seems to be doing the preliminary work of showing "why we are drawn to particular belief systems in the first place", but Cognitive Dissonance explains something different, our propensity to use the illicit strategies you'll go on to describe. Perhaps you need two sections here? Or a better intro to the section?

Don't thank me just yet - I just realized that it would be H.A.A.D.D. (Hyper.Active.Agent.Detection.Device) or, as you originally said, H.A.D.D. (Hyper-active.Agent.Detection.Device). But maybe H.A.A.D. OR H.A.D. would be better (Hyper.Active.Agent.Detector. or (Hyper-active.Agent.Detector.). Sorry to confuse things.

Should you not include "isolation" as one of the strategies? Both physical isolation of followers from outside influence and hiding beliefs behind a code of secrecy. These secret "truths" are then used to support other beliefs that cannot be undermined since the supporting "truths" are not exposed to the light of day. Perhaps this is covered in "Playing the mystery card".

It may be worth saying something about the isolation of some of a given person’s beliefs from his other beliefs. (This may be implicit in what you say anyway, in which case, apologies.) It is an important defence of some crazy beliefs that one does not draw too many conclusions. That is, belief must not be closed under even the most obvious implication. For example, what would the world have to be like if some people (Nostradamus) had a special power to predict the future in detail? There would be no bookmakers, because they would all go bust. Since there are bookmakers, people do not have special powers of prediction. If you avoid pursuing the implications of your beliefs, you do not suffer the pain of cognitive dissonance.

This point about not drawing out consequences seems to me to be quite widely applicable in relation to isolated beliefs, for example in soothsayers or in the power of crystals. I am not sure how far it would apply to whole, sophisticated, religious systems. You can pick off specific beliefs in this way (“What would the world have to be like in order for there to be a mechanism of transubstantiation?”), but the believer can draw the alternative conclusion, that the divine world is the real one and the mundane world a mere shadow. Some believers will, however, not feel comfortable with that conclusion, so they will have to fall back on not drawing out consequences.

You may wish to bear in mind that the view of religion as a mental illness could itself be properly classified as a wacky and ridiculous belief system in the absence of convincing evidence to support it.

If you can wheel out some psychiatrists citing peer-reviewed research to assent to that hypothesis you would be on rather stronger ground; without it you are in much the same position as a stage clairvoyant assuring a gullible member of the audience that he really can read his mind...

'These are precisely the sort of moves an intelligent schizophrenic might make to defend their beliefs (in fact, aren't such evasive, self-sealing patterns of thought actually rather typical of certain forms of mental illness?).'

might be considered reasonable if you have peer-reviewed research by psychiatrists to support it, but you don't.

Which brings us back to wacky and ridiculous belief systems; I differ from you.

My wacky and ridiculous belief system suggests that the diagnosis of schizophrenia or unspecified other forms of mental illness is one which should be made by people who possess the scientific skills to do so.

I do indeed agree that one does not need peer-reviewed research to reasonably claim that birds do indeed sing.

One can establish this in a general way by listening to birds sing.

One cannot, however, reasonably assert that "schizophrenics do x", not least because there is profound disagreement as to what schizophrenia actually is, allied with profound disagreement as to whether someone in particular has whatever form of schizophrenia the particular psychiatrist doing the diagnosing regards as constituting schizophrenia.

And all of that is just the lead in to the almost total lack of scholarly consensus as to what actually happens in the brain of someone with schizophrenia.

This contrasts extremely starkly with Stephen's implicit claim that he knows so much about what happens in the brain of someone with schizophrenia that he can assert that this is 'precisely the sort of moves an intelligent schizophrenic might make'.

Whatever else may be said about that statement, precision is not a term which springs immediately to mind...

1. Unusual as this may seem to you, I looked back at your previous observations on this topic so that I could get a fuller picture; if you have changed your mind on this it would be helpful to say so. Otherwise the reasonable inference is that you continue to believe this to be the case.

2. No, you didn't assert that all religion is a form of mental illness; your observations do, however, suggest that you believe yourself to be capable of diagnosing certain individuals as suffering from schizophrenia and allied mental illnesses, on the basis of their religious beliefs.

You bolster your assertion that this is the way that 'intelligent schizophrenics' act by yet another unsupported claim of fact:

You are purporting to write a book as a useful guide for those who may find themselves suckered by snake-oil merchants, yet you assert yourself as an authority in an area in which you have no scientific expertise, and dispense with the rules of evidence based discourse which you urge us to apply.

Which probably explains why so many otherwise rational people believed Greenspan when he assured them that market strategies for risk management couldn't possibly go wrong; he had, in fact, taken the markets' word for it without ever asking them to explain just how this perfection had been achieved...

You're asking way too much before a simple statement can be made: that we should know what makes a schizophrenic and we should know why the schizophrenic does x before saying "schizophrenics do x".

There is no need to fully and accurately define schizophrenia to identify schizophrenics. Deuteronomy 14:18 famously confuses bats with birds; the author has his definition of "birds" wrong but still knew that birds sing.

You need know nothing of the workings of the brain of a schizophrenic to judge or categorise the arguments they make. Statements and arguments that are put forward are external to the mind, and their categorisation ("type x" or "type y") does not depend on access to the internal factors that explain them. "Liverpool FC are great" is a statement that can be categorised as "bloody stupid" without knowledge of why the person said it (I have no idea why anyone would).

So there is no implicit claim in saying "schizophrenics do x" beyond the joint claims that "there are a group of people called "schizophrenics"" and that this group often does "x".

I think Stephen's original comment was false: I think schizophrenics are the wrong group. I have a friend who suffers from depression, and her recollection of her thought processes (together with a detailed explanation of why the pixies were changing the road signs) fits way better with Stephen's descriptions in that post. Wouldn't it be better, though, to criticise what was being said ("you mixed up depressives/mental illness generally with schizophrenia") rather than inventing what was being said?

You are seeking to establish that an intelligent lay-person can reliably diagnose one or more mental illnesses; there is, of course, no evidence in the literature to support this. So you are taking a different route and attempting to support it by calling upon anecdotal evidence.

I have no doubt that you are perfectly sincere in your belief that an intelligent lay-person can reliably and accurately diagnose schizophrenia, or depression and so forth. But that belief is not based on any reliable evidence.

I don't know where you live but here in England medical training involves 5 years at medical school, followed by two years as a ‘foundation programme trainee’. After those seven years:

'Specialist training in psychiatry is 6 years, divided into 3 years of core training (CT1–3) and 3 years of specialty training (ST4–6). Trainees are called ‘core trainees’ and ‘specialty registrars’ respectively. In the first 3 years there are experiences available to build competencies in various aspects of psychiatry, with specialisation starting in the 4th year.'

That specialisation may, or may not, include schizophrenia, or depression, though in the course of medical training the doctor would have learned about the various reasons for, say, hallucinations which do not involve mental illness; people with severe pneumonia can describe things which make pixies changing road signs seem positively sensible by comparison.

Psychiatrists don't make remarks like yours, or indeed Stephen's; as the Royal College of Psychiatrists points out,

'psychiatrists must be able to tolerate ambiguity and uncertainty'

which is a far cry from

'These are precisely the sort of moves an intelligent schizophrenic might make to defend their beliefs (in fact, aren't such evasive, self-sealing patterns of thought actually rather typical of certain forms of mental illness?)'

where Stephen apparently doesn't even recognise that he is in an area where ambiguity and uncertainty are the norm, not the exception.

And if he wishes to encourage people not to fall into intellectual black holes he will also have to accept that people will look at what he has said on this topic in the past, and consider in the light of that further evidence whether he is, in fact, adopting the principles he urges us to follow...

"You are seeking to establish that an intelligent lay-person can reliably diagnose one or more mental illnesses"

No, I am not.

I am claiming that non-specialists can identify those with a mental illness. Take my friend with depression. She (and her husband) tell me that she has been diagnosed (by a professional) with depression; I then identify her as "someone with depression". Notice how a professional, and not me, made the diagnosis. This is rather like birds. Some professional tells me that all and only birds have feathers. So when I see a thing with feathers I think "well, that's a bird". I don't have to have expertise (I have none) but I do, reliably, identify birds. But even if I (or the professional) am wrong about birds and feathers, and that is an "area where ambiguity and uncertainty are the norm", I still identify sufficient numbers of birds (who happen to be feathered) that sing to make "birds sing" a reasonable statement.

"So you are taking a different route and attempting to support it by calling upon anecdotal evidence."

No, I am not.

I do not claim to support my ability to diagnose mental illnesses, as I have no ability to diagnose mental illnesses (otherwise the whole of Anfield on a Saturday would be sectioned). Neither did I use anecdotal evidence in my analysis of the meaning of Stephen's comment. I used anecdotal evidence to dispute Stephen's comment (post analysis). My friend with depression is, fortunately, intelligent and self-aware enough to have given me a good description of her thought processes when unwell. It was fascinating and gave an insight into thinking that has an outward appearance of rationality but is, somehow, very wrong. But she suffered from depression, not schizophrenia. That is a falsifying statement, and anecdotal evidence is pretty good for falsification. Part of the problem with anecdotal evidence is that it gives just the one data point. “I saw a black swan, once” is anecdotal and lends little support to the proposition that all swans are black, but it’s pretty effective evidence against “all swans are white”. This was what my anecdotal evidence was used for.

Which brings me to warning people not “to fall into intellectual black holes”. There is a difference between having insufficient justification for a statement and advancing a statement that contradicts the evidence. The former is, at worst, bullshit. And we already have a detailed elucidation of that: http://tinyurl.com/2eydjor. What, I think (and I hope), Stephen means by a “black hole” is the latter. We all hold beliefs that are wrong. We all hold beliefs that we have insufficient justification for. Some of us, though, hold truly irrational beliefs. Beliefs that are contrary to reason. What characterises these beliefs? What is their pathology? That’s the question. I hope Stephen’s book will go some way to answering it.

For anyone who is interested my friend, at one point, found herself losing her way on a car journey. She was using road signs to navigate. For some reason (that she, herself, was unable to explain) she did not adopt the belief that she had made a miscalculation.

So she had a quasi-foundational belief that she had navigated correctly. So, if she had read the signs correctly but was still lost, then the signs must be wrong. Now, if the signs had been put up wrong, then lots of people would have got lost and it would have got in the papers. It had not got in the papers, so lots of people did not get lost, so the signs were not put up wrong. So the signs must have changed.

But signs do not just "change". If you just leave a correct sign alone then you still have a correct sign. But if pixies had changed the signs then they would be suddenly wrong, with no history of people getting lost and she would be lost despite having navigated correctly. There was no history of people getting lost and she was lost despite having navigated correctly thus: pixies had changed the signs.

It seems rational, it's reasoned and I defy anyone to give a good reason why she should prefer "I have made a mistake in navigating" to "the signs are wrong".

Hi Pascal - so your initial suggestion that I hold the view that religion is a form of mental illness is, we now agree, wrong. Never said it.

As to a view I expressed elsewhere, that certain patterns of thought may be characteristic of certain forms of mental illness - note I did not claim this to be true. It was merely a suggestion, which I made after having actually talked briefly to a psychiatrist about it (and also after having interacted with some people who were institutionalized). But it's no big deal to retract it. I would still maintain however that if you are delusional, for whatever reason, then these mechanisms are likely to appeal.

But this is not a distinguishing factor; hallucinations/delusions also occur in people with depressive illness, even when that illness does not amount to major depression.

There is a very long list of conditions which may give rise to similar symptoms; I've already mentioned pneumonia, but any illness which profoundly raises the temperature will do the same thing, just as anything which reduces oxygen supply to the brain will.

And then there's Vitamin B12 deficiency which produces a wide range of psychiatric symptoms including hallucinations/delusions, as do certain endocrine abnormalities.

There comes a point where possible explanations for symptoms like hallucinations are so diffuse that one cannot reasonably assert that this means this, because this may not mean anything of the sort.

But we can agree to differ on this, and also agree that more reason and less belief would probably be an improvement on the status quo...

It's the references to 'precisely' and 'fact' in your observations which suggested to me that you were going somewhat beyond a mere suggestion.

'I would still maintain however that if you are delusional, for whatever reason, then these mechanisms are likely to appeal.'

I have to say that, whilst I certainly have been delusional on a number of occasions, I have never found any mechanisms appealing, but then severe pneumonia tends to fry the synapses somewhat so I may merely have no recollection of it...
