The Believing Brain: Reviews

Endorsements

Michael Shermer has long been one of our most committed champions of scientific thinking in the face of popular delusion. In The Believing Brain, he has written a wonderfully lucid, accessible, and wide-ranging account of the boundary between justified and unjustified belief. We have all fallen more deeply in his debt.

—Sam Harris, author of the New York Times bestsellers The Moral Landscape, Letter to a Christian Nation, and The End of Faith.

The physicist Richard Feynman once said that the easiest person to fool is yourself, and as a result he argued that as a scientist one has to be especially careful to try and find out not only what is right about one’s theories, but what might also be wrong with them. If we all followed this maxim of skepticism in everyday life, the world would probably be a better place. But we don’t. In this book Michael Shermer lucidly describes why and how we are hard wired to ‘want to believe’. With a narrative that gently flows from the personal to the profound, Shermer shares what he has learned after spending a lifetime pondering the relationship between beliefs and reality, and how to be prepared to tell the difference between the two.

—Lawrence M. Krauss, Foundation Professor and Director of the Origins Project at Arizona State University and author of The Physics of Star Trek, Quantum Man and A Universe from Nothing

Michael Shermer has long been one of the world’s deepest thinkers when it comes to explaining where our beliefs come from, and he brings it all together in this important, engaging, and ambitious book. Shermer knows all the science, he tells great stories, he is funny, and he is fearless, delving into hot-button topics like 9/11 Truthers, life after death, capitalism, Barack Obama, Sarah Palin, and the existence of God. This is an entertaining and thoughtful exploration of the beliefs that shape our lives.

—Paul Bloom, author of How Pleasure Works

A tour de force integrating neuroscience and the social sciences to explain how irrational beliefs are formed and reinforced, while leaving us confident our ideas are valid. This is a must read for everyone who wonders why religious and political beliefs are so rigid and polarized—or why the other side is always wrong, but somehow doesn’t see it.

We might think that we learn how the world works, because we take the time to observe and understand it. Shermer says that’s just not so. We just believe things, and then make our world fit our perceptions. Believe me; you don’t have to take my word for it. Just try clearing some space in your own believing brain.

A fascinating account of the origins of all manner of beliefs, replete with cutting-edge evidence from the best scientific research, packed with nuggets of truth and then, for good measure, studded with real-world examples to deliver to the reader a very personable, engaging, and ultimately convincing set of explanations for why we believe.

—Professor Bruce Hood, Chair of Developmental Psychology, Bristol University and author of Supersense: Why We Believe in the Unbelievable

Reviews

Wall Street Journal (September 10, 2011)

MAYBE WE’RE ALL CONSPIRACY THEORISTS (by Matt Ridley)

Michael Shermer, the founder and editor of Skeptic magazine, has never received so many angry letters as when he wrote a column for Scientific American debunking 9/11 conspiracy theories. Mr. Shermer found himself vilified, often in CAPITAL LETTERS, as a patsy of the sinister Zionist cabal that deliberately destroyed the twin towers and blew a hole in the Pentagon while secretly killing off the passengers of the flights that disappeared, just to make the thing look more plausible.

He tells this story in his fascinating new book, The Believing Brain. In Mr. Shermer’s view, the brain is a belief engine, predisposed to see patterns where none exist and to attribute them to knowing agents rather than to chance—the better to make sense of the world. Then, having formed a belief, each of us tends to seek out evidence that confirms it, thus reinforcing the belief.

This is why, on the foundation of some tiny flaw in the evidence—the supposed lack of roof holes to admit poison-gas cans in one of the Auschwitz-Birkenau gas chambers for Holocaust deniers, the expectant faces on the grassy knoll for JFK plotters, the melting point of steel for 9/11 truthers—we go on to build a great edifice of mistaken conviction.

I say “we” because, after reading Mr. Shermer’s book and others like it, my uneasy conclusion is that we all do this, even when we think we do not. It’s not a peculiarity of the uneducated or the fanatical. We do it in our political allegiances, in our religious faith, even in our championing of scientific theories. And if we all do it, then how do we know that our own rational rejections of conspiracy theories are not themselves infected with beliefs so strong that they are, in effect, conspiracy theories, too?

There was a time, when I was younger, when I was confident that I knew how to tell a barmy belief from a rational deduction. I have lost some of that confidence.

This has been caused partly by the frequent experience of having friends who share my view on one issue but then suddenly reveal a view on another issue that is anathema to me. I don’t believe in ghosts, says a friend, but there is definitely something to homeopathy; or God does not run evolution, but the government should run the economy. Like me, Mr. Shermer is an economic conservative and a social liberal, so he encounters this dissonance a lot.

Mr. Shermer offers a handy guide for those who are confused. Conspiracy theories are usually bunk when they are too complex, require too many people to be involved, ratchet up from small events to grand effects, assign portentous meanings to innocuous events, express strong suspicion of either governments or companies, attribute too much power to individuals or generate no further evidence as time goes by.

Sure. But those are the easy cases. What about the harder ones?

Take climate change. Here is Mr. Shermer’s final diagnostic of a wrong conspiracy theory: “The conspiracy theorist defends the conspiracy theory tenaciously to the point of refusing to consider alternative explanations for the events in question, rejecting all disconfirming evidence for his theory and blatantly seeking only confirmatory evidence to support what he has already determined to be the truth.”

This describes many of those who strive to blame most climate change on man-made carbon dioxide emissions. Of course, they reply that it also describes those who strive to blame most climate change on the sun.

That’s how belief systems work: On both sides, there is huge belief, buttressed by confirmation bias, and equally huge belief that the belief and the conspiracy are all on the other side. Rick Perry, Al Gore—each thinks the other is a mad conspiracy theorist who will not let the facts get in the way of prejudices. Maybe both are right.

Wall Street Journal (July 27, 2011)

A TRICK OF THE MIND Looking for patterns in life and then infusing them with meaning, from alien intervention to federal conspiracy (by Ronald Bailey)

Superstitions arise as the result of the spurious identification of patterns. Even pigeons are superstitious. In an experiment where food is delivered randomly, pigeons will note what they were doing when the pellet arrived, such as twirling to the left and then pecking a button, and perform the maneuver over and over until the next pellet arrives. A pigeon rain dance. The behavior is not much different from that of a baseball player who forgets to shave one morning, hits a home run a few hours later and then makes it a policy never to shave on game days.

Beliefs come first; reasons second. That’s the insightful message of The Believing Brain, by Michael Shermer, the founder of Skeptic magazine. In the book, he brilliantly lays out what modern cognitive research has to tell us about his subject—namely, that our brains are “belief engines” that naturally “look for and find patterns” and then infuse them with meaning. These meaningful patterns form beliefs that shape our understanding of reality. Our brains tend to seek out information that confirms our beliefs, ignoring information that contradicts them. Mr. Shermer calls this “belief-dependent reality.” The well-worn phrase “seeing is believing” has it backward: Our believing dictates what we’re seeing.

Mr. Shermer marshals an impressive array of evidence from game theory, neuroscience and evolutionary psychology. A human ancestor hears a rustle in the grass. Is it the wind or a lion? If he assumes it’s the wind and the rustling turns out to be a lion, then he’s not an ancestor anymore. Since early man had only a split second to make such decisions, Mr. Shermer says, we are descendants of ancestors whose “default position is to assume that all patterns are real; that is, assume that all rustles in the grass are dangerous predators and not the wind.”

In addition, as evolved social creatures, we have brains that are attuned to trying to discern the intentions of others—and we look for patterns there, too, and then try to infuse them with human intention and meaning, or what Mr. Shermer calls “agenticity.” Patterns in life are variously ascribed to the work of ghosts, gods, demons, angels, aliens, intelligent designers and federal conspirators. “Even belief that the government can impose top-down measures to rescue the economy is a form of agenticity,” the author says.

Mr. Shermer also delves into the neuroscience of “the believing brain.” For example, he cites research suggesting that people with high levels of the feel-good neurochemical dopamine “are more likely to find significance in coincidences and pick out meaning and patterns where there are none.” Even for folks with normal chemical levels, there’s a neurological upside to pattern-finding: When we come across information that confirms what we already believe, we get a rewarding jolt of dopamine.

The Believing Brain perhaps inevitably turns to religion, but a sign of Mr. Shermer’s all-purpose skepticism is his consigning of the chapter “Belief in God,” along with “Belief in Aliens,” to a section called “Belief in Things Unseen.” He doesn’t take religious faith seriously except as an object for explanatory debunking—God is simply the human explanation for pattern-making and agency on an epic scale.

“As a back-of-the-envelope calculation within an order-of-magnitude accuracy, we can safely say that over the past ten thousand years of history humans have created about ten thousand different religions and about one thousand gods,” Mr. Shermer writes. He lists more than a dozen gods, from Amon Ra to Zeus, and wonders how one of them can be true and the rest false. “As skeptics like to say, everyone is an atheist about these gods; some of us just go one god further.”

Readers who have enjoyed Mr. Shermer’s earlier books, such as Why People Believe Weird Things, will relish the pages devoted to puncturing many of the conspiratorial beliefs that lurk in our popular culture, from those about UFO cover-ups to the 9/11-was-an-inside-job lunacy. He also recounts, apparently not for the first time, his own supposed alien-abduction experience. In 1983, competing in the Race Across America bicycle challenge, he rode 1,259 miles in 83 hours without sleep and became delirious with exhaustion. When his support crew finally intervened to make him stop and get some rest, he became convinced that they were aliens forcing him into a mother craft—the interior of the UFO, it turned out, looked “remarkably like a GMC motor home.” A good long nap cured him of his delusion.

One of the book’s most enjoyable discussions concerns the politics of belief. Mr. Shermer takes an entertaining look at academic research claiming to prove that conservative beliefs largely result from psychopathologies. He drolly cites survey results showing that 80% of professors in the humanities and social sciences describe themselves as liberals. Could these findings about psychopathological conservative political beliefs possibly be the result of the researchers’ confirmation bias?

As for his own political bias, Mr. Shermer says that he’s “a fiscally conservative civil libertarian.” He is a fan of old-style liberalism, as in liberality of outlook, and cites The Science of Liberty author Timothy Ferris’s splendid formulation: “Liberalism and science are methods, not ideologies.” The “scientific solution to the political problem of oppressive governments,” Mr. Shermer says, “is the tried-and-true method of spreading liberal democracy and market capitalism through the open exchange of information, products, and services across porous economic borders.”

But it is science itself that Mr. Shermer most heartily embraces. The Believing Brain ends with an engaging history of astronomy that illustrates how the scientific method developed as the only reliable way for us to discover true patterns and true agents at work. Seeing through a telescope, it seems, is believing of the best kind.

Ronald Bailey is the science correspondent for Reason magazine.

Nature (June 23, 2011, vol. 474)

HOW WE FORM BELIEFS Religions and superstitions may stem from the brain’s ability to spot patterns and intent, finds A. C. Grayling (by A. C. Grayling)

Two long-standing observations about human cognitive behaviour provide Michael Shermer with the fundamentals of his account of how people form beliefs. One is the brain’s readiness to perceive patterns even in random phenomena. The other is its readiness to nominate agency—intentional action—as the cause of natural events.

Both explain belief-formation in general, not just religious or supernaturalistic belief. Shermer, however, has a particular interest in the latter, and much of his absorbing and comprehensive book addresses the widespread human inclination to believe in gods, ghosts, aliens, conspiracies and the importance of coincidences.

Shermer is well equipped for this task. He is a psychology professor, the founder of Skeptic magazine and resident sceptical columnist for Scientific American. Once an evangelical Christian, he lost his faith largely as a result of his college studies of psychology and cognitive neuroscience.

The important point, Shermer says, is that we form our beliefs first and then look for evidence in support of them afterwards. He gives the names ‘patternicity’ and ‘agenticity’ to the brain’s pattern-seeking and agency-attributing propensities, respectively. These underlie the diverse reasons why we form particular beliefs from subjective, personal and emotional promptings, in social and historical environments that influence their content.

As a ‘belief engine’, the brain is always seeking to find meaning in the information that pours into it. Once it has constructed a belief, it rationalizes it with explanations, almost always after the event. The brain thus becomes invested in the beliefs, and reinforces them by looking for supporting evidence while blinding itself to anything contrary. Shermer describes this process as “belief-dependent realism”—what we believe determines our reality, not the other way around.

He offers an evolution-based analysis of why people are prone to forming supernatural beliefs based on patternicity and agenticity. Our ancestors did well to wonder whether rustling in the grass indicated a predator, even if it was just the breeze. Spotting a significant pattern in the data may have meant an intentional agent was about to pounce.

Problems arise when thinking like this is unconstrained, he says. Passionate investment in beliefs can lead to intolerance and conflict, as history tragically attests. Shermer gives chilling examples of how dangerous belief can be when it is maintained against all evidence; this is especially true in pseudoscience, exemplified by the death of a ten-year-old girl who suffocated during the cruel ‘attachment therapy’ once briefly popular in the United States in the late 1990s.

Shermer’s account implies that we are far from being rational and deliberative thinkers, as the Enlightenment painted us. Patternicity leads us to see significance in mere ‘noise’ as well as in meaningful data; agenticity makes us ascribe purpose to the source of those meanings. How did we ever arrive at more objective and organized knowledge of the world? How do we tell the difference between noise and data?

His answer is science. “Despite the subjectivity of our psychologies, relatively objective knowledge is available,” Shermer writes. This is right, although common sense and experience surely did much to make our ancestors conform to the objective facts long before experimental science came into being; they would not have survived otherwise.

Powerful support for Shermer’s analysis emerges from accounts he gives of highly respected scientists who hold religious beliefs, such as US geneticist Francis Collins. Although religious scientists are few, they are an interesting phenomenon, exhibiting the impermeability of the internal barrier that allows simultaneous commitments to science and faith. This remark will be regarded as outrageous by believing scientists, who think that they are as rational in their temples as in their laboratories, but scarcely any of them would accept the challenge to mount a controlled experiment to test the major claims of their faith, such as asking the deity to regrow a severed limb for an accident victim.

Shermer deals with the idea that theistic belief is an evolved, hard-wired phenomenon, an idea that is fashionable at present. The existence of atheists is partial evidence against it. More so is that the god-believing religions are very young in historical terms; they seem to have developed after and perhaps because of agriculture and associated settled urban life, and are therefore less than 10,000 years old.

The animism that preceded these religions, and which survives today in some traditional societies such as those of New Guinea and the Kalahari Desert, is fully explained by Shermer’s agenticity concept. It is not religion but proto-science—an attempt to explain natural phenomena by analogy with the one causative power our ancestors knew well: their own agency. Instead of developing into science, this doubtless degenerated into superstition in the hands of emerging priestly castes or for other reasons, but it does not suggest a ‘god gene’ of the kind supposed for history’s young religions with their monarchical deities.

This stimulating book summarizes what is likely to prove the right view of how our brains secrete religious and superstitious belief. Knowledge is power: the corrective of the scientific method, one hopes, can rescue us from ourselves in this respect.

A. C. Grayling is professor of philosophy at Birkbeck College, University of London. His latest publication is The Good Book.

The author cites a 2009 poll in which more Americans admitted to a belief in angels and devils than in the theory of evolution. Shermer seeks to answer the question of why “so many people believe in what most scientists would consider to be the unbelievable?” While admitting that scientists often believe in unproven hypotheses—e.g., the origin of our universe and what might have preceded the Big Bang—the author holds firmly to the “built-in self-correcting machinery” that is inherent in the scientific method: e.g., double-blind controlled experiments which are replicable, testing results against the null hypothesis, etc. Shermer takes gleeful potshots at conspiracy theorists, including the 9/11 Truthers, giving a detailed refutation of their claim that planted explosives brought down the Twin Towers, and at the belief in extrasensory perception demonstrated by the apparent abilities of psychics and other mediums, whose feats have been replicated by magicians. Nonetheless, the author fully recognizes the importance of belief in our lives. Jumping to false conclusions is an outgrowth of pattern recognition, an essential function of our brain that evolved to allow birds as well as mammals to anticipate danger and respond to their environment. “An emotional leap of faith beyond reason is often required,” writes the author.

A timely, reasoned reflection on the nature of belief, offering a level-headed corrective to the divisiveness of extreme partisanship.

New Scientist (May 28, 2011)

Reality is Relative: Our quest for an objective view of the world is thwarted by our personal beliefs (by Amanda Gefter)

You are rushing to the airport when a tree falls and blocks the road, causing you to miss your flight. Hours later you learn the plane has crashed and all its passengers are presumed dead.

If you are religious, you may interpret the falling tree as a miracle, evidence that a loving God is watching over you. If you aren’t, you will likely see it as an incredibly fortunate fluke. These two interpretations of the same event exemplify Michael Shermer’s view that our beliefs come first and our explanations—or rationalisations—follow.

He dubs this concept “belief-dependent realism”, though it is far from a new idea: philosophers of science have long argued that our theories, or beliefs, are the lenses through which we see the world, making it difficult for us to access an objective reality.

So where do our beliefs come from? In The Believing Brain Shermer argues that they are derived from “patternicity”, our propensity to see patterns in noise, real or imagined; and “agenticity”, our tendency to attribute a mind and intentions to that pattern. These evolved skills—which saved our ancestors who assumed, say, a rustling in the bushes was a predator intending to eat them—are the same attributes that lead us to believe in ghosts, conspiracies and gods.

In fact, neuroimaging studies have shown that, at the level of the brain, belief in a virgin birth or a UFO is no different than belief that two plus two equals four or that Barack Obama is president of the US. “We can no more eliminate superstitious learning than we can eliminate all learning,” writes Shermer. “People believe weird things because of our evolved need to believe non-weird things.”

Yet belief-dependent reality is not fixed, and the views that frame our individual versions of the world can change. Shermer offers a very personal account of his transition from door-to-door evangelical Christian to publisher of Skeptic magazine. He also acknowledges that the pendulum can swing the other way—as in the case of Francis Collins, former head of the Human Genome Project and current director of the US National Institutes of Health. Collins began as a sceptic, then changed his mind and became “born again”.

The book is oddly organised and a chapter on politics strays from the point, but The Believing Brain should nonetheless be required reading. Shermer’s exploration of cognitive biases alone will make even the most rational readers recognise the flaws in their thinking and more closely evaluate their beliefs. His awareness that he too is subject to such flawed thinking makes him a perpetually trustworthy guide.

As for our quest for objective reality, Shermer argues that science is our greatest hope. By requiring replicable data and peer review, science, he says, is the only process of knowledge-gathering that can go beyond our individual lenses of belief.

Publishers Weekly (March 14, 2011)

As the founding publisher of Skeptic magazine, author of Why People Believe Weird Things, and a columnist for Scientific American, Shermer is perhaps the country’s best-known skeptic. His position is as clear as it is simple: “When I call myself a skeptic I simply mean that I take a scientific approach to the evaluation of claims.” But now Shermer is interested not only in why people have irrational beliefs, but “why people believe at all.” Our brains, he says, have evolved to find meaningful patterns around us. But why do people believe they see patterns—whether “evidence” of angels, conspiracy theories, or UFOs—where none exist? Drawing on evolution, cognitive science, and neuroscience, Shermer considers not only supernatural beliefs but political and economic ones as well. He demonstrates how our brains selectively assess data in an attempt to confirm the conclusions we’ve already reached. Informative and difficult to put down, this book adds a compelling and comprehensive case to the growing number of arguments about the importance of scientific reasoning, marred only by Shermer’s repeated citing of his own works and public appearances.

Science-Based Medicine Blog (May 31, 2011, by Harriet Hall)

A common question of skeptics and science-based thinkers is “How could anyone believe that?” People do believe some really weird things and even some obviously false things. The more basic question is how we form all our beliefs, whether false or true.

Michael Shermer’s book Why People Believe Weird Things has become a classic. Now he has a new book out: The Believing Brain: From Ghosts and Gods to Politics and Conspiracies: How We Construct Beliefs and Reinforce Them as Truths. It synthesizes 30 years of research into the question of how and why we believe what we do in all aspects of our lives.

Some of the content is repetitious for those of us who have read Shermer’s previous books and heard him speak, but the value of the new book is that it incorporates new research and it puts everything together in a handy package with a new focus.

Shermer says,

I’m a skeptic not because I do not want to believe, but because I want to know. How can we tell the difference between what we would like to be true and what is actually true? The answer is science.

He includes a pithy quotation from Richard Feynman that I had not seen before:

If it disagrees with experiment, it is wrong. In that simple statement is the key to science. It doesn’t make any difference how beautiful your guess is, how smart you are, who made the guess, or what his name is. If it disagrees with experiment, it’s wrong. That’s all there is to it.

Our schools tend to teach what science knows rather than how science works. The scientific method is a teachable concept. But

our most deeply held beliefs are immune to attack by direct educational tools, especially for those who are not ready to hear contradictory evidence.

This is a problem. Shermer does not offer a solution.

The brain is a belief engine. It relies on two processes: patternicity and agenticity. It finds meaningful patterns in both meaningful and meaningless data. It infuses patterns with meaning, and imagines intention and agency in inanimate objects and chance occurrences. We believe before we reason. Once beliefs are formed, we seek out confirmatory arguments and evidence to justify them. We ignore contrary evidence or make up rationalizations to explain it away. We do not like to admit we are wrong. We seldom change our minds.

Our thinking is what Morgan Levy has called “intelligently illogical.” If our ancestors assumed that the wind rustling the bushes was a lion and they ran away, that wasn’t a big problem. If there really was a lion and they didn’t run away, they were in trouble. Natural selection favors strategies that make many false causal assumptions in order to not miss the true ones that are essential to survival. Superstition and magical thinking are natural processes of a learning brain. People believe weird things because of our evolved need to believe non-weird things.

Belief comes quickly and naturally, skepticism is slow and unnatural, and most people have a low tolerance for ambiguity.

We rely on a feeling of conviction, but that feeling can be uncoupled from good reasons and good evidence. Science hopes to counteract false beliefs by recoupling through counterarguments with even better reasons and evidence.

As science advances, the things we once thought of as supernatural acquire natural explanations. Thunderstorms are caused by natural processes of electricity in clouds, not by a god throwing thunderbolts.

Belief in God is hardwired into our brains through patternicity and agenticity. We see patterns even when they are not there (the Virgin Mary on a toasted cheese sandwich), and we interpret events as having been deliberately caused by a conscious agent (the AIDS virus was created in a government lab for genocidal purposes). God is the ultimate pattern and agent that explains everything. And religious belief had survival value for human groups, encouraging conformity, group cooperation, and altruism.

Shermer covers a variety of subjects, from alien abductions to cosmology, from economics to politics, from belief in the afterlife to evolution, from ESP to morality, with a lot of entertaining examples. He doesn’t give much space to medical topics but he does mention AIDS denial, the vaccine/autism brouhaha, and alternative medicine, which he calls “a form of pseudoscience.”

Conspiracy theories abound, from Holocaust denial to 9/11 Truthers to the spread of AIDS. This is a result of wide-open pattern-detection filters and of the assumption that there must be a conscious agent behind everything. Shermer provides a handy list of 10 characteristics of a conspiracy theory that indicate that it is likely to be false; for instance, the more people who would have to have been involved in a cover-up, and the longer the alleged cover-up has lasted, the less likely that no one would have spilled the beans by now.

He provides a useful discussion of the various biases we are prone to, from confirmation bias to the status quo bias, and points out that science is the ultimate bias-detection machine. He revisits the “Gorillas in our midst” video to remind us that we don’t see things that we’re not looking for. (In case you don’t know, that was an experiment demonstrating inattentional blindness: a gorilla walks through a group of people playing basketball and we don’t see him because our attention is fixed on counting the number of times the players in white shirts passed the ball.) He quotes Upton Sinclair:

It is difficult to get a man to understand something when his job depends on not understanding it.

When I read that, Dana Ullman came to mind.

I particularly got a kick out of one of Shermer’s examples. Galileo used an early telescope to observe 4 moons around Jupiter. One colleague of Galileo’s refused to even look through the telescope, calling it a parlor trick, saying he didn’t believe anyone else would see what Galileo saw, and saying that looking through glasses would only make him dizzy. Other colleagues who did look were similarly dismissive; one tested the telescope in a series of experiments and said it worked fine for terrestrial viewing, but when pointed at the sky it somehow deceived the viewer. One professor of mathematics accused Galileo of putting the moons of Jupiter inside the tube.

We are beginning to develop a new understanding of how the brain generates beliefs and reinforces them. Mr. Spock is science fiction; humans are often illogical and emotional. We need emotion to motivate us and help us function. An emotional leap of faith beyond reason is often required for us to make decisions or just to get through the day.

This thought-provoking book is a good read and a good reference. Takeaway lessons:

Beliefs come first, reasons follow.

False beliefs arise from the same thought processes that our brains evolved to enable them to learn about the world.

Our faulty thinking mechanisms can’t be eliminated but our errors can be corrected by science.

The Economist (June 16, 2011)

Irrational belief: A medley of aliens and conspiracy theories

Michael Shermer is a psychologist, cyclist, one-time fundamentalist Christian, founder of Skeptic magazine and, currently, the author of a monthly column with the same name published in Scientific American. He has built a professional career out of casting a rationalist’s eye over some of the wackiest beliefs that humanity has to offer.

But his latest book is more than just a display case full of specimens collected by a man fascinated by the paranormal. Mr. Shermer is interested in how such beliefs come to be held, and why they can persist even in the face of what, to others, can seem to be the overwhelming evidence that contradicts them.

The first part of the book is a mixture of psychology and trendy neuroscience research that presents the evidence for Mr. Shermer’s central claim: that, instead of shaping belief around painstakingly gathered, soberly judged evidence, people most often decide upon their beliefs first, and then use an impressive range of cognitive tricks to bend whatever evidence they do discover into support for those pre-decided acts of faith.

In the second part of The Believing Brain Mr. Shermer applies those observations to the almost infinite variety of weird and wonderful beliefs that people hold, from alien abductions to government conspiracies to bring down the World Trade Centre—and, inevitably, to religion (a chapter on politics, by contrast, feels misplaced and forced). He is an able skewerer of sloppy thinking. The section on conspiracy theories, for instance, memorably exposes the bizarre leaps of logic that adherents often make: “If I cannot explain every single minutia [about the collapse of the twin towers]…that lack of knowledge equates to direct proof that 9/11 was orchestrated by Bush, Cheney, Rumsfeld and the CIA.”

A common risk with this kind of book is that the author comes across as overly smug and superior; just look at how the duke of debunkers, Richard Dawkins, is sometimes perceived, even by his fans. Mr. Shermer is aware of this risk, and is at pains to reassure readers that his conclusions apply to everyone, even himself. In a chapter on alien abductions, he recounts an abduction story of his own. Exhausted after cycling 1,259 miles in 83 hours as part of an endurance challenge called the Race Across America, he becomes convinced that the motorhome carrying his support team is actually an alien spacecraft, and that his team’s pleas for him to come inside and get some rest are merely a cunning pretext to get him to co-operate with a spot of alien probing. Surprised when the interior of the mothership turns out to closely resemble a General Motors motorhome, Mr. Shermer consents to lying down. On waking a couple of hours later, he is able to joke about the experience with his team-mates.

That experience gives one useful definition of a sceptic, as Mr. Shermer understands the term: one who is aware of the fallibility of intuitions, and willing to take steps to minimise them. It remains, sadly, an uncommon combination.