Most people fully accept paranormal and pseudoscientific claims without critique as they are promoted by the mass media. Here Be Dragons offers a toolbox for recognizing and understanding the dangers of pseudoscience, and an appreciation of the reality-based benefits offered by real science.

In this video — the fifth in our series of videos that promote science and critical thinking through the use of humor, wit, and satire — we present a Con Academy mini course in the techniques of New Age Spiritual Gurutry.

Do you know someone who has had a mind-altering experience like the examples that we list in this FREE PDF booklet? If so, you know how compelling they can be. A life can be changed or an entire religion founded on the basis of a single brain-generated hallucination. These phenomena are so powerful that throughout history seekers of knowledge have sought to induce them. They are one of the foundations of widespread belief in the paranormal. But as skeptics are well aware, accepting them as reality can be more than a waste of time and energy. It can be dangerous for both the individual and the larger society.

While science has made considerable progress in discovering how the brain is hard-wired to produce these illusions, the public is largely unaware of much of this research. This is where your Skeptics Society comes in—we provide the scientific explanation.

How you can help a friend or loved one with a potentially harmful pseudoscientific belief

It’s the #1 most common question I get: My wife, my friend, my mom, my boss, is investing their health or their money in some magical or fraudulent product/scheme/belief. What can I do about it?

This is a tough situation to be in. Whether it’s a loved one who’s ill and is being taken advantage of by a charlatan selling a magical cure with no hope of treating the illness, or a friend who’s out of work and is going into deeper debt to buy into a hopeless multilevel marketing plan, it’s really hard to watch. The hardest is when they have a real problem and are expending their limited resources trying to solve it with a medieval, magic-based system that you know can’t possibly help. But all too often, they think it’s helping. Cognitive biases, anecdotal thinking, placebo effects and cognitive dissonance combine to build a powerful illusion that our brains are hardwired to believe in. At some point, it falls to a caring friend to try and rescue them with a candle of reason.

You’re up against a foe who’s far more formidable than you might think. This isn’t like settling a bet with a friend where you can look up the answer on Wikipedia, see who’s right, then buy each other a beer. You’re going after someone’s religion. You’re setting out to talk someone out of believing something that they know to be true, for a fact, from their personal experience. That right there makes your task nearly impossible, but it’s worse. Their belief has spiritual underpinnings that make it deeply moral and virtuous. Imagine if someone came to you and flashed a magazine article that said it’s best to turn your children out into the street and never talk to them again. It’s not only unconvincing, it’s laughable. Your effort to talk someone out of their belief in their sacred cow is likely to be just as laughable.

So what should you do, give up? You may be surprised to hear it from me, but I advise you to do just that, in many cases. Know which battles to fight. Weigh the risks. Consider the context of your friend’s belief: Is he in imminent danger of harming himself or others? Probably not; and if not, this may not be the time to take what might be your only shot. So I want to make this a rule: Before you decide what to do, consider the risks and the context. How terrible are the consequences of your friend’s belief? Think that through comprehensively. Make sure you have a good understanding of the risks to your friend if you do nothing, and the risks to your relationship if you attack their beliefs and (in all probability) fail to convince them. It may well be that this first strategy I’m going to present is the safest.

Strategy #1: Do Nothing

Doing nothing now doesn’t mean giving up. When you choose not to confront your friend’s current weird belief, there’s still an effective strategy for helping him out that you can follow. By accepting and tolerating your friend’s weird belief, you’re actually setting yourself up to be in a position of great influence the next time something weird comes down the line. Your friend likely knows that you’re a skeptical person, and eventually he’ll recognize that you’ve been putting up with his weird belief and saying nothing. In fact he may someday ask you, “Hey, you know I believe in this weird thing, how come Mr. Cynical Skeptic has never tried to talk me out of it?”

Ask “Is it important to you?”

“Yes.”

“You’re important to me.”

Think what a powerful message that sends. It may sound corny, but it’s a statement that your friend will always remember. You’ve just communicated that your friendship is more important than your “evil debunking hobby”. You’ve made it clear, unequivocally, that you don’t want such differences to come between you.

And now look at the position you’re in. You’re trusted. You’re an ally at the most important and fundamental level. This is exactly where you need to be if you want to influence someone. You can now begin to introduce critical thinking using topics that are more about exploration than confrontation, and this is a journey you should take together. Next time you’re in the car together, play a few Skeptoid episodes. Play episodes like The Baigong Pipes, Is He Real or Is He Fictional, The Missing Cosmonauts, and When People Talk Backwards. Topics such as these do not attack or challenge anyone; they instill an appreciation and a passion for the value of critical thinking. Once introduced, I find that most people want more.

Gather every bit of skeptical material you can find that you know will interest your friend, and that does not attack or challenge his belief. So long as you remain a trustworthy friend and not an irrational adversary, you’re in a position to introduce him to the fundamentals of critical thinking, and to the value and tangible rewards of reality. Don’t underestimate the value of seeds that are well planted in a good environment. If your friend comes around on his own, his growth is far more complete than any that’s forced upon him.

Always remember the story of the little boy who couldn’t get his pet turtle to come out of his shell. He tried to pull on its head, he shook it, he squirted water, he did everything he could think of. But the turtle wouldn’t come out. Then his grandfather took the turtle and placed it on the warm hearth, and within a minute the turtle was out of his shell. The little boy never forgot that lesson.

Strategy #2: The Intervention

Sometimes the situation is urgent and you don’t have time to do things the easy way. There might be a medical crisis, an emotional crisis, or a financial crisis, and an immediate intervention is needed. Sometimes a friend’s situation is dire enough that helping him is worth the loss of the personal relationship. In these cases, and probably only in these cases, would I suggest a confrontational approach. And to do this effectively, draw on the established principles of the counseling intervention.

First you want to gather a group of friends or family, and you need to meet with them separately. Try to get a group, but even if there are only two of you, it’s worlds better than just you by yourself. Your next task is to present your evidence to the group that the magical system your friend is relying on is pseudoscientific and cannot help him. Do not expect them to accept what you say at face value, and do expect that some of them might buy into the magical system as well. Be prepared. Show your work. Print out pages from the web. Use the Science Based Medicine blog, use Skeptoid, use Quackwatch, use Swift. Search the best sources and have all your ducks in a row. The most important thing you need to do at this stage is to be certain that everyone in the group is united in their understanding of the useless, pseudoscientific nature of the magical sacred cow.

Susan Blackmore is a psychologist and writer researching consciousness, memes, and anomalous experiences, and a Visiting Professor at the University of Plymouth. She is the author of a number of books, including The Meme Machine and Zen and the Art of Consciousness.

Is there a group of conspiracy theorists more deluded and irritating and offensively irrational than the 9/11 Truthers? I think so. Their name? Well . . . I don’t know all their names, there’s too many of them — collectively, let’s call them the Moon Landing Hoaxers. And they believe some pretty stupid things.

This is volume 1 of The Con Academy videos—another resource in the Skeptics Society‘s arsenal of Skepticism 101 for teaching critical thinking and promoting science through the use of humor, wit, and satire. In this faux commercial for The Con Academy you’ll see how psychics count on the confirmation bias to convince people that their powers are real when, in fact, they are just remembering the hits and forgetting the misses. We also demonstrate how psychic “organizations” con people by taking their money for services that are not real.

“There’s no way he could have known my grandmother’s name!” “How do you explain his predicting the lights would go off at the shop?” “How did he know my uncle’s name?” “There’s no way he could have known my father died of a heart attack.” “How could he possibly know that my brother collects cuckoo clocks?”

John Edward has been described as a fraud by James Randi [Skeptic, v. 8, no. 3] and Leon Jaroff [Time, March 5, 2001].

These and millions more like them represent the kinds of statements we get from people who say they’re skeptical, but who’ve been to a psychic and have come away as believers in the paranormal. Many times I’ve been asked to try to explain the “paranormal” experiences of people who tell me they’re skeptics, but who can’t think of any other explanation for something than that it was paranormal. I call it the “Explain That!” game. I’ve posted responses to some of these requests, but I can’t say I’ve been able to persuade any of the believers to consider alternative explanations, even though they ask me to provide them with one. [Some of my explanations for various psychic readings are here, here, here, and here.]

George Anderson, a former switchboard operator, now claims he talks to the dead via his psychic switchboard.

How do psychics know so much about me? I’ve heard or read many times variants of that question asked by people who are intelligent and educated, but naive. For example, a local sports writer visited a psychic to get a story about her predictions for the local high school athletic teams. He ended up writing two stories. I didn’t read the second one, but the first revealed how amazed he was at how much she knew about him and how accurate she was. It made him think, he wrote, that maybe there’s something to this psychic business. There is, but it’s not what he thinks. In my letter to the editor of the local paper where the sports writer plies his trade I said:

Bruce Gallaudet is an experienced journalist, but he seems to know nothing about cold reading and subjective validation, the two tarot cards up the sleeve of a working psychic. He’s dazzled within 60 seconds and befuddled when she tells the old man that she’s sorry he had to cancel a trip. Did she ask about your knee injury? Or about the outdated calendar you keep at home, along with the box of newspaper clippings? Did she mention your business venture setback (but you’ll do well in new endeavors) or the health problems a loved one is having?

Stick to local sports, Bruce. You were in way over your head with Ms. Mertino, the Davis Psychic.

James Van Praagh plays a kind of twenty-questions game with his audience.

The fact is, psychics may know certain things about you in the same way that many people know many things about others by knowing their age, sex, occupation, education, where they live, how they dress, what kind of jewelry they’re wearing, or their religion. Does anyone have perfect knowledge of others based on what are sometimes called warm reading techniques? Of course not. We’re dealing with probabilities, not absolute certainties here, but it doesn’t matter. The psychic is not obligated to stop the reading when she makes a mistake. If she misinterprets your wearing black as a sign of grieving for someone who has died, she doesn’t have to say “oops, wrong again.” No, she just slithers on to the next question or statement, ignoring her “miss” and counting on you to ignore it as well. Eventually, she’ll hit something that resonates with you, that you can validate. The key to a psychic reading is not the psychic’s ability to tap into a world you are not directly privy to. The key to a psychic reading is your willingness to find meaning or significance in some of the statements she makes or questions she asks. If mentioning the death of a loved one evokes no response from you, the psychic will move on to another statement, another question.
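The hit-counting dynamic described above can be sketched with a toy simulation. The 15% per-statement hit rate and the 40-statement reading are assumed figures chosen for illustration, not numbers from the text:

```python
# A toy model of remembering the hits and forgetting the misses.
# Assumption: each statement in a reading has a 15% chance of
# resonating with the sitter, independently of the others.
import random

def simulated_reading(n_statements=40, p_hit=0.15, rng=random):
    """Return the number of statements the sitter validates as hits."""
    return sum(rng.random() < p_hit for _ in range(n_statements))

random.seed(1)
hits = simulated_reading()
print(f"hits remembered: {hits}; misses forgotten: {40 - hits}")
```

On average the sitter validates about 40 × 0.15 = 6 statements. A sitter who recalls only those six, and lets the other thirty-odd misses fade, remembers a flawless performance.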

“Psychic” Sally is seen removing a microphone from her right ear, and what appears to be an earpiece from her left ear.

It is also possible that the psychic you are dealing with is a very sleazy professional fraud who investigates her clients before she does the reading. Doing a hot reading, however, is not likely if you are a drop-in. Although, even drop-ins can be conned by distracting the client and looking through her purse or wallet. Some psychics who work fairs, for example, have a colleague who walks by those in line trying to pick up information about various clients who are in conversations. The colleague passes on the info to the “psychic” via a wireless device. Most people who visit psychics on a whim are probably not going to be a victim of someone using hot reading, however. Why? Because it’s really unnecessary. Cold reading works just as well. (For a special case of using hot readings by sharing information in order to con wealthy clients who go from psychic to psychic, see Lamar M. Keene. The Psychic Mafia. Prometheus, 1997).

There are a lot of stereotypes that conspiracy theorists believe about skeptics, and for the most part they’re just not true. Most of the time these beliefs are either the result of manipulation, or just misunderstandings.

Here are some of the most common claims that conspiracy theorists make against skeptics, and why these claims are not true:

• All skeptics work for the government.

One of the most common claims by conspiracy theorists about skeptics is that skeptics work for, or at least are paid by, the government (or, to a lesser extent, private companies) to run debunking websites; conspiracy theorists usually refer to such skeptics as “dis-information agents”. Skeptics usually answer these accusations with a joke, something like, “I’m still waiting for my check.”

The reality is that most skeptics don’t work for the government, and most likely never would. Those who do work for the government are not being paid to run skeptic websites; they do what they do of their own free will.

• Skeptics believe whatever the government or media says.

No they don’t. In fact skeptics are highly critical of both the government and the media.

Skeptics know that the government lies to the public all the time to make itself look better, and that the media tends to report things far too early, or sensationalizes them, so bad information reaches the public rather than correct information.

• Skeptics don’t believe in conspiracies.

Skeptics actually do believe in conspiracies. The difference between skeptics and conspiracy theorists is that the conspiracies skeptics believe in either have been proven true, or have enough evidence (real evidence, not made-up evidence) to show that they are true, or at least likely to be true.

• All skeptics are alike.

One of the biggest misconceptions about skeptics in general is that we are all alike: that we have similar beliefs and education, and that we all see things exactly the same way. In reality, this is not true at all.

We all debunk things differently, and we sometimes come to different conclusions on things, and there are fights within the skeptics community.


Pseudoscience is what one might call a two-dollar word. Skeptics often throw it around because of its weightiness and the values it transmits. We need to talk about this word, where it came from, and why we should be cautious about using it.
Pseudoscience is a pejorative term that is bestowed upon a set of ideas, not used by choice by the holder of those ideas. It’s “false” science, fake science, an imitation missing a vital part, the knockoff, the wannabe, the cheap imitation… OK, you get what I mean.

Contrary to what we might assume, there are no set criteria for identifying pseudoscience. It’s not a simple thing but a sticky wicket. It’s whatever scientists say doesn’t belong to legitimate science – a problematic definition. We often must use examples to explain what we mean. Wikipedia has a list of topics that have been characterized as pseudoscience by someone, sometime. Common examples include: astrology, cryptozoology, paranormal investigation, ufology, parapsychology, psychoanalysis, alternative medicine, homeopathy, and creationism.

Because science has authority in our society, it is worthy of imitation. Pseudoscience is science’s shadow, which makes it hard to separate from the real thing. It can’t exist without science. This science imitation was evident to me upon researching paranormal investigation and cryptozoology. I concluded it was useful to have a set of reasonable guidelines one could use to determine if the methodology and the resulting body of knowledge was scientific or lacking in a critical way. The more of these characteristics one can attribute to the field in question, the more likely it is fairly categorized as “pseudoscience.” But since those are fuzzy, subjective criteria, some theories we now regard as legitimate might have qualified as “pseudoscience” at one time, such as Einstein’s special theory of relativity, Mendel’s heredity, meteorites, and Wegener’s continental drift.

There is no continuum between science and pseudoscience. Nor is there a clear boundary or litmus test for what qualifies as science and what lies outside the lines. This is called the “demarcation problem”—a term that Austrian philosopher Karl Popper coined in the late 1920s to describe the issues of marking a solid line between what is scientific and what is nonscientific. It turns out Popper’s solution to the problem was not so hot. He thought it lay in the criterion of testability/falsifiability. But that fails since several theories are not practically falsifiable.

Any series of characteristics commonly attributable to pseudoscience, such as my pet list of criteria, also fails for various reasons. After all this searching for a demarcation criterion without success, philosopher Larry Laudan remarked that we might fairly conclude that “the object of the quest is nonexistent.”

Pseudoscience is just what pseudoscientists do, say the scientists. That is not very helpful for someone who wants to make heads or tails out of a controversial area of research.

Many conspiracy theorists seem very keen on the idea of hidden messages or codes secretly embedded within ancient writings. Believers claim that hidden prophecies of significant world events and disasters can be uncovered and deciphered by analysing the Bible. But by simply selecting a random passage, stripping out the punctuation, and arranging the letters in a matrix, a suitably motivated skeptic, with the benefit of hindsight, can easily uncover whatever it is they fancy. Believers see predictions of the assassination of President Kennedy and the 9/11 twin towers terrorist attack uncovered in the Bible as irrefutable evidence of divine revelation, even though rational thinkers can locate predictions of the deaths of Leon Trotsky and Princess Diana secreted within “Moby Dick”.
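The letter-matrix trick is easy to reproduce in code. Below is a minimal sketch of an equidistant-letter-sequence (ELS) search, the same mechanism “Bible code” claims rely on; the sample passage (the opening of Moby Dick) and the target word are arbitrary choices for illustration:

```python
# Search a text for a word spelled out by letters a fixed distance
# apart. With enough skip distances to try, short words turn up in
# almost any passage, no prophecy required.
import string

def find_els(text, word, max_skip=50):
    """Return (start, skip) pairs where `word` appears as an ELS."""
    letters = [c for c in text.lower() if c in string.ascii_lowercase]
    hits = []
    for skip in range(1, max_skip + 1):
        for start in range(len(letters)):
            if "".join(letters[start::skip][:len(word)]) == word:
                hits.append((start, skip))
    return hits

passage = ("Call me Ishmael. Some years ago, never mind how long "
           "precisely, having little or no money in my purse, I thought "
           "I would sail about a little and see the watery part of the world.")
print(find_els(passage, "sea"))
```

Any short word is likely to appear somewhere once you are free to choose the skip distance; the “discovery” only looks prophetic because the target is picked after the fact.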

It is … not enough to have a generally skeptical outlook, or even to call oneself a skeptic. Skepticism is a journey of self-knowledge, exploration, and mastering the various skills that comprise so-called metacognition – the ability to think about thinking.

[…]

As an example of the need for metacognitive skills in navigating this complex world, there is confirmation bias. This is definitely on my top 5 list of core skeptical concepts, and is a major contributor to faulty thinking. Confirmation bias is the tendency to perceive and accept information that seems to confirm our existing beliefs, while ignoring, forgetting, or explaining away information that contradicts them. It is a systematic bias that works relentlessly and often subtly to push us in the direction of a desired or preexisting conclusion. Worse – it gives us a false sense of confidence in that conclusion. We think we are following the evidence, when in fact we are leading the evidence.

Part of the illusion of evidence created by confirmation bias is the fact that there is so much information out there in the world. We encounter numerous events, people, and bits of data every day. Our brains are great at sifting this data for meaningful patterns, and when we see the pattern we think, “What are the odds? That cannot be a coincidence, so it confirms my belief.” In reality, given the number of opportunities, it was almost certain that you would encounter something that could confirm your belief.
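That last point is easy to check with a little arithmetic. Suppose, purely as an assumed figure, that a particular striking coincidence has a 1-in-1,000 chance of happening to you on any given day; the chance of at least one hit in n days is 1 − (1 − p)^n:

```python
# How likely is a "one in a thousand" coincidence over many chances?
# p is an assumed per-day probability, used only for illustration.
p = 1 / 1000
for days in (365, 3650):
    prob = 1 - (1 - p) ** days
    print(f"{days} days: {prob:.1%}")
```

A one-in-a-thousand fluke becomes roughly a 30% bet within a year and a near-certainty within a decade, which is part of why almost everyone has a story.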

Another factor that plays into confirmation bias is using open-ended criteria, or ad-hoc or post-hoc analysis. This means that we decide after we encounter a bit of information that this information confirms our belief. We retrofit the new data into our belief as confirmation.

Confirmation bias is further supported by a network of cognitive flaws – logical fallacies, heuristics, and other cognitive biases – that conspire together to reinforce our existing beliefs. In the end you have people who, based on the same underlying reality, arrive at confidently and firmly held conclusions that are directly opposing and mutually exclusive.

I encounter examples of confirmation bias every day. (My now favorite quote about this is from Jon Ronson, who said, “After I learned about confirmation bias I started seeing it everywhere.”) Of course, at first it is easy to see confirmation bias in others, and only later do we learn to detect it in ourselves, which forever remains challenging.

Magical thinking is a belief in the interconnectedness of all things through forces and powers that transcend physical connections. Magical thinking invests special powers and forces in things and sees them as symbols on various levels. According to anthropologist Dr. Phillips Stevens Jr., “the vast majority of the world’s peoples … believe that there are real connections between the symbol and its referent, and that some real and potentially measurable power flows between them.” He believes there is a neurobiological basis for this, though the specific content of any symbol is culturally determined. (“Magical Thinking in Complementary and Alternative Medicine,” Skeptical Inquirer, 2001, November/December.)

One of the driving principles of magical thinking is the notion that things that resemble each other are causally connected in some way that defies scientific testing (the so-called law of similarity, that like produces like, that effect resembles cause). Another driving principle of magical thinking is the belief that “things that have been either in physical contact or in spatial or temporal association with other things retain a connection after they are separated” (the so-called law of contagion) (James George Frazer, The Golden Bough: A Study in Magic and Religion; Stevens). Think of relics of saints that are supposed to transfer spiritual energy. Think of psychic detectives claiming that they can get information about a missing person by touching an object that belongs to the person (psychometry). Or think of the pet psychic who claims she can read your dog’s mind by looking at a photo of the dog. Or think of Rupert Sheldrake’s morphic resonance, the idea that there are mysterious telepathy-type interconnections between organisms and collective memories within species. (Coincidentally, Sheldrake also studies psychic dogs.)

Magical thinking is “a fundamental dimension of a child’s thinking.” –Zusne and Jones

According to anthropologist Dr. Phillips Stevens Jr., magical thinking involves several elements, including a belief in the interconnectedness of all things through forces and powers that transcend both physical and spiritual connections. Magical thinking invests special powers and forces in many things that are seen as symbols. According to Stevens, “the vast majority of the world’s peoples … believe that there are real connections between the symbol and its referent, and that some real and potentially measurable power flows between them.” He believes there is a neurobiological basis for this, though the specific content of any symbol is culturally determined. (Not that some symbols aren’t universal, e.g., the egg, fire, water; but the egg, fire, and water do not symbolize the same things in all cultures.)

One of the driving principles of magical thinking is the notion that things that resemble each other are causally connected in some way that defies scientific testing (the law of similarity). Another driving principle is the belief that “things that have been either in physical contact or in spatial or temporal association with other things retain a connection after they are separated” (the law of contagion) (Frazer; Stevens).

On a recent episode of Fringe (a rather mediocre, in my opinion, series on the supernatural), the investigators come across apparent cases of spontaneous human combustion (SHC). Popular Mechanics decided to write an article about the science behind SHC (it turns out there isn’t any) and luckily contacted me to give them the skinny. You can read the article here.

SHC is a fun pseudoscience in that there is nothing concrete at stake – no health claims, no products, no concerns about squandering limited research funds. It’s purely a scientific question, one highly amenable to skeptical analysis.

In order to understand SHC imagine the following scene: An elderly woman who lives alone is found dead in her apartment. She is the victim of fire; her body is mostly reduced to ash, and only the ends of her arms and legs remain. The ashen outline of her head lies upon the hearth of her fireplace, the iron grill of which has been knocked to the side. There are signs that a fire recently was burning in the fireplace. A brown greasy substance coats the walls and ceiling near the body, but otherwise the room is unharmed.

Now set aside all common sense and reason, and you’ll have a typical case of spontaneous human combustion.

A podcast called “Life, The Universe and Everything Else,” a program put on by the Winnipeg Skeptics association, has turned its sights on Thrive. I spent the morning listening to the podcast, and I recommend it very highly. You can play it from your computer here. The host of the show is Gem Newman (founder of Winnipeg Skeptics, computer science expert), and the guests include Gary Barbon, Mark Forkheim, Robert Shindler, Richelle McCullough and Greg Christiansen. You can see information on who these people are, and what their backgrounds are, here.

The Winnipeg Skeptics are a group of skeptics and critical thinkers who apply fact, logic and critical thinking to wild claims made on the Internet. Just as this blog has done since the beginning, the Skeptics have exhaustively examined Thrive and their review is, needless to say, highly negative. While they find some things to praise in…