10 Logical Errors That Fuel Conspiracy Theories

A quick note on why I wrote this: I have had several discussions with friends and family recently about conspiracy theories. A few months back I listened to a course about critical thinking on Audible, which addressed conspiratorial thinking. In these debates I have continuously wished the other person had listened to the same course, and I have often suggested that they do. But most people aren’t willing to sign up for Audible and buy a course just to understand the point I am trying to make, so I decided to write about some of the most important ideas the course discussed in relation to conspiracies.

1. Anomaly Hunting

Using a group of unrelated anomalies to criticize a story’s legitimacy.

Anomaly hunting consists of looking for small details within a story that cannot be explained. These anomalies are then stitched together, even if they are unrelated, as evidence for an alternative theory to the one provided.

There is one major problem with anomaly hunting: anomalies are actually to be expected. This is due to the law of large numbers: when the number of variables is extremely high, unexpected events and coincidences are actually likely to happen. 9/11 conspiracy theories are particularly likely to exhibit anomalies because of the widespread effect of the attacks in a highly condensed area, both of which make the chances of coincidental occurrences higher.

I like to explain the law of large numbers with a simple question: how often do you think someone in the world wins the lottery for the second time? If this happened, would you consider the lottery rigged, or suspect a greater force in the world made it happen?

The answer is that someone should win the lottery for the second time at least every 7 years in the US alone. Factor in the rest of the world and it should happen even more often. Google the phrase “people that have won the lottery twice” and you’ll get pages of stories about two-time winners.

The point is this: with the number of people involved in massive historical events, there should be things that stand out and seem unbelievable. Some people should have been called away from the World Trade Center at the last minute. Some of the employees there should have stayed home to take care of sick kids. Even moving beyond people, with the number of different events occurring on that day, it is very probable that some of them would happen in a way that is unexpected and seemingly unexplainable.
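This intuition can be checked with a little arithmetic. The numbers below are entirely made up for illustration: suppose one specific “anomaly” (a last-minute cancellation, a sick child at home) has only a 1-in-10,000 chance for any given person, and assume tens of thousands of people were connected to a major event.

```python
# Illustrative only: the odds and head count below are invented, not real data.
ANOMALY_ODDS = 1 / 10_000   # assumed chance of one specific "anomaly" per person
PEOPLE = 50_000             # assumed number of people connected to a major event

# Probability that at least one person experiences the anomaly,
# computed as the complement of nobody experiencing it.
p_at_least_one = 1 - (1 - ANOMALY_ODDS) ** PEOPLE
print(f"P(at least one anomaly) = {p_at_least_one:.3f}")  # ≈ 0.993
```

An event that is wildly improbable for any single person becomes near certain across a large enough group, which is exactly why twice-over lottery winners keep turning up.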

2. Violation of Occam’s Razor

Among competing hypotheses, the one with the fewest assumptions should be selected.

More simply put, Occam’s Razor says that if two different solutions can explain the same circumstance, the simplest solution should be preferred. I’ve found the following example to be the best way of helping people understand this:

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Which is more probable?

A. Linda is a bank teller.

B. Linda is a bank teller and is active in the feminist movement.

The more common answer is B, but the correct answer is A. Here’s why: let’s say that, given what we know about Linda’s past activism, we decide there is a 90/100 chance that she is active in the feminist movement. Then, because we have very little information about her possible career, we decide there is only a 20/100 chance that she is a bank teller. Statement A’s probability is then simply 20/100. Statement B’s probability is 20/100 × 90/100, which is equal to 18/100. No matter what probabilities you assign to her being a bank teller and to her being active in the feminist movement, option A will always have the higher probability.
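To see that this is plain arithmetic and not a trick, here is the same calculation in a few lines of Python, using the hypothetical 20/100 and 90/100 figures from above:

```python
# The Linda problem with the hypothetical numbers from the text.
p_teller = 20 / 100    # assumed chance Linda is a bank teller
p_feminist = 90 / 100  # assumed chance she is active in the feminist movement

p_A = p_teller               # statement A: "bank teller"
p_B = p_teller * p_feminist  # statement B: "bank teller AND feminist"
                             # (treating the two traits as independent)

print(f"P(A) = {p_A:.2f}, P(B) = {p_B:.2f}")  # P(A) = 0.20, P(B) = 0.18
assert p_B <= p_A  # a conjunction can never be more probable than its parts
```

Whatever values you substitute, multiplying by a probability of at most 1 can only shrink the result, so B can never beat A.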

Occam’s Razor works in a similar way. Possibility A is more likely because it assumes less. If a solution is simpler, requires fewer assumptions, and can still explain an event, then it is superior to other solutions with equal explanatory power.

Conspiracy theories violate Occam’s Razor in several ways. The clearest is the difficulty of any conspiracy remaining a secret. As conspiracy theories are challenged, they often add more and more potential conspirators to explain away problems. Each of those participants, though, makes the theory less likely, as each one is another potential failure point through which the “truth” could leak. Any major conspiracy would collapse under its own weight.
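A rough sketch of why adding conspirators hurts: if each participant independently has even a tiny chance of leaking, the probability that the secret survives falls off exponentially with head count. The 1% yearly leak rate below is an invented figure purely for illustration:

```python
# Invented leak rate for illustration; not an empirical estimate.
P_LEAK = 0.01  # assumed chance that any one conspirator leaks in a given year

for n in (10, 100, 1000):
    # The secret holds only if every single conspirator stays silent.
    p_secret_holds = (1 - P_LEAK) ** n
    print(f"{n:>4} conspirators: P(secret holds for a year) = {p_secret_holds:.4f}")
```

Under these assumptions, ten conspirators probably keep the secret for a year, but with a thousand, a leak is a near certainty.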

A good example of this is once again exhibited by 9/11 conspiracies. It has been suggested that the towers may have been rigged for demolition. This suggestion stems from the fact that they didn’t collapse in the way some might expect, or that perhaps they shouldn’t have collapsed in the first place. But consider the complexity of rigging two skyscrapers for demolition. This presents several difficulties:

It would have been a major task, requiring engineers as well as people to actually rig the buildings. All of those people have the potential to leak what “actually” happened.

It would risk one of the thousands of civilians frequenting those towers noticing massive crews rigging them.

Security for both towers either has to be tricked, or complicit in the conspiracy, which adds even more complexity to the explanation.

There is the possibility of explosive material being detected in the ruins. If the people investigating the site are in on the conspiracy too, that increases the complexity again.

I could go on, but you get the idea. To explain many of the anomalies mentioned in section 1, people will often suggest solutions that, though they technically have the power to explain what happened, are even more unlikely and complicated than the original story.

Which is more likely: that a massive conspiracy involving a crew of demolition experts, engineers, security personnel, wreckage cleanup crews, and investigators pulled off a masterful plan to hide two of the world’s biggest demolitions ever, or that the towers just happened to fall in a way that looks similar to how they might have fallen in a demolition?

In conspiracies, an important question to ask at every step is this: “Is the alternate solution proposed vastly more complicated than what is accepted as the normal course of events?” If so, it is violating a critical rule of logic.

3. Closed Belief System

A set of beliefs which includes the belief that criticism of the set is wrong and must be discouraged.

Closed belief systems promote the idea that any criticism of the belief itself is wrong, and conspiracies use this to suggest that any criticism of the conspiracy is in fact part of the conspiracy itself. The lack of evidence for an event becomes itself evidence of someone trying to cover up the event.

Some good examples of this are:

The lack of evidence against the JFK conspiracy being evidence of an incredibly well crafted cleanup.

The millions of satellite images of the earth from space being proof that NASA is determined to trick us into believing that the earth is round.

Anti-vaccination advocates claiming that doctors who promote vaccines are evidence that there is a vaccination conspiracy.

This line of thinking is extremely dangerous because once you fall into it, it is nearly impossible to break out. Once you are inside a closed belief system, all evidence against your belief, and all people trying to convince you otherwise, become part of the conspiracy itself. This ends in the ridiculous result that the more information you are shown against your belief, the stronger that belief becomes. All reasons to reject your belief become reasons to keep it. Closed belief systems are immune to debate, logic, or any possible refutation.

4. False Dichotomy

A situation in which two alternative points of view are presented as the only options, when others are available.

Conspiracy theorists will often present false dichotomies in partnership with anomaly hunting as a way to advance their particular theory of what actually happened, all while ignoring various other possibilities. As stated in section 1 on anomalies, there should be events in the world that don’t make sense. Sometimes these are of no consequence, but sometimes they are.

But even if an anomaly suggests something different than the official story, that doesn’t mean the story presented by the conspiracy theorist has to be true either. That is the false dichotomy.

I debated using this example because of how recently this attack occurred, but I decided to because it is a very good example of this kind of manipulation, and because conspiracies about this event were what finally pushed me to write this. Various initial reports, especially from those at the scene in Las Vegas, mentioned more than one gunman. Now, ignoring that these reports are at best anecdotal, that others from the scene say there was just one gunman, that people could have been confused by gunshots echoing, and that, finally, memory and eyewitness testimony aren’t actually very accurate in the first place (especially under stress), let’s assume for this example that there was considerable evidence of a second gunman.

This possibility has been used to suggest that the attack was perpetrated by ISIS, the NRA, the US Government, the Democratic Party, and various other ridiculous candidates. There have been suggestions that the known shooter wasn’t even involved at all, but was a scapegoat for the real killers. This is an example of using a false dichotomy to support a view. If there was a second gunman, the most likely conclusion from that information is “it looks like there were two gunmen who worked together, one got away, and one killed himself.” But instead, others have used that same fact to suggest much more insidious motives and grand conspiracies for various political purposes, all of which are ridiculous. They suggest these, comparing them only to the official story, which is “obviously also part of the conspiracy,” when in reality far simpler explanations can account for the discrepancy.

5. Confirmation Bias

The tendency to search for, interpret, favor, and recall information in a way that confirms one’s preexisting beliefs or hypotheses.

When you want to prove your point, do you google “evidence that x is true” or “evidence that x is false”? Nearly all people will search for the first. This is a cognitive bias: we only look for information that affirms what we already believe. (Incidentally, the scientific approach is to do the exact opposite: start with your belief and then try your hardest to disprove it.)

As people start to become convinced of a conspiracy theory, they will inadvertently seek out information that confirms what they are already starting to believe. The result of this is pretty obvious. Instead of looking for information to confirm your belief, look for information that disproves it.

6. Shifting the Burden of Proof

When two parties are in a discussion and one makes a claim that the other disputes, the one who makes the claim typically has a burden of proof to justify or substantiate that claim especially when it challenges a perceived status quo.

As a rule, when someone makes a spectacular claim, they have the burden of proving that it is true. This is because proving something not to be true is nearly impossible.

For example, if I said that there was a small cup floating in space somewhere between here and Mars, you would likely ask me for proof. If I in turn said that I couldn’t prove it, but that you have to prove me wrong, I would have given you an impossible task. We don’t have any technology that can accurately detect every small object between here and Mars. But the claim I am making is spectacular, and not at all widely agreed upon. Therefore the burden of proof is on me to provide evidence that what I am saying is true, not on you to disprove my statement.

I see this kind of fallacy used most often by two very common conspiracy groups: anti-GMO and anti-vaccination groups. The common line is “Prove that GMOs/vaccinations are safe.” But that misunderstands who has the burden of proof. The default assumption is that GMOs and vaccinations are safe, because we have no reason to believe that they aren’t, because doctors and health professionals overwhelmingly agree that they are safe, and because no repeatable study has shown that they are not. Is it possible that a specific GMO food will be more dangerous than its natural counterpart? Sure. But there’s no reason to think so without evidence. The naturalness of an item is largely irrelevant in regards to safety (in fact, suggesting that natural implies healthier or safer is a logical fallacy of its own, the Naturalistic Fallacy). If you want to suggest that GMOs or vaccinations are unsafe, the burden is on you to prove it.

7. Moving the Goal Posts

Previously agreed upon standards for deciding an argument are arbitrarily changed once they have been met. The metaphor stems from the idea of one sports team moving the goal posts as a means of stopping the other team from scoring.

Moving the goal posts is an extremely frustrating fallacy to deal with, but I think I have a pretty good way of managing it. Often when a subject is being debated (especially politics), a person who is continuously proved wrong with evidence or arguments that effectively counter theirs will refuse to admit that their argument has failed. This fallacy allows someone to continue holding their view long after it has been disproven, and makes them resistant to any kind of change on the issue. No matter what you show them, they will move the goal posts farther back and deny that you scored.

The way I handle this when discussing complicated issues is to always explicitly ask where the goal posts are: what evidence I would have to show to convince them of an opposing view. Then, before presenting that evidence, I will ask if they would accept the evidence if it came from a specific source. Only once they have explicitly agreed that they would change their mind if I were able to present that piece of evidence will I give it to them. Most people agree to this quite easily, because they already have the opinion that they are right, and therefore doubt that such evidence exists. This method has been quite effective.

8. Fundamental Attribution Error

The tendency to explain others’ behavior based on their decisions and not on their situation, but to explain our own behavior based on our situation and not our decisions.

The fundamental attribution error results in believing that everything a person did in relation to an event was intentional, rather than accidental, or possibly completely outside of that person’s control.

In a conspiracy every event tends to be analyzed as intentional, and thus it all seems to be a part of some plan. This ignores that most of our actions are based mostly on the situation we are in, and that in most events those involved are reacting in the moment. People might suggest that they themselves would have acted differently than those actually involved, but having not actually been in that situation they don’t really know.

9. Criticism of Disagreements Within a Consensus

Criticizing small disagreements between those that make up a consensus.

Many scientific and historic facts are backed by a consensus, or by a near perfect consensus, but people will still doubt those facts. One method of discrediting the consensus is to criticize any disagreements within the group to suggest that the consensus isn’t as uniform as it seems.

Historians uniformly agree that the Holocaust occurred. No serious professional in the field could question that without destroying their reputation. However, historians don’t agree on exactly how many Jews died during the Holocaust. Nearly all agree that there were between 6 and 12 million deaths, but the exact number is debated.

This disagreement over the exact number, however, shouldn’t in any way discredit the consensus itself. There are several variables to take into account when trying to determine the death count, and each historian will have differing opinions on the validity of specific sources. In fact, differing opinions within a consensus are actually a sign of legitimacy. If a consensus agreed on absolutely every little detail, that would be more suspicious than disagreements within the group.

This sort of criticism is also commonly used against climate change scientists, against doctors and FDA officials who support vaccinations, and against many other scientific bodies that are in near perfect agreement on various issues.

10. Naive Assumptions

Assuming without evidence or good reason that things should have happened in a certain way.

After various events, some will try to state how things “should have happened” if the “official story” were true. Sometimes these are valid criticisms, which should be addressed by considering anomaly hunting and whether they can be explained by random chance. But other times the assumption that we can predict what should have happened is simply naive and misguided.

I’m trying to avoid using too many examples from 9/11 conspiracies, but 9/11 conspiracies are almost a perfect case study in the methods that conspiracies use. Some examples of naive criticisms of events from the 9/11 attacks are:

The plane wreckage from the Pentagon should have stayed together more.

Neither tower should have collapsed from the planes’ impacts, or from the subsequent fires.

In each case, the assumption that we could even know the outcome of these events is ridiculous.

We don’t have any information about planes crashing into the side of reinforced concrete buildings, or what the wreckage should look like afterward. To say that the wreckage should have ended up in any specific way is ludicrous, especially because we don’t know so many variables about the event: the plane’s speed, angle of impact, where the plane initially hit, etc. Even with this information, predicting the debris field is insanely complex.

It is often said that the two towers shouldn’t have collapsed because it was the only time that a building of that size has ever collapsed from a fire. However, it was also the only time that a plane of that size hit a building, and the only time a building of that size had massive fires involving jet fuel.

Ultimately we need to recognize another problem: our intuitions are not necessarily accurate, especially in relation to physics. If they were, it wouldn’t have taken until Isaac Newton in the 1600s to figure out how most physical laws work. We don’t really know how complicated events should unfold, and pretending that we do is naive.

Thanks for reading! Next time you are debating or perhaps considering a conspiracy, please keep these ideas in mind.