Why do people believe in a Flat Earth?

On 17th February, Asheley Landrum of Texas Tech University gave a talk at the AAAS conference titled “Believing in a Flat Earth”.

The text for that presentation was as follows …

Who believes in a Flat Earth? It seems like a virtual impossibility that there are individuals in this day and age that do not accept that the Earth is a globe. Yet, for some, accepting that the Earth is flat is a necessary belief to support a broader worldview. Psychology and communication literature has shown over and over again that people are motivated reasoners. We quickly accept information that fits our values and prior beliefs and carefully scrutinize and find reasons to diminish or reject evidence that challenges them. Though many intuit that the real problem is one of knowledge: ‘if we just improved science literacy, then more people would accept what science knows’; this is only part of the equation. When people are motivated to resolve the cognitive dissonance that they face when information conflicts with their own values, they do this not by adjusting their deeply held worldviews, but by rejecting the conflicting information. The so-called ‘Flat Earthers’ are doing just that. In this presentation, I discuss preliminary data my lab collected from the first International Flat Earth conference in Raleigh, N.C., addressing some of the common beliefs underlying the flat earth movement as well as what this can teach us about conspiracy theories and motivated reasoning more generally.

What was the reveal?

Basically, she and her colleagues interviewed attendees at a Flat Earth conference to try to understand how they arrived at the conclusion that Discworld is a documentary. They discovered a common pattern …

Of the 30, all but one said they had not considered the Earth to be flat two years ago but changed their minds after watching videos promoting conspiracy theories on YouTube. “The only person who didn’t say this was there with his daughter and his son-in-law and they had seen it on YouTube and told him about it,” said Asheley Landrum, who led the research at Texas Tech University.

The interviews revealed that most had been watching videos about other conspiracies, with alternative takes on 9/11, the Sandy Hook school shooting and whether Nasa really went to the moon, when YouTube offered up Flat Earth videos for them to watch next.

In other words, these were all victims of the YouTube recommendation algorithm, which strives to keep you hooked.

The Algorithms at play: was it Professor Plum that did it?

Exposure to just one view inside the bubble can persuade.

Prof Plum was caught in the library standing over the body and holding the gun, so clearly he is guilty. Well, I’m convinced; case closed. Oh, but wait, there is more to this. It turns out that the Rev Green was very depressed and had left a note explaining why he had shot himself. Poor old Prof Plum had simply heard the shot, was first on the scene, and picked up the gun; he is totally innocent.

So the point is this – if you step into a bubble and expose yourself to just one side of an argument, then you may well be convinced; hence it is important to understand the full conversation and not just one side.

YouTube has a problem with algorithms. If you watch a specific clip, the algorithms will offer up similar related videos for you to watch next. Pick the right set and you will soon find yourself deep inside a bubble for some wacky idea, where the only offerings you get promote the utterly absurd as “truth”. All you see is clip after clip reinforcing the same idea. Once inside the bubble you can become emotionally invested in the idea, because what you are being told again and again is presented as compelling evidence.
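To make the bubble effect concrete, here is a minimal sketch of how a naive “recommend the most similar unwatched video” loop narrows what a viewer sees. This is not YouTube’s actual system; the video catalogue, topic labels, and similarity scoring below are entirely invented for illustration.

```python
# Hypothetical catalogue: each video gets a topic and a made-up
# "fringe" score (0 = mainstream, 1 = deep conspiracy content).
videos = {
    "moon landing doc": {"topic": "space",      "fringe": 0.1},
    "9/11 questions":   {"topic": "conspiracy", "fringe": 0.7},
    "flat earth proof": {"topic": "conspiracy", "fringe": 0.9},
    "nature film":      {"topic": "wildlife",   "fringe": 0.0},
}

def similarity(a, b):
    """Crude similarity: same topic scores high, plus closeness in 'fringe'."""
    same_topic = 1.0 if a["topic"] == b["topic"] else 0.0
    return same_topic + (1.0 - abs(a["fringe"] - b["fringe"]))

def next_video(current, watched):
    """Recommend the unwatched video most similar to the current one."""
    candidates = [v for v in videos if v not in watched]
    return max(candidates, key=lambda v: similarity(videos[current], videos[v]))

# One conspiracy-adjacent click, then just follow the recommendations.
history = ["9/11 questions"]
while len(history) < len(videos):
    history.append(next_video(history[-1], set(history)))

print(history)
```

The point of the toy model: because the recommender always maximizes similarity to what was just watched, one conspiracy-adjacent click immediately pulls the even fringier video to the top of the queue, and mainstream content only surfaces once the bubble is exhausted.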

YouTube Warnings

… During an interview at South by Southwest in Austin, Tex., YouTube’s chief executive, Susan Wojcicki, announced that the company she leads would enlist Wikipedia’s help to deal with the proliferation of conspiracy theories and misinformation on its platform….

we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.

“Begin Reducing”?

To be brutally blunt, “begin reducing” is not going to cut it. They are a $100 billion company and they are not going to damage their revenue stream. They will do enough to create the illusion of “taking action”, but it will come nowhere close to actually addressing the problem. The proliferation of BS will not just continue; it will thrive and grow.

Traditional Standards

We live in an age where concepts such as “Legal, Decent, Honest, and True” simply do not apply. Instead we are awash with a torrent of misinformation. There is no editor in the middle to filter anything. If it tickles the imagination then it will be out there. “Secrets” and “conspiracies” will thrive and flourish because such click-bait generates revenue for both the content creators and the content platforms. Unless compelled by law, “action” by content providers will not stem this tide.

Content publishers do have some responsibility here, yet I seriously doubt that they have either the motivation or the willingness to tackle the challenge. We now live in a world where traditional sources are rejected, and instead a YouTube worldview prevails for many.

Red Flags

If you care about what is actually true, then this does of course create a new “red flag”. If you are discussing anything with anybody and they suggest that you check some “compelling” evidence on YouTube, then you are more or less done; the conversation is over. That is akin to somebody citing Fox News or Sarah Huckabee Sanders as a “reliable” source.

Some Content Providers do get it right

Wikipedia is an example of a content provider getting it right. Jimmy Wales will be remembered not just because he made a very conscious decision not to personally enrich himself, but because he has created a resource that has grown into a reliable, credible source of information.

You might doubt the sincerity of those who promote such daft ideas. Don’t; they truly do believe the utterly absurd. Welcome to the age of algorithms that create nurseries for these misinformation bubbles.

We are going to have to get better at teaching people how to think critically.