Satire is under attack, but are the fears justified?

Facebook recently announced that it will display warnings beside satirical content. In this post we look at the flaws and implications of recent research on the spread of false information on Facebook.

In an ideal world, we would all get our science news from well-informed, impartial discussion of replicated, double-blind, randomised, controlled trials published in peer-reviewed journals. You don't need me to tell you that we don't live in an ideal world. Facebook made headlines recently by testing a warning label that would inform users that an article may contain content that is meant to be satirical. The move has been broadly criticised for spoiling everybody's fun. According to a series of papers published this year, however, a staggering proportion of people are really quite terrible at telling the difference between genuine news, misinformation and outright trolling. The researchers are concerned that what begins as a harmless hoax can form the foundations for full-blown conspiracy theories. This isn't the first time we've looked at research suggesting that satire regularly flies over the heads of its intended audience: five years ago a paper suggested that many conservative viewers of the Colbert Report believed that "Colbert only pretends to be joking and genuinely meant what he said."

This year, a team of researchers led by Walter Quattrociocchi set out to investigate online misinformation by analysing publicly available data from 2.3 million Italian Facebook users. The researchers found that posts from mainstream news sources, alternative news sources and political movements reverberate online for a similar amount of time and accrue a similar amount of traction in terms of comments and likes. So far so good: the fact that alternative news sources and political movements are able to compete with mainstream news sources for our attention in the online space has to be a good thing. The findings become more concerning, however, when we consider that the researchers' definition of "alternative news" included not only news sources covering material neglected by mainstream agencies but also pages promoting bogus cancer cures, debunked conspiracy theories, and extremist content.

One of the most dangerous pieces of misinformation addressed in the research is AIDS denialism, a conspiracy theory that has resulted in an incalculable number of deaths. Only recently this blog looked at how Natural News, a mind-bogglingly popular alternative medicine website, pushes this conspiracy theory to an audience of millions of Facebook users. The resulting online footprint dwarfs reliable resources such as the US government's website on alternative medicines. This particular conspiracy theory resulted in 330,000 premature deaths and 35,000 babies born with HIV in South Africa between 2000 and 2005 when the nation's president Thabo Mbeki was taken in by misinformation that appeared online.

Quattrociocchi's team addressed political misinformation by looking at users who responded to "troll" posts by an Italian Facebook page that exclusively posts demonstrably false information, much of which lacks any obvious display of humour. The researchers concluded that individuals who shared "troll posts" were more likely to interact frequently with "alternative information pages." Unfortunately, what isn't clear from the research is whether these users understood the joke they were sharing, which, by itself, renders these findings not particularly meaningful. Indeed, we have absolutely no way of knowing what proportion of users liked a post because they appreciated the satire. Both of my Italian friends who helped me translate the content from the "troll" page thought it would be near impossible for someone to believe any of the posts were actually true. The posts consist of obviously false news stories and blatantly misattributed quotes scattered between viral memes of fluffy animals and bad Photoshop jobs. One of the latest posts, for example, claims that Idaho and Washington have left the United States. I find it hard to imagine how anyone could fail to realise such a post was not factual, but I'd be intrigued to see a similar study conducted on better disguised operations such as The Onion or The Daily Currant. Preliminary evidence appears in a blog that charts amusing instances of people falling for satirical stories on The Onion:

Many of the stories have comment threads with multiple different commenters falling for the hoax en masse:

It's very easy to poke fun at anonymous individuals on the internet, but the list of "mainstream" news agencies The Onion has fooled is extensive. Fox News reported that President Obama sent a rambling 127-page email to the nation, Iran's Fars News Agency reported that Iranian president Ahmadinejad was more popular among rural white Americans than Obama, and China's Communist Party newspaper reported that Kim Jong Un was voted the "Sexiest Man Alive." Two Bangladeshi newspapers even reported that Neil Armstrong held a press conference in which he admitted the moon landings were a hoax. It seems Poe's Law might need updating:

"Without a blatant display of humor, it is impossible to create a parody of extremism or fundamentalism that someone won't mistake for the real thing."

Back to the Quattrociocchi study, another important issue is the vague application of the term "alternative news." The researchers' alternative news category includes everything from political extremism to groups simply aiming to share information that isn't widely reported. If the sources were divided based on the reliability of the content rather than the status of the news source, the results would be more meaningful. As regular readers of this blog will know, the implication that mainstream news is necessarily more reliable than an "alternative news" source is a fallacy. This was most recently demonstrated by the media furore around the alleged hoax of the three-breasted girl, which was debunked by an "alternative news" website.

In another paper, Quattrociocchi's team looked at the polarisation of Facebook activity among readers of scientific news and alternative news. Polarised users were defined as users for whom 95% of their "like activity" was on one category of page. The researchers demonstrated how polarised followers of science news occasionally comment on alternative news, but polarised followers of alternative news hardly ever comment on science news, suggesting they reside within a very narrow echo chamber:

The researchers again looked at online satire and "trolling" intended to mock followers of conspiracy theories. For example, the researchers cited the false claim that Viagra could be found in "chemtrails" and the idea that a source of "infinite energy" had been discovered. The researchers again found that the majority of these types of posts were liked by polarised followers of alternative news sources whilst only a minority of these posts were liked by polarised followers of science news. The results suggest that much of the satire shared on social networks may be from people who are oblivious to the satirical nature of their own posts. Unfortunately, the results of the study are not clear enough to let us know if this is actually what is going on.

The next study looked at the effects of trolling on conspiracy theorists compared with the effects of legitimate debunking efforts. The researchers looked at 1.2 million Italian Facebook users and again sorted the sample, this time isolating users for whom 95% of their likes were on either conspiracy posts or science posts. The researchers determined that 225,225 users were polarised consumers of science news whilst a whopping 790,899 users were classified as polarised consumers of conspiracy theories. This number seems staggeringly high, which raises the question of how the researchers defined conspiracy theories. (I've emailed the researchers and will update this post if I get a response). The researchers concluded that efforts to satirise, troll and debunk conspiracy theories all in fact bolstered conspiracy theorists' commitment to their narrative:

"The more a user is engaged, the more a contact with a troll post will reinforce the probability to remain a polarized user in his category."

The conclusion above provides further evidence for the backfire effect, a phenomenon we recently looked at in quite some depth on this blog. While this conclusion is hardly controversial, the next conclusion is one I'm not so sure about:

"Conspiracy theories seem to come about by a process in which ordinary satirical commentary or obviously false content somehow jump the credulity barrier, mainly because of the unsubstantiated nature of conspiracy related information [sic]."

I have not yet seen any convincing evidence that content originating in satire has spawned lasting conspiracy theories. I am certainly not convinced the harm done by satire to the overly credulous part of its audience (presumably a tiny fraction of the population) is greater than the power satire has to encourage us to think critically and improve our understanding of the world. Satire and hoax news are good business because they get people clicking and get them engaged in discussions, which, in my eyes, can only be a good thing. The latest studies are interesting as a proof of concept. They are also evidence of the birth of a new era of research on how we come to believe what we believe, enabled by the massive amounts of data becoming available to researchers from social networks such as Facebook.

As for understanding how misinformation really originates and why it tends to spread like wildfire, I'm inclined to come down on the side of the king of satire himself, John Cleese:

Update (8th Oct 2014 16:19): I have received a response from Walter Quattrociocchi stating: "I read your post on the blog and some points you claim as misleading have been corrected during the peer review process." In response to my inquiry whether the preprints discussed in this post have been accepted for publication I received the response: "Collective attention in the age of (mis) information is currently under review to Computers in Human Behavior (publication expected for the end of the year), Science Vs Conspiracy in the age of (mis)information is currently under review on Plos One (publication expected for the end of the year), Social determinants of content selection in the age of (mis)information has been accepted at Socinfo 2014".
