Did older Facebook users sharing fake news really help elect Trump?

Did social media change the outcome of the 2016 presidential election in the US and the 2016 Brexit referendum in the UK? There is much concern about the role of Facebook and Twitter, particularly in promoting fake news, but little research proving what impact they have.

Now we have another small piece of the puzzle. A team in the US has looked at how many Facebook users shared fake news in 2016 and who they were.

Andrew Guess at Princeton University and colleagues sent an online survey to 3500 people in the US during the 2016 election campaign. They persuaded 1300 to temporarily share their Facebook profile data showing what stories they shared. “That’s actually way higher than we expected,” says Guess.


The results suggest that Republicans over the age of 65 were seven times more likely to share fake news than respondents of any political leaning aged between 18 and 29. “You want to be careful not to confirm your biases,” says Guess. “But this could be one of those cases where the data confirm what people suspect.”

Blame the elderly?

There are a few complicating factors. While Republicans were more likely to share fake news, this could be because most fake news had a pro-Donald Trump bias – like the story that falsely claimed that the Pope had endorsed Trump.

Guess also points out that elderly people were more likely to share fake news regardless of their political views. Around 11 per cent of over 65s shared fake news stories compared with just 3 per cent of those aged between 18 and 29.

Overall, 9 per cent of respondents shared fake news. That might seem a high proportion to some. “It’s a lot,” says Charlie Beckett of the London School of Economics.

Depending on how you define fake news, the proportion of people sharing it could be even higher. Guess and colleagues only looked at “knowingly false or misleading content created largely for the purpose of generating ad revenue”. “That’s quite a small subset,” says Beckett.

Despite this, both the paper and accompanying press release explicitly suggest the sharing of fake news has been exaggerated, using terms such as “Less than you think”, “relatively rare”, and “a small percentage”.

Wildly inflated

“The focus tends to be on headline-grabbing engagement numbers that are likely wildly inflated,” says Guess. “So we’re trying to push against such perceptions with good representative data… The point is that both consumption and sharing [of fake news] seem to be concentrated among a relatively small percentage.”

This brings us to the controversial issue of whether social media really is swinging elections. This study did not address that, but it cites a 2017 study suggesting that fake news stories tend to reinforce existing opinions and are unlikely to be persuasive enough to change minds and swing elections.

Maybe. But even small effects can swing elections. What’s more, social media studies like these tend to look at snapshots in time. There could be a cumulative effect over years. “In the short term you are not seeing people being converted. But you are building up the framing, often in a more divided, heated and polarised way,” says Beckett.

But completely made-up stories are just part of a much broader issue with misinformation, says Beckett. For instance, if people read and share lots of stories about, say, immigrants committing crimes, they can be left with the false impression that immigrants as a whole commit more crimes. The individual stories may be true, but their overall impact can build a mistaken view of the world.