[Photo: computer screens display fake tweets generated on a Chinese website. Photograph: Ng Han Guan/AP]

The Collins Dictionary word of the year for 2017 is, disappointingly, “fake news”. We say disappointingly, because the ubiquity of that phrase among journalists, academics and policymakers is partly why the debate around this issue is so simplistic. The phrase is grossly inadequate to explain the nature and scale of the problem. (Were those Russian ads displayed at the congressional hearings last week news, for example?) But what’s more troubling, and the reason that we simply cannot use the phrase any more, is that it is being used by politicians around the world as a weapon against the fourth estate and an excuse to censor free speech.

Definitions matter. Take, for example, the question of why this type of content is created in the first place. There are four distinct motivations for why people do this: political, financial, psychological (for personal satisfaction) and social (to reinforce our belonging to communities or “tribes”). If we’re serious about tackling mis- and disinformation, we need to address these motivations separately. And we think it’s time to give much more serious consideration to the social element.

Social media force us to live our lives in public, positioned centre-stage in our very own daily performances. Erving Goffman, the American sociologist, articulated the idea of “life as theatre” in his 1956 book The Presentation of Self in Everyday Life, and while the book was published more than half a century ago, the concept is even more relevant today. It is increasingly difficult to live a private life, in terms not just of keeping our personal data away from governments or corporations, but also of keeping our movements, interests and, most worryingly, information consumption habits from the wider world.

The social networks are engineered so that we are constantly assessing others – and being assessed ourselves. In fact our “selves” are scattered across different platforms, and our decisions, which are public or semi-public performances, are driven by our desire to make a good impression on our audiences, imagined and actual.

We grudgingly accept these public performances when it comes to our travels, shopping, dating, and dining. We know the deal. The online tools that we use are free in return for us giving up our data, and we understand that they need us to publicly share our lifestyle decisions to encourage people in our network to join, connect and purchase.

But, critically, the same forces have impacted the way we consume news and information. Before our media became “social”, only our closest family or friends knew what we read or watched, and if we wanted to keep our guilty pleasures secret, we could. Now, for those of us who consume news via the social networks, what we “like” and what we follow is visible to many – or, in Twitter’s case, to all, unless we are in that small minority of users who protect their tweets. Consumption of the news has become a performance that can’t be solely about seeking information or even entertainment. What we choose to “like” or follow is part of our identity, an indication of our social class and status, and most frequently our political persuasion.

When we try to understand why people share misleading, manipulated and fabricated information, we need to appreciate that those shares and retweets perform an incredibly important function that has little to do with veracity. The act of sharing is often about signalling to others that we agree with the sentiment of the message, or that even if we don’t agree, we recognise it as important and worth paying attention to. We want to feel connected to others, and these mini-performances allow us to do that.

Understanding this is easier if we read the work of media scholar James Carey. He argued that the dominant lens through which we understand communication is a “transmission model”, with a focus simply on the mechanics through which a message is transmitted from Sender A to Receiver B. However, he said, we should actually view communication through the lens of ritual if we want to understand why people seek out, consume and make sense of information. From this vantage point, Carey argued: “News is not information, it is drama.” A ritual view of communication views “reading a newspaper less as sending or gaining information and more as attending a mass”, where “a particular view of the world is portrayed and confirmed”.

When we consider many of the solutions being proposed to tackle the spread of disinformation, it certainly seems that the focus is on this transmission model. Ideas such as flagging disputed content are founded on the idea that information consumption is rational. If we are serious about slowing down the dissemination of mis- and disinformation, we need to start recognising the emotional and social drivers that shape people’s relationship with information.

There has been much discussion over the past year about the need for us to pop our filter bubbles, to follow a much more diverse set of people and accounts. But how do we do this when those actions are public? Do we need to explain to our network why we are following that hyper-partisan Facebook page that sits at the opposite end of the political spectrum from our own views? And how do we “heart” a tweet to return to later for research when that action is visible to others? Seeing Twitter tell you that your most ardent Trump-hating friend just “liked” one of his tweets can be jarring.

As the French thinker of the 1960s Guy Debord might have put it, we have historically evolved from being informed, to having information, to merely appearing informed. While the architecture of the platforms isn’t the root cause of why mis- and disinformation are being created on the scale we’re now seeing, these features are a significant reason they are being disseminated. And when the algorithms that power these networks are designed to capitalise on our emotional responses, while the proposed solutions demand rational ones, no significant change is likely.

• Claire Wardle is a research fellow at the Shorenstein Center on Media, Politics and Public Policy, and leads the non-profit First Draft. Hossein Derakhshan is a writer and researcher. They recently co-authored the report Information Disorder, commissioned by the Council of Europe