At war in social media

Social media has become the arena for an information war, where private companies offer fake net identities for hire to shape opinion. This is how the world of bots and propaganda works, write Karin Pettersson and Martin Gelin in their book “Internet är trasigt” (The Internet is Broken).

Alabama is in the Deep South of the United States, where racism is still very much alive. It was here that Donald Trump held a rally in the fall of 2017 and chose to vent his spite on the black American football players who, instead of honoring the flag during the national anthem, knelt in protest against racism. “Wouldn’t you love to see one of these NFL owners, when somebody disrespects our flag, to say, ’Get that son of a bitch off the field right now’?” Trump’s words were clearly intended for his core voters in Alabama. His comments exploded on social media and later in the traditional media as well. The topic was ideal for Trump, who feeds on culture wars, polarization and wrath.

But it was also perfect for the Russian president, Vladimir Putin. That could be seen in real time on the monitoring site Hamilton 68, which tracks more than 600 Russian propaganda accounts. The Russian propaganda apparatus loved the story and did everything it could to amplify it. The accounts tweeted like mad, invented new hashtags and spread every possible angle of the story.

They most likely contributed to the topic attracting so much attention and staying so prominent in the public debate for so long. The Kremlin, it turns out, is very astute at propaganda. And the structures that Adrian Chen and Jessikka Aro identified in 2014 have only grown more powerful and efficient since then. Around the world, countries are building social media armies, either to protect their own populations from foreign influence and from efforts to push them in a certain direction, or, as in the case of Russia, to control the debate at home and abroad.

“Around the world, countries are building social media armies.”

Estonia offers one example of cyber defense. After being subjected to heavy IT attacks in the spring of 2007, the country formed something that resembles a digital home guard, the “Estonian Cyber Defense League”: ordinary people who, in collaboration with the armed forces, counteract Russian propaganda. In a report about Russian influence operations in the Baltic states, Stratcom (Nato’s strategic communications center) showed that 85 percent of all Russian-language tweets about Nato came from bots.

One of the reports produced by the Oxford Internet Institute describes what is possibly the hardest phenomenon to get at, and one that is becoming more frequent: private companies producing fake net identities which can then be used to influence what impact a company or political idea has on the net. An example from Poland reveals how a marketing and communication firm over several years created 40,000 false identities, all of them with real names, IP addresses and personalities. Each of these “persons” then has several accounts on different social media, offering the possibility of making a heavy impact for anyone who can afford it.

15 fake accounts at a time

The theory behind this business is called “false amplification”: giving material and opinions a reach they would otherwise not have. In practice, real persons manage the false accounts, as many as 15 at a time. The trick is to use real photos and make the accounts look “real” so that they are not caught by the filters with which the social media companies try to weed out fake identities. These fake accounts are then used to write in comment boxes, in Facebook groups or on Twitter. They use VPNs or false IP addresses, and because they are so cleverly masked they become practically impossible to detect, both for the social media companies and for outside observers. This in turn creates safety and distance for the buyer of the service. The strategy the companies apply is not to inundate social media with their message and get hashtags trending, but something much subtler: influencing opinion leaders, including journalists, politicians and activists. This is done by infiltrating important Facebook groups, writing comments and interacting directly with the people it is important to reach. The aim is to influence these people by making them believe that the false identities genuinely hold the views they argue for.

Still, many people, including journalists, believe that what you see on social media expresses the will of the people or mirrors the population. But without qualitative research it is very hard to draw any conclusions about what it means when certain topics are trending, or not trending, on social media.

In the fall of 2017, Facebook, Twitter and Google were summoned to the US Congress to testify on how their platforms had been used by Russia to influence the election. Facebook’s testimony included, among other things, Russian advertisements. In Pennsylvania, one of the states where the outcome of the election was uncertain, the Facebook account “Being Patriotic” posted ads for a demonstration that was going to gather “miners for Trump”. The ad was aimed at male voters. “Donald Trump has said that he is going to give miners their jobs back” was the message. He had not said that, but that didn’t bother Moscow. Indeed, an internal investigation at Facebook later revealed that the ads were part of a propaganda push controlled and paid for by Russia. Trump won Pennsylvania by a margin of less than 45,000 votes.

A frightening naivety

The weaknesses of the internet have been exploited by states for several years. In many places, however, there is a frightening naivety about this. That goes for journalists too, who are still too uncritical when citing content from social media, even though the far right has organized Twitter storms for years with the aim of influencing the media and public opinion. Simply put, social media is not a place where people’s interests and views are reflected in a neutral manner. It is an arena for organizing, chatter, entertainment and information war. It is also a world where those with the loudest, most polarizing and most controversial opinions make the heaviest impact. In many (Swedish) newsrooms, awareness of influence campaigns and of the need for fact-checking and verification has increased. But elsewhere there is still a great naivety about the propaganda war of the new age, and about what role media outlets themselves risk playing if they don’t understand the new rules of the game.

NAME: Karin Pettersson

TITLE: Director of Public Policy

YEARS IN SCHIBSTED: 9

I LOOK FORWARD TO: Helping to strengthen Schibsted’s voice in the discussion on data monopolies and the future of news.

NAME: Martin Gelin

TITLE: Former freelance contributor, New York correspondent for Dagens Nyheter.