How Social Media Played a Role in the 2016 Presidential Election

Tons of fake accounts were made.

Representatives from Facebook, Twitter, and Google were in the hot seat on Capitol Hill last week for the tech companies’ roles in spreading Russian propaganda and disinformation. Facebook had already disclosed that Russian-backed content reached at least 126 million people through its platform during the 2016 U.S. presidential election. Twitter identified 2,752 accounts on its platform controlled by Russians and more than 36,000 bots that tweeted 1.4 million times during the election.

Some of these fake accounts, backed by the Internet Research Agency (IRA), a “troll farm” with ties to the Kremlin, intentionally posted divisive and false content at key points during the campaign. These included “Black Matters,” a Facebook page meant to rile supporters of the Black Lives Matter movement, and “Heart of Texas,” a page that promoted the secession of Texas.

At the hearing, Colin Stretch, Facebook’s general counsel, acknowledged, “In hindsight we should have had a broader lens. There are signals we missed.” Facebook, Twitter, and Google pledged to do more to prevent political manipulation by foreign agencies in the future. Twitter even acknowledged that it knew about the accounts, but did not take action.

When asked for comment on the issue, a representative for Twitter directed Teen Vogue to the company’s @policy account for “data, insights, and full public statements from the hearings.” One tweet on the page said the company had suspended all 2,752 accounts linked to the IRA and had given committee investigators the handles (usernames) of those accounts.

A representative for Facebook told Teen Vogue, “The foreign interference we saw is reprehensible and outrageous and opened a new battleground for our company, our industry, and our society. That foreign actors, hiding behind fake accounts, abused our platform and other internet services to try to sow division and discord—and to try to undermine our election process—is an assault on democracy, and it violates all of our values.”

Facebook took down Russian-linked ads after the election because the operators of the accounts misrepresented who they were, not because of the content they posted. The company could not commit to banning future Russian agencies unless they violated its terms of service.

At the heart of this issue lies a difficult reality: These technology companies don’t face the same regulations as news organizations. As Representative Adam Schiff of California pointed out during the hearings, Facebook and Twitter use algorithms that reward viral content, much of it fear- or anger-based. That puts some of the responsibility for evaluating content on the reader.

Ohio State University communications professor Kelly Garrett tells Teen Vogue that social media most likely did shape people’s beliefs on “fake news” and the candidates during the presidential election, but it was a small effect. “Social media is influencing how people feel about their opponents — the polarization of the two parties — and social media seems to be driving that polarization up,” Garrett says.

This happens because people often read things that confirm their beliefs and what they want to believe. “People are biased,” Garrett says. “With social media, people won’t hesitate to accept a claim they already believe — they will be tempted to share based on the headline alone. You don’t need to read it to know that you want to believe it. But when we are faced with something that challenges our beliefs, we are much more skeptical. We become a much more critical thinker and we look at the source. We often don’t realize how quickly we accept the things we want to believe and think it’s quite reasonable to question things we disagree with.”

Garrett recommends thinking critically about everything we read. Before you share content, ask yourself whether it makes a factual claim. Are you prepared to stand behind that claim based on what you know? If the answer is no, that tells you something about the source. Spend some time checking another credible source or two.

While social media giants and legislators consider their options, readers who do this can make their own informed decisions and avoid being manipulated.