How Russian trolls got into your Facebook feed

On Wednesday, Congress released some of the 3,000 Facebook ads, along with Twitter accounts, created by Russian operatives to sway American voters. You can explore them in an analysis The Washington Post published.

These disturbing messages, seen by up to 126 million Americans, raise thorny questions about Silicon Valley’s responsibility for vetting the information it publishes. Beyond Washington, they leave all of us who use social media to keep up with friends, share photos and follow news wondering: How’d the Russians get to me?

The short answer is Silicon Valley made it very easy.

Facebook’s top lawyer told Congress on Wednesday the Russian effort was “fairly rudimentary.” Here’s what he meant: Ever notice a Facebook ad that’s eerily relevant to something you’ve been talking about? Had an ad for a pair of sneakers follow you around the Internet for a week? Or seen an ad that says your friend “liked” it?

That’s the occasionally creepy handiwork of advertising tech, which covertly tracks much of what you do online and then sells access to you to the highest bidder. We’re just now waking up to the fact that it’s not only traditional marketers and legitimate political campaigns buying in. It’s also Russian trolls hoping to manipulate you.

You were in Russia’s crosshairs if you liked the Facebook page of Donald Trump or Hillary Clinton. Same goes for people who said they were fans of Martin Luther King Jr. Russians even targeted people who shared enough stuff about the South that Facebook tagged them as being interested in “Dixie.”

There’s no way to tell if you personally saw a Russian post or tweet. I’d certainly like to know, but Facebook so far hasn’t told individuals whether they were exposed to posts from a troll farm called the Internet Research Agency. (Ads paid for by that group made up the bulk of the trove published on Wednesday.)

Facebook lawyer Colin Stretch on Wednesday told Congress the social network had notified Facebook members broadly about the issue, but it would be “much more challenging” to identify and notify specific people.

Facebook’s advertising systems are largely automated, so no human had to check before these ads went online. Often they originated from groups with legitimate-sounding names, such as “Donald Trump America.” Facebook and Twitter have now taken down posts they suspect to have “inauthentic” Russian roots and instituted new review systems. Legislators are threatening new laws that could further rein them in.

Of course, you didn’t have to click on these posts, or believe what they were pitching. But social media tech is particularly good at making messages irresistible. The Russian trolls didn’t have to spend much money on these marketing techniques to have an impact thanks to precision targeting—and free promotion for buzzy content.

The most basic tool they used is called targeted advertising. By watching what you and your friends share and do on—and off—the social network, Facebook slots you into categories. Some are demographic (age, state, gender) and others are based on things you’ve “liked” and the assumptions Facebook draws about your interests. Facebook will actually show you what it thinks of you, in its ad preferences settings. (It also lets you edit the categories; doing so could make its ad targeting even more effective.)
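The matching logic behind this kind of category-based targeting is simple in principle. Here’s an illustrative sketch (the category labels and data are hypothetical; Facebook’s actual system is far more elaborate):

```python
# Illustrative sketch of interest-based ad targeting.
# Category names and user data are invented for this example;
# this is not Facebook's actual system.

def matches_ad(user_categories, ad_targets):
    """An ad is eligible to show if the user falls into at least
    one of the categories the advertiser targeted."""
    return bool(user_categories & ad_targets)

# Categories the platform has slotted one user into, based on
# demographics and things they've "liked."
user = {"age_25_34", "state_TN", "interest_Dixie", "liked_Donald_Trump"}

# Categories a (hypothetical) advertiser chose to target.
ad = {"interest_Dixie", "interest_Confederate_history"}

print(matches_ad(user, ad))  # True: the user is tagged "interest_Dixie"
```

The point is that once you’ve been slotted into a category, any buyer who selects that category reaches you automatically, with no human in the loop.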

The Internet Research Agency bought ads targeted to people with diverse criteria, ranging from gay and lesbian groups to the Muslim Brotherhood.

The Russian agents also used an ad technique based on tracking and following certain people around the Web. For example, if you at some point clicked on a troll website masquerading as legitimate, the site’s tech could identify your web browser and allow the trolls to “re-target” ads to you elsewhere around the web. On Facebook, Russian operatives used a tool called Custom Audiences to target people in such ways.
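Conceptually, retargeting works by remembering which browsers have visited a site and showing those browsers different ads later, elsewhere. A minimal sketch, with invented identifiers (real ad networks do this with tracking pixels and third-party cookies):

```python
# Minimal conceptual sketch of ad "retargeting."
# Browser IDs and function names are hypothetical; real networks
# use 1x1 tracking-pixel images and third-party cookies.

visitors = set()  # browser IDs the site has seen

def tracking_pixel(browser_id):
    """Stand-in for a tiny invisible image embedded on a page:
    loading it records that this browser visited the site."""
    visitors.add(browser_id)

def choose_ad(browser_id):
    """Later, on an unrelated site, show the retargeted ad only
    to browsers that previously visited."""
    if browser_id in visitors:
        return "retargeted ad"
    return "generic ad"

tracking_pixel("browser-123")    # user clicks through to the site
print(choose_ad("browser-123"))  # prints "retargeted ad"
print(choose_ad("browser-999"))  # prints "generic ad"
```

That’s why clicking a troll site once could mean seeing its ads follow you around the rest of the web.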

Most effective of all: Russian trolls used celebrities—and our own friends—to get to us. For free. For example, in April 2016, rapper Nicki Minaj retweeted a message about an upsetting shooting from the Twitter handle @Ten_GOP. That account looked like it was the Tennessee Republican Party, but it was actually a Russian troll interested in inflammatory content. Minaj’s post was retweeted and “liked” more than 24,600 times. (For the record, the actual Tennessee Republican Party told The Washington Post that it had contacted Twitter three times about its impersonator problem.)

You or your friends might have shared one of these posts on Twitter, Facebook, Pinterest or beyond, which the industry calls “organic” promotion. These posts reached way more than the 10 million people who saw paid ads. On Facebook alone, they found their way in front of the eyes of 126 million Americans.

Geoffrey A. Fowler is The Washington Post’s technology columnist based in San Francisco. He joined The Post in 2017 after 16 years with the Wall Street Journal writing about consumer technology, Silicon Valley, national affairs and China.