It was only a few minutes after my imaginary Trump supporter “Todd White” began exploring Facebook that he learned filmmaker Michael Moore was staging a coup d’état against President-elect Donald Trump. Todd also learned that Trump won the popular vote. And that there were people paid to protest at Trump rallies.

None of that is true, of course. That’s the sort of fake news that was disseminated by Facebook—bogus content that many believe was written by partisan groups to influence the election. That belief was apparently confirmed Thursday, as The Washington Post reported that Facebook had sold ads to Russian “troll farms,” presumably to influence the election. Previously, Senator Mark Warner (D-Va.) had warned that potentially thousands of “trolls” had published posts to spread disinformation about Hillary Clinton.

But how do Facebook users end up seeing it? During November 2016, we decided to test who’s seeing this partisan fake news, who’s supplying it, and just how obvious it is.

We began our investigation on Nov. 21, 2016, as the fake-news controversy gained momentum—and Facebook and Google began blocking sites that traffic in disinformation from their respective advertising networks. We set up two Facebook accounts, one favoring Hillary Clinton, and the other supporting Trump, then let Facebook recommend a series of news pages. In effect, we were asking Facebook to be our news service.

Then we sat back and watched the news roll in. We looked closely at each post to determine whether it was real news, fake news, or something in between.

Fake news is a real problem

Questions about Facebook’s role in spreading fake news were raised almost as soon as Trump shocked the world with his victory. BuzzFeed and other news sites began publishing reports about how a small town in Macedonia turned fake election news into a cottage industry.

It appears the authors behind the fake news reports had no partisan agenda. They were just in it for the money. One creator claimed he could make $10,000 per week in ad revenue from stories that were shared among Trump supporters.

Mark Hachman

Fake or just partisan?

The Macedonians may still be at it, because our Republican supporter, Todd White, was flooded with partisan posts. Worse, over a little more than two days, we counted 10 such posts in his feed that were fake, most accusing Democrats or their supporters of illegal activity. In all, White was clearly exposed to more spin than his Democratic counterpart, Chris Smith, who saw exactly zero fake news stories.

But the problem goes beyond fake news. As Facebook’s feeds prove, we live in a “post-truth” world, where the line between partisan spin and outright lies is practically indistinguishable.

What our Democratic persona, “Chris Smith,” sees when using Facebook.

Letting Facebook choose the news

To conduct our experiment, I opened Google Chrome in Incognito mode, then created two Gmail addresses. I then used both email addresses to register for new Facebook accounts—“Chris Smith” for Clinton, and “Todd White” for Trump. To eliminate hidden biases, I registered them both as white males, each with the same birthday.

For Smith, I then Liked three people: Hillary Clinton, Joe Biden, and President Barack Obama. For White, I Liked Donald Trump, Mike Pence, and Newt Gingrich.

I then asked Facebook to recommend Pages to follow. Facebook provides two mechanisms for doing this: a “Like Pages” page in the left nav bar, which provides a visually compelling tiled layout of suggested Pages, and a similar list of suggested Pages next to the Pokes section. For each of my test profiles, I systematically selected the first, fourth, and seventh from the list of Pages next to Pokes. Then I added the first seven suggestions from Like Pages later that night, for a total of 10 per avatar.
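For readers who want to replicate the selection rule, it reduces to picking fixed positions from each suggestion list. Here’s a minimal Python sketch; the list contents are hypothetical placeholders, since in the experiment they came from Facebook’s own recommendations:

```python
# Sketch of the Page-selection rule applied to each test profile.
# The suggestion lists here are hypothetical stand-ins for what
# Facebook actually recommended.

def pick_pages(pokes_suggestions, like_pages_suggestions):
    """Take the 1st, 4th, and 7th Pages from the Pokes-adjacent list,
    then the first seven from the "Like Pages" list: 10 per profile."""
    from_pokes = [pokes_suggestions[i] for i in (0, 3, 6)]
    from_like_pages = like_pages_suggestions[:7]
    return from_pokes + from_like_pages

pokes = [f"PokesPage{n}" for n in range(1, 10)]
like_pages = [f"LikePage{n}" for n in range(1, 10)]
selected = pick_pages(pokes, like_pages)
print(len(selected))  # 10 Pages per avatar
```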

Note that I deliberately didn’t Like pages like alt-right news service Breitbart.com, as I wanted to see if other pages would reference them. (Surprisingly, they often didn’t.) I was testing what Facebook offered my avatars, more than what these avatars might actively solicit. I also made no friends on the service—again, to test Facebook, not other humans.

Facebook suggests Pages like these to follow. For someone new to the system, this is what they might click upon. Note that some conservative sites snuck in.

Smith ended up with Pages like “Exposing Facts to the Misinformed Viewers of Fox News,” “Hillary Clinton, Democratic News,” and “Rude and Rotten Republicans.” White landed such gems as “Hillary for Prison,” “TRUMP TRAIN,” and “I hate Hippies and their stupid light bulbs.”

I was putting my trust in Facebook. Would Facebook show me Pages that believed in trusted news sources? Or would Facebook toss me into the maelstrom of partisan news, some of it fake?


Into the cesspool

Immediately I saw some clear distinctions between my two Facebook users, Smith and White. For one, Trump fan White saw many, many more posts compared to Smith: 129 versus just 41, over the course of about two and a half days. Granted, this was partially due to the whims of the Pages recommended by Facebook. It’s likely (or at least possible) that White’s news sources are more prolific posters. Nonetheless, it appears that conservative Facebook viewers are being flooded with posts.

Another post in the conservative Facebook feed.

Second, rarely did conservative Pages reference so-called mainstream media. Instead, they tended to regurgitate posts from other Facebook Pages and right-wing blogs—sites like AmericasFreedomFighters.com and USASupreme.com. Facebook didn’t show my avatar any outright hate sites, though Photoshopped images of a “sickly” Hillary Clinton certainly wandered into that territory.

Third, although Clinton lost, my pro-Clinton page was bombarded not by anti-Trump messaging, but rather by pro-Clinton messaging. The pro-Trump page was split about 50-50, I’d say, between pro-Trump posts and insults directed at Clinton and other Democrats and liberals.

The question that we set out to answer, though, was how many partisan fake news stories we saw. In our study, 10—and that’s 10 too many if you believe that Facebook should be held to accuracy standards.

Fake news and propaganda

As I skimmed through each post on the feeds of Smith and White, I tried to characterize each post: Was it politically neutral? Was it clearly partisan? Fake? Or simply a non-political post that would qualify as none of the above?

One of the posts in the feed of “Chris Smith,” our Democratic Facebook user.

A significant number of posts on both sides were largely neutral, or slanted so slightly that I gave them the benefit of the doubt. In total, Smith, the Democrat, saw 12 political posts, 23 slanted posts, and six posts that I characterized as non-political. None were fake.

White, the Republican, saw 33 political posts and 79 slanted posts—many more posts in general, but a higher percentage of slanted posts within his overall News Feed. Facebook also chose to show White the 10 fake posts, as well as seven that weren’t political.
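The tallies above reduce to a simple count over hand-labeled posts. As a minimal sketch, assuming each post was assigned one of four categories by hand (the category names and the `summarize` helper are my own; the numbers are the article’s counts):

```python
from collections import Counter

# Hand-assigned category counts from the two test feeds.
smith = Counter(political=12, slanted=23, fake=0, nonpolitical=6)
white = Counter(political=33, slanted=79, fake=10, nonpolitical=7)

def summarize(name, counts):
    """Print the total post count and the share of slanted posts."""
    total = sum(counts.values())
    slanted_pct = 100 * counts["slanted"] / total
    print(f"{name}: {total} posts, {slanted_pct:.0f}% slanted, "
          f"{counts['fake']} fake")

summarize("Smith (D)", smith)  # 41 posts total
summarize("White (R)", white)  # 129 posts total
```

Run this way, the slanted share works out to roughly 56 percent of Smith’s feed versus roughly 61 percent of White’s, which is what “a higher percentage of slanted posts” refers to.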

We’ve listed all the fake posts we found at the end of the article. While a couple of them were obviously faked, most were plausible—just as plausible as stories that I thought were false, and turned out to be completely true. Will carrying a medical marijuana card prevent you from owning a gun? Sounds incredible, but yes, that story is true. Then there’s the piece on Paul Schrader, the writer of Taxi Driver, who apparently advocated violence after Trump’s election. That’s true as well, and he apologized for it on Nov. 15. Discovering that such outlandish stories are indeed factual helps reinforce the idea that other seemingly dubious articles can be factual, too.

Is this factual? Depends on how you see it.

But it’s the stories that fall somewhere in between that can be confusing. Is Paul Ryan really trying to get rid of Medicare? He may not have said so explicitly, but if you’re a Democrat, you probably believe he is. Picture “memes” add another element: They may not explicitly tell an untruth, but they can imply as much through innuendo. Most of Facebook’s political posts fall somewhere in this middle ground between truth and fiction, and it can be exhausting trying to label them as one or the other.

“People Also Shared” posts typically either confirm the post above or simply take the topic in new directions. In this case, we weren’t able to confirm or deny the first story that Clinton was too intoxicated to speak on Election Night, so it went into our “slanted” category.

One important problem is that Facebook doesn’t just show you posts from Pages you’ve Liked. The site also suggests posts that other users have shared, as well as what it calls Related Articles. In both cases, that means certain posts are “reinforced” by other similar posts placed directly beneath them, with stories that seemingly back up what’s being shared as actual truth. (Occasionally, Facebook also promotes fact-checking sites like Snopes.com to either back up or debunk the story in question, but that’s far rarer.) The upshot, though, is that the post in question seems to be true, because of this apparent confirmation by other reports.

Major change is needed at Facebook

Facebook chief executive Mark Zuckerberg has scoffed at accusations that fake news affected the election. “Personally, I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way—I think is a pretty crazy idea,” Zuckerberg said on Nov. 11.

Zuckerberg’s numbers may be right. But he seems to be conflating the volume of fake news with the impact of fake news, ignoring the power of half-truths, omissions, and outright lies to spread misinformation and confusion.

Negativity—and half-truths—aren’t just confined to the conservative side.

Even President Barack Obama has voiced his concern about fake news. Speaking at a November 18 press conference in Berlin during a visit with German Chancellor Angela Merkel, Obama remarked, “If we are not serious about facts and what’s true and what’s not—and particularly in an age of social media where so many people are getting their information in soundbites and snippets off their phones—if we can’t discriminate between serious arguments and propaganda, then we have problems.”

Facebook did not return an emailed request for comment. Recent reports indicate the company is aware of the problem, but it may be struggling to address it while also distancing itself from earlier allegations of liberal bias.

Facebook vice president of product management Adam Mosseri has acknowledged that the company’s efforts to verify stories don’t go far enough. “It’s important that we keep improving our ability to detect misinformation,” he said. “We’re committed to continuing to work on this issue and improve the experiences on our platform.” Most recently, over the weekend, Facebook said it would employ third-party fact-checkers to verify news posted on its site.

This sounds like one of the Big Problems that Silicon Valley companies are forever setting out to solve. And it’s not going away. The clock’s ticking on the midterm elections, meaning Facebook has less than two years to make real changes around fake news.

Fake posts

Here’s a list of the fake news that Todd White, our fake GOP supporter, encountered while on Facebook:

“Soros Can Face Prison Under U.S. Code › Title 18 › Part I › Chapter 115”: Is George Soros planning the next American Revolution? Not really.