Did Facebook Really Tolerate Fake News to Appease Conservatives?

Facebook has spent the past week denying that fake pro-Trump news on its platform played a role in the outcome of the U.S. election. On Monday, Gizmodo published a report that, if true, would severely puncture Facebook’s credibility. The tech site reports:

According to two sources with direct knowledge of the company’s decision-making, Facebook executives conducted a wide-ranging review of products and policies earlier this year, with the goal of eliminating any appearance of political bias. One source said high-ranking officials were briefed on a planned News Feed update that would have identified fake or hoax news stories, but disproportionately impacted right-wing news sites by downgrading or removing that content from people’s feeds. According to the source, the update was shelved and never released to the public. It’s unclear if the update had other deficiencies that caused it to be scrubbed.

Facebook was quick to dispute the report. A spokesperson emailed this statement:

The article’s allegation is not true. We did not build and withhold any News Feed changes based on their potential impact on any one political party. We always work to make News Feed more meaningful and informative, and that includes examining the quality and accuracy of items shared, such as clickbait, spam, and hoaxes. Mark himself said, “I want to do everything I can to make sure our teams uphold the integrity of our products.” This includes continuously review updates (sic) to make sure we are not exhibiting unconscious bias.

So who’s telling the truth here? Without knowing for sure, I can offer a few insights that might be helpful in assessing the competing claims.

First of all, any report that relies so heavily on a single anonymous source should be regarded warily. Yes, there are two anonymous sources for the first major claim in the Gizmodo post, which is that Facebook undertook a review aimed at eliminating the appearance of bias. But that part is not particularly controversial, even if it’s true. Facebook itself said earlier this year it would begin to train employees on unconscious political bias, among other unconscious biases. That’s just one of several moves it made in response to charges of liberal bias in its trending news section.

In fact, a company spokesperson told me that the news feed team had built two different options for a ranking update to address clickbait this past summer. The first relied on user reporting and behavior to identify stories as clickbait and limit their distribution, much as the company already does with hoaxes. The second used a machine-learning classifier to identify clickbait automatically. The spokesperson told me the machine-learning system performed better, so the company ended up shipping that one.

The spokesperson did not elaborate on the exact criteria by which it performed better. Did the user-reporting system fail to reduce clickbait and other misleading content? Or was the problem that it resulted in overreporting, or incentivized users to report legitimate stories that happened not to align with their views?

It is true, generally speaking, that the company’s decisions about news feed changes are guided by large amounts of both behavioral and survey data on how they impact user engagement and satisfaction. It has shown itself willing over the years to endure major PR backlashes in its quest to optimize for those goals. It would thus be out of character for Facebook to scrap a proposal that performed well on its user metrics out of concern for its political ramifications. And since even Gizmodo’s anonymous source couldn’t rule out that the update was shelved for other deficiencies, the implication that politics drove the decision should be treated with due skepticism.

Still, that doesn’t mean it’s impossible that Facebook considered the political fallout of various approaches to fake news. And it becomes a little more plausible when you consider the degree to which Facebook appears to have freaked out over charges of liberal bias in its trending news section earlier this year. (Symmetry alert: Those charges were themselves lodged most vocally by a single anonymous former contractor in a May 9 Gizmodo post by the same author, Michael F. Nunez.) As that tempest unfolded, Facebook backpedaled furiously, issued a rare apology, held a summit with conservative leaders, and ultimately reworked the entire trending news feature. It laid off the whole team of journalists who were editing the section, dispensed with headlines and context, and further automated the process of story selection and source verification. It did this, as I reported, even though it was clear to those involved that this would result in an inferior news product, including fake news stories and the inclusion of topics that weren’t news at all.

Facebook’s trending section was always a sideshow. The news feed is its core product, and its algorithm is the company’s most precious asset. The company’s willingness to defile its own trending news section to appease conservatives doesn’t necessarily mean it would do the same to the news feed. That said, it does lend the latest charges of political cowardice a bit more credence than they might otherwise have had.

So here, with the Gizmodo piece, we have an epistemologically fascinating case in which Facebook is claiming that a news story about its efforts to crack down on false news stories is, in itself, a false news story. If nothing else, it provides a vivid example to support CEO Mark Zuckerberg’s point that appointing Facebook the arbiter of truth in journalism might come with some pitfalls of its own.