Report: Facebook has the tools to fight fake news but isn’t using them

Facebook has a fake news problem. The company itself insists fabricated stories account for an insignificant amount of content on its site, but its critics aren’t convinced, and calls for action have only grown louder since election night.

Facebook founder and CEO Mark Zuckerberg has issued two responses playing down the impact his platform had on the election outcome. His most recent statement reads: “Of all the content on Facebook, more than 99 percent of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics.”

A new report alleges the opposite. According to the article, Facebook is concerned about the spread of misinformation on its site via unreliable “news outlets” but feels it cannot act for fear of being labeled biased.

The problem is one of partisanship, sources with direct knowledge of Facebook’s policymaking claim. The news sites in question are “disproportionately” right wing or conservative-leaning in their editorial content, allege the unnamed individuals who spoke to Gizmodo.

Despite creating News Feed updates with the aim of stamping out fake news, Facebook did not implement the changes because it feared “upsetting conservatives,” claim the sources. Meanwhile, media companies including BuzzFeed and The New York Times have brought to light the falsehoods these sites (many of which are based outside the country) have been peddling.

The company’s crippling inability to tackle the issue is allegedly a direct result of an earlier controversy regarding its Trending Topics feed. In May, a number of ex-Facebook employees charged with overseeing its algorithmic system claimed conservative news was being suppressed on the site. The revelation, and the resulting backlash, led to much soul-searching within Facebook. An internal investigation followed, but produced no evidence of systemic bias. Yet the sources now suggest the damage had already been done.

The Trending Topics incident “paralyzed” Facebook’s drive to alter its News Feed for fear of its objectivity once again being questioned, employees recently told The New York Times. Meanwhile, the changes the company has pushed through (including a crackdown on clickbait and an emphasis on “newsworthy” items) have led to it repeatedly being compared to a media company — much to its chagrin.

Facebook said in a statement that it “did not build and withhold any News Feed changes based on their potential impact on any one political party.” It continued: “We always work to make News Feed more meaningful and informative … This includes continuously reviewing updates to make sure we are not exhibiting unconscious bias.”

However, the contradictions in its recent string of statements do the company no favors. Facebook insists fake news on its service had very little impact on swaying the election, but also touts its voter registration drive that resulted in 2 million people registering to vote. The company puts the onus on its users to flag fake content; the same people its CEO claims tune out information they don’t agree with instead of acting on it. These are exactly the types of inconsistencies that continue to mire Facebook in controversy.

To its credit, Facebook has been transparent when modifying its products. That openness should be integral to its fight against fake news, which continues to tarnish its essential News Feed.