Facebook Ditches Its "Fake News Flag" After People Shared Flagged Articles Even More

Roughly one year ago Facebook promised to eradicate "fake news" from its platform by "flagging" articles that were deemed inaccurate by its hand-selected "fact checkers" (A.K.A. "liberal propagandists")...we wrote all about it in a post entitled "Facebook Launches Campaign To Combat 'Fake News'".

Among other things, the crusade attempted to dissuade people from sharing certain content, like the $100,000 worth of Russian ads that apparently changed the course of human history, by shaming them with a big red caution flag underneath their post. Per The Telegraph, it looked something like this:

Unfortunately, after a year of trials, it seems that Facebook users were not shamed by Zuckerberg's 'scarlet letter' but actually wore it as a badge of honor and shared those articles even more than they otherwise would have...oops.

Facebook is getting rid of its fake news red flags because they were making fabricated media reports appear more believable to its users.

The U-turn was prompted by research suggesting users would actually believe fake news even if it was flagged as incorrect or misleading.

"Academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs – the opposite effect to what we intended," Facebook product manager Tessa Lyons wrote in a blog post.

As a result, Facebook is ditching the "fake news flag" and instead force-feeding users articles that more closely align with Zuckerberg's political beliefs before allowing them to share articles that have been deemed 'inconvenient' for one reason or another.

Facebook conducted research which suggested that false news stories with "related articles" next to them were shared fewer times than those highlighted with a red flag.

"False news undermines the unique value that Facebook offers: the ability for you to connect with family and friends in meaningful ways. It’s why we’re investing in better technology and more people to help prevent the spread of misinformation," Lyons wrote.

"Overall, we’re making progress. Demoting false news (as identified by fact-checkers) is one of our best weapons because demoted articles typically lose 80 percent of their traffic. This destroys the economic incentives spammers and troll farms have to generate these articles in the first place."

Facebook's latest attempt to censor its users will look something like this:

Of course, the "fake news" task force inside Facebook better come up with an effective censorship tool soon or they're going to miss an opportunity to run an effective campaign for Democrats in the mid-term elections.