“As a former Facebook Trending News writer and current actual-human, Facebook has always had an answer to the fake news problem: our team,” wrote Mythili Sampathkumar. “They treated us like garbage because we were all contractors, had zero leadership, and thought we were just pre-cursors for an algorithm.”

Criticism of fake news on Facebook has intensified following this year’s election, with several writers suggesting misinformation spread on the social network played a role in Donald Trump’s victory. CEO Mark Zuckerberg has directly denied that fake news affected the results, describing those stories as “a very small amount of the content” on the site. An analysis by BuzzFeed, however, found that top election stories from fake news sources outperformed stories from legitimate sources in the weeks preceding the election.

Sampathkumar had similarly harsh words for Gizmodo, which she blamed for the “hack job” that led, in part, to Facebook firing the entire trending news team. In May, Gizmodo published a story revealing that the trending section was operated by human curators (not an algorithm, as the company had previously claimed), with one worker claiming that colleagues regularly suppressed conservative news.

“Instead of fact checking and letting the news team full of real reporters do our job, they cowed to right-wing pressure and advertisers,” wrote Sampathkumar. “They believed the hack job, false Gizmodo stories instead of their own reporters and their own algorithm which tracked everything we did. Result was rampant proliferation of fake news. It could have easily been avoided had they treated their human team of writers...as humans.”

On Friday, Zuckerberg announced a new plan for dealing with fake news, but Sampathkumar expressed scepticism that the company would be able to solve the problem with better programming.

“So let’s not act like solution to FB’s news problem is better algorithm,” concluded Sampathkumar. “Some aspects of news judgement just can’t be done by bro-grammers.”