Is Facebook Biased In Deleting Racial And Religious Posts?

Facebook removed a post by Black Lives Matter activist Didi Delgado and disabled her account for seven days in May, but it did nothing when a U.S. congressman wrote a post calling for the slaughter of “radicalized” Muslims after the terrorist attack in London.

ProPublica recently reviewed multiple internal documents and concluded that the company’s hate-speech rules tend to favor elites and governments over grassroots activists and racial minorities. Or judge for yourself: compare the post by the white congressman with the one by the Black Lives Matter leader.

“Hunt them, identify them, and kill them all,” U.S. Rep. Clay Higgins, a Louisiana Republican, said in a recent post. “For the sake of all that is good and righteous. Kill them all.”
Now look at the post by Black Lives Matter activist Didi Delgado. “All white people are racist. Start from this reference point, or you’ve already failed.”

Explaining the distinction between the two posts, ProPublica noted in a June 28 story: “Higgins’ incitement to violence passed muster because it targeted a specific sub-group of Muslims – those that are ‘radicalized’ – while Delgado’s post was deleted for attacking whites in general.”

Facebook’s content reviewers have deleted posts by activists and journalists in disputed territories such as Palestine, Kashmir, Crimea and Western Sahara.

One document tells content reviewers how to apply the company’s global hate-speech algorithm. A slide identifies three groups – Black children, female drivers, and white men – and asks: Which group is protected from hate speech? The correct answer: white men.

When a post is removed, users are typically not told what rule they broke, and they cannot appeal Facebook’s decision. Appeals are currently available only to people whose profile, group or page has been removed.

Facebook launched in 2004; its censorship rulebook has grown from a single page in 2008 to 15,000 words by 2013.

Facebook users who stridently criticize racism and police killings of racial minorities say that their posts are often taken down.

Two years ago, Stacey Patton, a journalism professor at historically Black Morgan State University in Baltimore, posed a provocative question on her Facebook page. She asked why “it’s not a crime when White freelance vigilantes and agents of ‘the state’ are serial killers of unarmed Black people, but when Black people kill each other then we are ‘animals’ or ‘criminals.’”

Although the post doesn’t appear to violate Facebook’s policies against hate speech, it was immediately removed, and her account was disabled for three days. Facebook didn’t tell her why. “My posts get deleted about once a month,” said Patton, who often writes about racial issues. She said she also is frequently put in Facebook “jail” – locked out of her account for a period of time after a posting that breaks the rules.

“It’s such emotional violence,” Patton said. “Particularly as a Black person, we’re always having these discussions about mass incarceration, and then here’s this fiber-optic space where you can express yourself. Then you say something that some anonymous person doesn’t like and then you’re in ‘jail.’”

Meanwhile, Delgado, who works with the Black Lives Matter movement, said she has been banned from Facebook so often that she has set up an account on another service called Patreon, where she posts content Facebook deleted. In May, she wrote an article for Medium titled “Mark Zuckerberg Hates Black People.”

Curiously, Facebook didn’t delete Donald Trump’s comments on Muslims. Days after the Paris attacks, Trump, then running for president, posted on Facebook “calling for a total and complete shutdown of Muslims entering the United States until our country’s representatives can figure out what is going on.”