Last month, Mark Zuckerberg defended Holocaust deniers, stating in an interview that he doesn’t “think that they’re intentionally getting it wrong,” and that, as a result, Facebook shouldn’t take down the dangerously false content. On Wednesday, however, the platform temporarily removed a post from the Anne Frank Centre advocating for Holocaust education.

These numbers are alarming, but this is why we do what we do. Currently only 10 states mandate Holocaust and Genocide Education. How do we counter ignorance about the Holocaust with knowledge, compassion, and understanding? https://t.co/1xtsNLAKEx

The post, which Facebook later restored, included a photo of Jewish children, stripped of their clothes, at a concentration camp. “We don’t allow nude images of children on FB, but we know this is an important image of historical significance and we’ve restored it,” Facebook wrote in response to the Anne Frank Centre’s tweet pointing out the removal. “We’re sorry and thank you for bringing it to our attention.”

The post was published on August 21. The Anne Frank Centre was informed on Monday, nearly a week later, that it had been removed, and it reached out to Facebook immediately after being notified. A spokesperson for the organisation told Gizmodo that they didn’t hear back from Facebook until they publicised the removal on Wednesday.

“We understand the difficulty in assessing the context of potentially controversial content,” Alexandra Devitt, a spokesperson for the Anne Frank Centre, told Gizmodo in an email. “That said, it shouldn’t have taken us publicly calling out Facebook to restore our post. Hopefully, Facebook can revise their protocols.”

Facebook spokesperson Sarah Pollack reiterated the company’s mistake in removing the Anne Frank Centre post in an email to Gizmodo and explained that it will make exceptions to its content policy if an offending post is “newsworthy, significant or important to the public interest.”

“We recognise that the image shared by the Anne Frank Centre is historically significant and important, and was restored on this basis,” Pollack said.

This isn’t the first time Facebook has taken down a historically significant photo after mistaking it for a nude image that violated its terms of service. In September 2016, the social network removed two posts featuring the Pulitzer-winning photograph of Vietnamese children fleeing a South Vietnamese napalm attack, among them a severely burned young girl. Facebook reinstated the photo after the editor-in-chief of the Norwegian newspaper that posted the image wrote an open letter to Zuckerberg. And after some bad press.

That was two years ago, and based on the incident with the Anne Frank Centre this week, Facebook has yet to implement a system that doesn’t mistakenly flag profoundly important images. But Facebook says that’s about to change.

Pollack explained that Facebook plans to soon roll out a system that allows users to appeal post removals.

“How it works is that if your photo, video or post has been removed because we found that it violates our Community Standards, you will be notified, and given the option to request additional review,” Pollack said. “This will lead to a review by our team (always by a person), typically within 24 hours. If we’ve made a mistake, we will notify you, and your post, photo or video will be restored.”

For Devitt, the issue is not simply about what is removed but what is permitted to find an audience on Facebook.

“While Facebook removes the AFC’s post promoting the need to educate on the past, it continues to allow pages and posts that directly deny the reality of the deaths of more than six million people,” Devitt said. “Holocaust denial dehumanises people. It makes thousands feel unsafe. It violates the very standards Facebook lays out for its users. Yet these hate-filled propaganda pages remain.”

She added that the Anne Frank Centre has reached out to Facebook in the past “offering to work with them to tackle the spread of Holocaust denial and hate on its platform and to promote education.”

It remains to be seen whether Facebook will take these types of threats seriously. It did announce last month that it would take down misinformation that stokes real-world violence, a policy update largely attributed to the attacks on ethnic minority populations in Sri Lanka, Myanmar, and India, the New York Times reported. The violence was linked to Facebook’s mishandling of misinformation and hate speech in the regions.

The company also announced in June that it would lean on machine learning to stop the spread of fake news and misinformation. It’s unclear whether a human or an algorithm flagged the Anne Frank Centre’s post for removal, but as we’ve seen in the past, machines are still pretty bad at understanding the nuances of human language and the historical context of images, and they are not free from bias. And human moderators themselves have a thorny past when it comes to making the right call.

“If Facebook is serious about its community standards,” Devitt said, “it should start tackling Holocaust denial and not the organisations who are trying to educate people on discrimination, facts, and history.” [TechCrunch]