Social media companies claim the best response to hate speech is more speech. Counter speech is an important tool in combating all forms of online hate, but it comes with some unavoidable limitations. Beyond these limitations, we are now seeing a systemic failure at Facebook which is resulting in counter speech being censored and the accounts of those advocating against hate being suspended. In the last few weeks we have seen three examples of wrongful censorship and account suspension involving Australia.

The first case involved anti-fascist campaigner Andy Flemming and his page “slackbastard”, which was reported en masse through an organised effort by members of the far-right group Reclaim Australia. The page was deleted by Facebook, and Andy’s account suspended. The warnings Andy received related to nudity, although there was no nudity on the page. Facebook ultimately corrected this mistake.

The second case involved the wrongful censorship of content on the anti-racist “Reclaim What” page and the 30-day suspension of its administrator. One of the items removed by Facebook was a video posted by the page, taken down on the basis that it violated the community standards on nudity. We are still waiting for Facebook to explain how a video which was in reality only an audio track, with no visuals, managed to contain nudity. The final straw for this page appears to have been its cover image, shown below, which included a shattered swastika and was apparently reported as hate speech by neo-Nazis. The page is still down.

The third case involved Clementine Ford, a freelance writer, broadcaster and public speaker. Her Facebook account has been suspended for 30 days and her page is under threat of being closed.

This action by Facebook comes in response to Ford’s publication on her page of abusive comments and messages sent to her on Facebook. She is currently being flooded with abuse following an impassioned Facebook post about women’s rights which has gone viral: over the last 48 hours it has been liked by 203,697 people, shared by 40,327 people, and attracted 20,482 comments. Some of the reports against her have also been for nudity, on posts which contained only pictures of text.

The attached image:

Other examples of abuse:

Clementine Ford’s post referred to Sunrise on Channel 7 and their feature on the publication, on an American-run website, of hundreds of private explicit photographs of women from South Australia. The Sunrise account had shared the story with the comment: “What’s it going to take for women to get the message about taking and sending nude photos?” Clementine Ford’s post, stressing that what should be condemned is the sharing of private pictures without consent, was part of a wider public outrage.

Ford’s message that the Sunrise post was “saying it’s the responsibility of victims of crime and assault to prevent it and not the responsibility of society to make such crimes intolerable and unacceptable” hit a nerve. She highlighted women’s right to control over their own bodies, and over who gets to see or touch them. The message went far beyond the question of privacy to the root issue of violence against women. The message, and the inclusion of a topless photograph cropped to ensure it complied with Facebook’s community standards, made the post both powerful and controversial, helping it to spread rapidly. The Sunrise post, meanwhile, was revised to say “A stern warning for people who share risqué photos online”.

Clementine Ford’s post is the sort of counter speech which Facebook should be protecting and promoting. The removal of content for nudity in each of the three cases, including where no reasonable human reviewer could have reached that decision, suggests Facebook is removing content reported for nudity without any manual review. This opens the system up to gaming and abuse.

It’s unclear what Facebook’s position is on the people posting screenshots of abusive content, where they are the victim, in order to highlight both the abuse and the abuser. From a statement by Facebook to OHPI in 2013 we know Facebook permits people to link to abusive content for the purpose of urging people to report it. The statement explained, “our policy certainly does allow people to post links to hate speech in order to condemn it and encourage others to file reports. We would not take any action to penalise a user who posts such a link in these circumstances”. It’s possible that a similar position would apply to images of abuse posted by a victim, but we call on Facebook to clarify this.

In an ideal world OHPI’s recommendation would be to report the posts rather than exposing them publicly. Our own approach has been to hide the identity of abusers in public content, while including it in confidential reports we share with social media platforms and police. When someone is under a barrage of abuse from a vast number of people (known as online griefing), however, it is unreasonable to expect them to go to the same lengths to protect those abusing them. Calling out the abusers in public is another form of counter speech, and a relatively effective deterrent against further abuse.

We call on Facebook to:

Immediately reinstate the page of Reclaim What and unban its administrator

Immediately reverse the ban on Clementine Ford’s personal account, and reset the count of complaints against her page so it is not removed

Review the processing of complaints reporting nudity, ensuring there is a human review

Clarify the position on the posting, by a victim of abuse, of screenshots showing the abuse and the Facebook name of the abuser

Take action

As the action required here needs to be taken by Facebook, you can best help by sharing this briefing. We will also be drawing the briefing, and our recommendations, to Facebook’s attention.