Though Facebook has a comparatively small in-house moderating team, most moderators work for subcontractors. There are moderating hubs around the world, but Facebook refuses to disclose their exact number or locations.

Moderators get two weeks’ training plus prescriptive manuals devised by Facebook executives based at the company headquarters in Menlo Park, California.

The manuals show the breadth of the issues being dealt with by Facebook – from graphic violence to terrorism and cannibalism. If Facebook users are talking about a controversial issue, the company has to have a policy on it to guide moderators.

Facebook has automatic systems for rooting out extreme content before it hits the site, particularly on child sexual abuse and terrorism, but its moderators do not get involved in this proactive work.

A slide from Facebook’s guidance for moderators on threats of violence. Photograph: Guardian

Instead, they review millions of reports flagged to them by Facebook users and use the manuals to decide whether to ignore, “escalate” or delete what they see. When they escalate a report, it usually means it is sent to a more senior manager to decide what to do.

The Guardian's moderation policy

The Guardian's moderation approach is bound by guidelines, which we have published here, and our moderators are all directly employed by the Guardian and work within our editorial team. The moderation team regularly receives training on issues such as race, gender and religion, and applies that training in service of those public guidelines. When making decisions, our moderators consider the community standards, the wider context and purpose of discussions, and their relationship to the article on which they appear. We post-moderate most discussions, and rely on a mixture of targeted reading, community reports and tools to identify comments that go against our standards. We have an appeals process, and anyone wanting to discuss specific moderation decisions can email moderation@theguardian.com. When requested, reasons for removal may be shared with those affected by the decision.

All discussions on the Guardian site relate to articles we have published; this means we have specific responsibilities as a publisher, and also that we aim to take responsibility for the conversations we host. We make decisions about where to open and close comments based on topic, reader interest, resources and other factors.

For comments that seem cruel or insensitive, moderators can recommend a “cruelty checkpoint”; this involves a message being sent to the person who posted it asking them to consider taking it down.

If the user continues to post hurtful material, the account can be temporarily closed.

The files also show Facebook has developed a law enforcement response team, which deals with requests for help from police and security agencies.

The company has designed a special page to help moderators, called the single review tool (SRT). On the right-hand side of the SRT screen, which all moderators have, there is a menu of options to help them filter content into silos.

While this has sped up the process of moderation, the Guardian has been told moderators often feel overwhelmed by the number of posts they have to review – and they make mistakes, particularly in the complicated area of permissible sexual content.

Part of Facebook’s guidance for moderators on sexual activity. Photograph: Guardian

The manuals seen by the Guardian are occasionally updated, with new versions sent to moderators. But small changes in policy are dealt with by a number of subject matter experts (SMEs), whose job is to tell moderators when Facebook has decided to tweak a rule. The SMEs also oversee the work of moderators, who have to undergo regular performance reviews.

The Guardian has been told this adds to the stress of the job and has contributed to the high turnover of moderators, who say they suffer from anxiety and post-traumatic stress.

Facebook acknowledged the difficulties faced by its staff and said moderators “have a challenging and difficult job. A lot of the content is upsetting. We want to make sure the reviewers are able to gain enough confidence to make the right decision, but also have the mental and emotional resources to stay healthy. This is another big challenge for us.”