Facebook released a detailed infographic Tuesday that takes users through the entire process of what happens when content or users are reported to the social network.

In its blog post describing the infographic, Facebook stressed that dedicated teams handle such reports “24 hours a day, seven days a week,” pointing to its offices around the world and explaining that its user operations department is divided into four specific teams:

Safety

Hate and harassment

Access

Abusive content

Facebook wrote in its note:

At Facebook, we maintain a robust infrastructure that empowers our more than 900 million-person community to help us enforce our policies by using the report links found throughout the site. While it is very unlikely that you will have any problems with content on the site, it can be confusing and hard to know what happens once you do decide to click “report.” Today, we are excited to publish a resource that will give the people who use Facebook more insight into our reporting process.

In order to effectively review reports, user operations is separated into four specific teams that review certain report types — the safety team, the hate and harassment team, the access team, and the abusive content team. When a person reports a piece of content, depending on the reason for their report, it will go to one of these teams. For example, if you are reporting content that you believe contains graphic violence, the safety team will review and assess the report. And don’t forget, we recently launched our support dashboard, which will allow you to keep track of some of these reports.

If one of these teams determines that a reported piece of content violates our policies or our statement of rights and responsibilities, we will remove it and warn the person who posted it. In addition, we may also revoke a user’s ability to share particular types of content or use certain features, disable a user’s account, or, if need be, refer issues to law enforcement. We also have special teams just to handle user appeals for the instances when we may have made a mistake.

It is not only our user operations team that provides support to the people who use our service, but also our engineers who build tools and flows to help you deal with common problems and help you get back into your account faster. On some rare occasions, users may lose access to their accounts by forgetting their password, losing access to their email, or by having their account compromised. We have built extensive online checkpoints to help these people regain access. By using online checkpoints, we can authenticate your identity both securely and quickly. This means there’s no need to wait to exchange emails with a Facebook representative before you can restore access to your account or receive a new password. Be sure to visit www.facebook.com/hacked if you believe your account has been compromised or use the report links to let us know about an impostor timeline.

The safety and security of the people who use our site is of paramount importance to everyone here at Facebook. We are all working tirelessly at iterating our reporting system to provide the best possible support for the people who use our site. While the complexity of our system may be bewildering, we hope that this note and infographic have increased your understanding of our processes. And even though we hope you don’t ever need to report content on Facebook, you will now know exactly what happens to that report and how it is routed.
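The routing Facebook describes — the stated reason for a report determines which of the four user operations teams reviews it — amounts to a simple lookup. A minimal sketch in Python, where the team names come from the article but the reason-to-team mapping is purely illustrative (Facebook has not published its actual report taxonomy):

```python
# Hypothetical routing table: report reason -> reviewing team.
# Team names are from Facebook's note; the reasons are examples.
REPORT_ROUTING = {
    "graphic_violence": "Safety",          # the article's own example
    "hate_speech": "Hate and Harassment",
    "hacked_account": "Access",
    "impostor_profile": "Access",
    "spam": "Abusive Content",
}


def route_report(reason: str) -> str:
    """Return the team that would review a report with the given reason."""
    try:
        return REPORT_ROUTING[reason]
    except KeyError:
        raise ValueError(f"Unknown report reason: {reason!r}")


print(route_report("graphic_violence"))  # Safety
```

For instance, a report flagged as graphic violence routes to the safety team, matching the example in Facebook's note; an unrecognized reason raises an error rather than silently going unreviewed.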