Facebook faces plenty of criticism and questions, and while much of this focuses on privacy and security, there is also widespread uncertainty about what is permitted on the platform. To help users understand what is and isn't allowed, the company has published its Community Standards for everyone to read.

As well as making clear exactly what sort of content is likely to attract the attention of moderators, Facebook is also introducing a new appeals process, giving people the ability to push back if their content is removed.

By revealing the rules governing what content is permitted, Facebook hopes to show that it is living up to its promise of increased transparency. The social networking giant's guidelines cover everything from violence and bullying to privacy and copyright. While the guidelines themselves are not necessarily new, this is the first time they have been made public, giving users the opportunity to scrutinize how Facebook polices content. The company explains:

One of the questions we're asked most often is how we decide what's allowed on Facebook. These decisions are among the most important we make because they're central to ensuring that Facebook is both a safe place and a place to freely discuss different points of view. For years, we've had Community Standards that explain what stays up and what comes down. Today we're going one step further and publishing the internal guidelines we use to enforce those standards. And for the first time we're giving you the right to appeal our decisions on individual posts so you can ask for a second opinion when you think we've made a mistake.

The introduction of an appeals process is an important change of tack for Facebook, and the company is likely to find itself inundated with appeals from people who feel they have been unjustly censored. With a team of just 7,500 content reviewers, Facebook has quite a task on its hands. To start with, appeals will be limited:

We know we need to do more. That's why, over the coming year, we are going to build out the ability for people to appeal our decisions. As a first step, we are launching appeals for posts that were removed for nudity/sexual activity, hate speech or graphic violence.

Here's how it works:

If your photo, video or post has been removed because it violates our Community Standards, you will be notified and given the option to request additional review.

This will lead to a review by our team (always by a person), typically within 24 hours.

If we've made a mistake, we will notify you, and your post, photo or video will be restored.

We are working to extend this process further, by supporting more violation types, giving people the opportunity to provide more context that could help us make the right decision, and making appeals available not just for content that was taken down, but also for content that was reported and left up. We believe giving people a voice in the process is another essential component of building a fair system.

As for the community standards themselves, Facebook has kept things fairly short, limiting the guidelines to under 30 pages. The company says:

We recognize how important it is for Facebook to be a place where people feel empowered to communicate, and we take our role in keeping abuse off our service seriously. That's why we have developed a set of Community Standards that outline what is and is not allowed on Facebook. Our Standards apply around the world to all types of content. They're designed to be comprehensive: content that might not be considered hate speech may still be removed for breaching our Bullying Policies.

The goal of our Community Standards is to encourage expression and create a safe community. We base our policies on input from our community and from experts in fields such as technology and public safety.