How Facebook moderates violent and racist posts is in the spotlight following the allegedly racially motivated killing of a young black man and document leaks showing what the social network allows to remain on its platform.

The Guardian revealed details of Facebook’s guidelines to moderators on Sunday, providing insight into how the company moderates graphic content including violence, hate speech, terrorism, pornography and racism.

The guidelines leak came as news emerged that a young black man was allegedly killed by a man who is a member of a white supremacist Facebook group called “Alt-Reich Nation”.

Richard Collins III was killed on the University of Maryland campus on Friday night in what police described as an “unprovoked” attack. The 23-year-old was due to graduate from Bowie State University this week.

Sean Christopher Urbanski, 22, has been charged with first-degree murder over the killing. The University of Maryland’s Police Chief David Mitchell revealed on Sunday that Urbanski was a member of the racist group.

This is 2nd Lt. Richard Wilbur Collins III. Police say he was fatally stabbed by a suspect who belongs to "Alt-Reich" Facebook group #WBAL pic.twitter.com/46pQU42kKn

“When I look at the information that's contained on that website, suffice it to say that it's despicable, it shows extreme bias against women, Latinos, persons of Jewish faith and especially African-Americans,” Mitchell said, as cited by The Baltimore Sun.

The Alt-Reich Nation page appears to have been removed from Facebook following the high-profile incident. However, numerous other white supremacist groups can easily be found on the social network, apparently operating within the company’s guidelines.

The leaked files detail the types of posts and groups that will be removed from the site, as Facebook struggles to strike a balance between removing graphic content and allowing free speech.

Examples of groups allowed to remain on the site because they are not considered “credible threats” of violence are outlined in the files. “I hate it when I wake up and Sarah Palin is still alive” and “lets nuke the middle east” fell into this category.

Among other things, the files show that the social network allows users to livestream acts of self-harm because it “doesn’t want to censor or punish people in distress”. It also allows the sharing of non-sexual child abuse, and some depictions of animal cruelty.

RT.com has reached out to Facebook for comment regarding the leaks, but received no response at the time of writing.

Facebook moderators interviewed by The Guardian said the moderation policies are "inconsistent" and "peculiar." They also said they were "overwhelmed" by the work and only had seconds to make decisions on each post.

Earlier this month, Facebook CEO Mark Zuckerberg announced the company would hire 3,000 additional people to help monitor live videos and remove inappropriate content. Zuckerberg said the new staff will help Facebook “get better” at removing content like “hate speech and child exploitation.”