Facebook Clarifies Community Standards Stance On Nudity, But Will It Really Stop The Unnecessary Censorship Of Women's Bodies?

If Facebook's policies on nudity have seemed as unfathomable recently to you as they have to me, here's an item of interest: The Community Standards now include detailed information about what constitutes nudity on Facebook. Initially I had hopes that these clarifications would help prevent the unnecessary censorship of women's bodies that has become so common on social media… but after I spent some time with them, I'm not so sure they will. Call me a pessimist if you want, but here's why.

Pretty much every form of social media out there has gotten into hot water over their rules about nudity in the past — but not because of either the widespread allowance of nekkid bodies or the widespread banning of them. The issue has almost always been about the lack of consistency in terms of when those rules are upheld: Bikini shots of celebrities and other conventionally beautiful people are allowed to stay, while those of other people who don't conform to that type are taken down; unshaven bikini lines are cause for removal, even though no genitals are being shown; the 4th Trimester Bodies project has been removed from numerous social media platforms time and time again, even though the photos are in no way explicit; and so on and so forth.

This, then, is ostensibly what the clarifications of Facebook's Community Standards are meant to address. “Today we are providing more detail and clarity on what is and is not allowed,” reads the news brief. “For example, what do we mean by nudity, or what do we mean by hate speech?” As the post goes on to note, the rules aren't changing; they are, however, being expanded upon to clear up any confusion about the matter.

What isn't allowed:

Genitals and butts — specifically, “photographs of people displaying genitals or focusing in on fully exposed buttocks.”

People having sex. “Explicit images of sexual intercourse” are major no-nos, as are “some verbal descriptions of sexual acts that go into vivid detail.”

Breasts with visible nipples. The nipple has not been freed yet, apparently.

Content that “threatens or promotes sexual violence or exploitation.” According to the New York Times, this includes revenge porn.

What is allowed:

Breastfeeding photos.

Photos of breasts with post-mastectomy scarring.

Photographs, paintings, and other forms of art depicting nude figures.

So, by these standards, the 4th Trimester Bodies project should be able to remain on Facebook without threat of removal; nipples, genitals, and exposed buttocks are never visible in the images. Breastfeeding photos should no longer face any opposition, nor should art projects in the name of causes like body positivity be under fire anymore. It sounds simple, doesn't it? And, moreover, it sounds like a victory, right?

But not so fast — it's still a slippery topic. Take, for example, the photo of the Gustave Courbet masterpiece The Origin of the World that a French teacher posted recently. It's art, right? Since photographs, paintings, and other forms of art depicting nude figures are allowed, it should be A-OK, correct? But wait — hang on. It also features a prominent vagina, which falls under the “no genitals” rule. So which guideline trumps the other? In this case, it was clearly the “no genitals” one, although it's not clear why.

Or what about an experience The Freaking Feminist had at the end of February? The Facebook page posted images from Leonard Nimoy's Full Body Project as a tribute to the late actor on February 28. The project, which Nimoy published as a book in 2007, celebrated the female form and challenged unrealistic standards of beauty. It's amazing, and again, because it's an art project featuring nude figures, it should be fine, right? Nope — it, too, was removed from The Freaking Feminist's Facebook page. It might have had something to do with nipples being visible in some of the photos, but again, we don't know why one rule — the one in favor of censoring the images — was chosen as enforceable over the other.

It's worth noting that each instance will still be decided upon by an actual person, not an algorithm. Monika Bickert, head of global policy management, told the New York Times that there are review teams working at all hours of the day across the globe, with each report — because, as always, whether or not Facebook reviews something depends on whether or not it's been reported by another user — being examined by one of them before a decision is handed down. In some ways, that's a good thing; in theory, it allows for more leniency. At the same time, though, it likely also makes the inconsistencies even more pronounced: Maybe, for example, both the Courbet painting and the Full Body Project bans wouldn't have occurred had a different team member reviewed the cases. I don't think an algorithm is the answer either, though (robots can only do what we tell them to, and in these instances, the human touch is important), so there may not be a good solution to this problem — we may be looking at a case of it being as good as it gets.

The bottom line is that even though the Community Standards have been somewhat clarified, there's still a lot of gray area — and as a result, I'm not totally sure it's going to prevent the sorts of things that brought up the need for clarification in the first place. But maybe, as I mentioned before, I'm just being pessimistic. Even though I think the nudity rules remain somewhat problematic, what I do think will be useful are the clarifications regarding bullying, harassment, and hate speech — so at least there's that. The question of how we keep the Internet a safe place for all users without edging into dictatorial territory is an ongoing conversation, so the more we talk about it, hopefully the better we'll get at figuring it out.