Facebook censorship: The need to have the moral high ground

Mainstream media loves a witch hunt, especially when the target is world famous. On Sunday, The Guardian published an article detailing the guidelines Facebook uses to deal with inappropriate content, and the piece inspired a series of posts explaining how the company is failing to protect its users from nudity, violence, revenge porn, and more.

The thing is, few of the outlets mention how difficult it is to monitor what two billion users upload every day, and none of them holds users responsible for the content they share.

For the detractors, the concept is simple. Facebook has grown so much that it is no longer a regular tech or social media company. Instead, it is a new kind of “super influential organization” that has to protect every single one of its users from content that might offend them or make them feel uncomfortable.

Facebook Live sparks most of the controversy because people have used the service to live stream suicides, sexual content, and crimes, among other things.

So, does Facebook protect its users from distasteful media?

Yes, it does. The tools go by the name of “Basic Privacy Settings & Tools,” and the platform regularly reminds users, through “annoying” notifications, to learn how to use these features to “enhance user experience.”

That last term refers to how people will consume the service. Facebook allows everybody to put an invisible filter between themselves and the world. As a result, if you see something on your Newsfeed, it’s because you told the platform you are OK with it, or because you have not personalized your privacy settings. In both cases, it’s your fault.

Now, the filters are not perfect, and sometimes unpleasant content will appear on the Newsfeed. Facebook engineers know this too, which is why people can click on the little arrow in the top-right corner of every publication and report it, hide it, and so on.

Sure, that is far from enough to keep “everybody safe,” but when I created my account, I didn’t see a clause where Facebook swore to protect me from the ugly side of the Internet, nor was I expecting one.

Why are some outlets criticizing Facebook’s content guidelines?

Some writers have said the guidelines the moderators use to judge publications on the social media platform are not strict enough. In fact, The Guardian included its own guidelines to give readers a contrast and gain the moral high ground, even though the comparison makes no sense. They are both private organizations with different objectives in mind, and as long as they are not breaking the law, they can create the environment they want regardless of what others are doing.

To be fair, the article mentions things that are worthy of examination. For example, according to the piece, Facebook has a “credible violence” category, meaning moderators sometimes find posts with threats on the platform but don’t take them down. The guidelines hold that people tend to use vulgar and aggressive language out of frustration but are unlikely to turn their words into action.

The manuals also explain that videos of people hurting themselves are not taken down immediately. The reasoning behind this is that somebody could help the people in the videos, but once it is too late to assist them, moderators will remove the entry.

A similar thing happens with “non-sexual sexual abuse or bullying of children.” The guidelines reason that the publication could lead to a rescue or prompt someone to address the situation. However, if the entry celebrates or encourages the behavior in any way, the staff will remove it on the spot.

Also, things like footage of accidents or animal cruelty might receive the reviewers’ approval. The manual explains that such content might raise awareness of a particular issue, like drinking and driving.

There are around 100 books of guidelines, including instructions on how to handle posts that portray cannibalism, which should give an idea of how difficult it is to deal with humanity’s finest exemplars. With that in mind, people are entitled to be offended by how the company deals with the issue and to stop using the service.

The problem is that even its detractors don’t want to delete their Facebook accounts. Instead, they want a private organization to perfectly regulate the content hundreds of millions of people upload to the platform (supposing not all two billion accounts are active) in real time, which sounds unrealistic. But that takes us to the next and most important question.

Is Facebook responsible for what the users publish?

No, it isn’t. Facebook gives its users a platform to “share themselves,” and many people will argue the company is responsible for the results. However, I have had my account for years, and I have never seen a “be disgusting and show it to everyone” campaign promoted by the Facebook team.

The organization neither encourages nor rewards violent, sexual, or otherwise inappropriate content. Users do. Moreover, some individuals have learned how to use the platform to harm others or commit crimes.

The most common form of attack is “revenge porn,” and it’s a real issue that can destroy someone’s life. Moderators do their best to deal with the problem, but the sheer number of entries overwhelms them. As a result, bad content can stay on the platform for a long time before someone takes it down.

The average user doesn’t care

Popularity is the most valuable asset in the digital world after huge amounts of consumer data, and there is an audience for everything on the Internet, including nudity and violence.

Political and public figures have criticized Facebook for failing to deal with bad content on the platform. However, Facebook is by far the most popular social media platform in the world, which means people don’t really care about unsettling posts, or about being the product sold to advertisers, as long as the service is free and easy to use.

Whether any of us like it or not, Facebook is providing the world with a lot of valuable information on human nature. Scientists have been using data derived from the app to carry out relevant studies, and the content that is causing so much controversy can help law enforcement spot dangerous individuals quickly.

Bottom line: Facebook is here to stay. So the mainstream media and public figures should stop trying to capitalize on sensationalism. Instead, they could adapt and use such a powerful tool to educate and improve society.