Did you know that Facebook has a crack team of employees whose mission is to deal with offensive content and user complaints? Their ranks number in the hundreds. But while most websites have people on staff to deal with porn and violence, none of them have 350 million users to manage... The world's largest social network has now found a way to deal with this shortage of manpower, though: Facebook has begun testing a new feature called the Facebook Community Council [currently invite-only]. According to a guest post on the Boing Boing blog by one of the council's members, its goal is to purge Facebook of nudity, drugs, violence, and spam.

The Facebook Community Council is actually a Facebook app and tool for evaluating content for various offenses... The app's tagging system allows council members to tag content with one of eight phrases: Spam, Acceptable, Not English, Skip, Nudity, Drugs, Attacking, and Violence. If enough council members tag a piece of content with the same tag, action is taken, often a takedown.
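The mechanics described above — several council members independently tag an item, and action follows once enough of them agree on the same tag — amount to simple threshold voting. Here is a rough, hypothetical sketch of that logic (the eight tag names come from the article; the threshold value, function names, and the choice of which tags trigger action are my own illustrative assumptions, not Facebook's actual implementation):

```python
from collections import Counter

# The eight tags reported in the article.
TAGS = {"Spam", "Acceptable", "Not English", "Skip",
        "Nudity", "Drugs", "Attacking", "Violence"}
ACTION_THRESHOLD = 3  # hypothetical: agreeing votes needed before action

def evaluate(votes):
    """Given a list of (member, tag) votes for one piece of content,
    return the tag to act on, or None if there is no consensus yet."""
    counts = Counter(tag for _, tag in votes if tag in TAGS)
    for tag, n in counts.most_common():
        # "Acceptable" and "Skip" express no objection, so they never
        # trigger a takedown in this sketch.
        if n >= ACTION_THRESHOLD and tag not in ("Acceptable", "Skip"):
            return tag
    return None

votes = [("a", "Spam"), ("b", "Spam"), ("c", "Nudity"), ("d", "Spam")]
print(evaluate(votes))  # -> Spam: three members agree, so action is taken
```

The appeal of a scheme like this is that no single reviewer's judgment decides anything; a takedown requires independent agreement, which dampens both mistakes and abuse of the tagging tool.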

What Facebook is doing here is nothing all that new. Many other social networking sites and platforms, such as MySpace and Ning, do much the same, as do video hosting sites like YouTube. [See my summary of YouTube's efforts below]**

No doubt, some will be quick to decry moves by social networking sites, video hosting sites, and others to flag and remove objectionable content within their communities as "private censorship," but such critics need to understand that:

Big communities require interest-balancing: Online communities like Facebook, MySpace, YouTube, etc., are broad-based communities with diverse interests and sensibilities. Some forms of community policing are, therefore, necessary to achieve a reasonable balance among those interests. You are always free to "move" elsewhere if you don't like the standards set by a particular online community. The Internet is a big place; there's a community out there for every taste and interest!

Private community policing beats public censorship: If larger, more popular online communities fail to take steps to establish private community standards, policymakers will suggest that government should do it for them. Better that the various private online communities police themselves by "flagging & tagging" objectionable content than to have five unelected bureaucrats at the FCC (or FTC) regulating online speech for us. As pointed out above, you can always escape private online communities. By contrast, you cannot escape blanket, one-size-fits-all federal censorship efforts.

_________________________________

** In late 2008, YouTube created a new "Abuse and Safety Center" to make it easier for users to report abusive behavior or inappropriate content. The site also makes it easy for users to find helpful information from various expert organizations that deal with troubling behavior.

For example, if a YouTube user reports "hateful content," they are directed to tips from the Anti-Defamation League. Similarly, information from the National Suicide Prevention Lifeline is provided to those who report suicide concerns, and the National Center for Missing & Exploited Children provides information and links about sexual abuse of minors. YouTube also has strict "community guidelines" governing appropriate behavior on the site.

Finally, in May 2009, YouTube announced a new "Filter W*rds" program that lets users block profanity and racial slurs. According to the site: "Users can opt into this by clicking on 'Options' next to the Comments header and checking the 'Filter W*rds' box. Users can also choose to hide comments altogether by clicking on 'Hide Comments.'" Those user preferences will then be saved by the browser. According to YouTube, the site uses "a combination of feedback from users, proprietary technology, and a commonsense collection of words in English to decide what to filter." Incidentally, there's also a free Firefox extension called "YouTube Comment Snob" that filters out undesirable comments from YouTube comment threads.
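For the curious, the kind of opt-in masking the "Filter W*rds" feature performs — as its name suggests, replacing most of a blocked word with asterisks — can be sketched in a few lines. This is purely illustrative: YouTube's actual word list is proprietary, and the placeholder blocklist, function name, and masking style below are my own assumptions:

```python
import re

# Hypothetical blocklist stand-ins; the real list is not public.
BLOCKED = ["darn", "heck"]

def filter_words(comment, enabled=True):
    """Mask blocked words the way a 'Filter W*rds' option might:
    keep the first letter and replace the rest with asterisks."""
    if not enabled:  # user has not opted in, comment passes through
        return comment
    for word in BLOCKED:
        pattern = re.compile(re.escape(word), re.IGNORECASE)
        comment = pattern.sub(
            lambda m: m.group(0)[0] + "*" * (len(m.group(0)) - 1),
            comment)
    return comment

print(filter_words("What the heck is this darn thing?"))
# -> What the h*** is this d*** thing?
```

Note that this runs entirely on the viewer's side and only changes what that viewer sees — which matches the opt-in, browser-saved-preference design YouTube describes, and is a lighter touch than deleting the comments outright.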