Facebook and Fluoride: Social Media Are Not Exempt From Social Responsibility

If I, as an individual, have the opportunity to prevent someone, particularly a child, from coming to harm, I should do so. If society as a whole has the opportunity to prevent this, it should. And if a company has the same opportunity, it should too.

For example, some years ago in the UK, the government agreed that fluoride should be added to mains water in areas where it did not occur naturally. The decision followed findings of higher incidences of tooth decay in areas lacking fluoride and few, if any, negative effects in areas where it was present. So for the good of society, fluoride was added to the water, despite concerns from some about government interference.

Another dispute over individual freedom and what is best for society has also recently made the headlines, with Facebook announcing its decision to allow everyone who uses the site - not just those over the age of 18 - to share what they post with everyone, not just friends. This means that even the youngest people allowed on Facebook - 13-year-olds - can post their photos, status updates, personal information and contact details for all the world to see if they choose (the default setting remains 'Friends Only').

There's a world of difference between what an adult and a child might consider appropriate to share beyond their circle of friends. In addition, someone at the tender age of 13 might not perceive dangers in the same way, or understand who might be looking at their content and for what purpose.

Coincidentally, in the same week, it came to light that Facebook was once again allowing people to post videos of decapitation, notably one of a woman being beheaded by a drug cartel. A raft of public criticism ensued, with even the Prime Minister, David Cameron, weighing in on Twitter to call Facebook "irresponsible". Facebook first softened its line, covering the video with a warning screen and declaring that only videos condemning the violence depicted could be published. Within days, however, it reversed its decision altogether, removing the contentious video. This was a good, albeit belated, decision.

But Facebook must do more; it should not be left to parents alone to police social networks in order to protect their children. Facebook must protect children not only from images on the net but also from the harm that can come from what they themselves put on the net. Until it commits to this responsibility, parents should activate the Facebook controls that block the kinds of information (addresses, phone numbers, etc.) that they don't want their children to share.

Responsibility does not rest solely with parents as individuals or even society as a whole; corporations must recognise their part. Facebook is not immune to this social responsibility and should seriously think about the consequences of allowing children to post publicly. It needs to put the fluoride back in its drinking water.