UK government’s guidelines for social sites

The Home Office's guidance is a significant step forward in online-child-safety consensus-building. By Anne Collier
April 2008

As I mentioned last week, two milestone documents out of the UK have just been released: one a 200-page report requested by Prime Minister Gordon Brown and called "The Byron Review," the other a set of guidelines for social-networking-service best practices issued by the Home Office itself. Both have worldwide relevance, not just because they're about a worldwide medium that's universally popular with youth but also because that medium allows for ever-increasing interaction, social action, and collaborative media-producing and -sharing on an international level. I took a look at the Byron report last week. This week:

UK Home Office's guidance for social-networking sites

The guidelines are surprisingly digestible for a document coming from a government. The actual "Recommendations for Good Practice" are only about eight pages long (see p. 24), and they also come in convenient checklist form (p. 60). The whole report can be downloaded here.

1. Positives

Congrats in order. Everyone involved in these guidelines should be congratulated for the milestone the document represents. Consensus-building on this subject among commercial services, government agencies, child-online-safety advocates, and law enforcement in a medium still so little understood – the social Web – has proven difficult in my own country.

Based on solid research. For a practical understanding of a teen's-eye view of Net use, don't miss "Children's Use of the Internet," p. 14, based on the research of Sonia Livingstone and colleagues (she is a social psychology professor at the London School of Economics & Political Science). Showing the difficulty of reaching child-online-safety consensus, she writes that "views on young people's development are often polarised." On the one hand, "children are seen as vulnerable, undergoing a crucial but fragile process of cognitive and social development to which technology poses a risk by introducing potential harms into the social conditions for development and necessitating, in turn, a protectionist regulatory environment." The other view holds that "children are competent and creative agents in their own right, whose ‘media-savvy’ skills tend to be underestimated by the adults around them, with the consequence that society may fail to provide a sufficiently rich environment for them." I agree with her that "finding a position that recognises both characteristics is important." [See also "Children and the Internet," Appendix B, p. 46, which is just over four pages long, and the footnotes and appendix material pointing to valuable work from many researchers.]

Something for everybody. The full document covers a lot of ground for audiences with all degrees of understanding – from defining social networking to considering why it's popular with youth to covering online bullying, self-harm, sexual exploitation, Webcams, and where criminal law comes in.

"Disinhibition" understood. Guideline 9.4 reflects what we know of this online condition that allows "space" between bully and victim as a contributing factor to cyberbullying. It suggests that sites inform users that they are not as anonymous as they may think and employ IP address and identifying technology to track users. I'd go further and recommend that sites explain to users in their online-safety pages, in as much detail as feasible (without giving information away to malicious hackers), how their real-life identities can be found. It's the kind of meaty information that's meaningful to adolescents and shows respect for their intelligence. [To great effect, a school in Philadelphia brought in a computer-forensics police officer to demonstrate the lack of real anonymity to an entire student body.]

Practical. The guidance reflects an understanding that a narrow focus on social networking is impractical, as young people's self-expression and socializing flow freely from offline to online and back, and among multiple devices that can increasingly be used anywhere.

Not just social networking. In spotlighting chatrooms and Webcams as trouble spots, the guidance reflects the understanding that young people's socializing flows freely from device to device and between various technologies – as both technology and kids develop – and social sites aren't the only place where socializing happens for good or bad. For example, this significant finding about Webcams: "Recent research conducted in Holland by the My Child Online Foundation in 2006, involving 10,900 participants between the ages of 13 and 19, reveals that 47% of girls who responded to the survey, said they had received unwanted requests to do something sexual in front of a webcam – although only 2% actually did so."

Adding "teeth." Because teens' profiles usually reflect a major investment of time and emotion on their part, it's important to have consequences for violations of Terms of Service, so this is good: "Provide warnings to users about uploading photos to their profile, for example: ‘Photos may not contain nudity, violent or offensive material, or copyrighted images. If you violate these terms, your account may be deleted'" (5.3 on p. 27).

2. Neutrals and negatives

A bit of irony. Based on where young Britons do most of their social networking (MySpace, Bebo, and Facebook), there's a certain irony to the fact that one government's guidelines are aimed largely at a group of companies based in the US. The same may not hold in countries where English isn't the primary language (though California-based Orkut, Hi5, and Friendster are huge in Brazil, Thailand, and the Philippines, respectively), but safety on the social Web clearly has to be an international effort going forward.

Only the beginning. The guidelines are a great base to build on, but they don't indicate an understanding of the full range of abuse in social sites – where it comes from, where it actually occurs, and how hard it is to control. For example, abuse reports can themselves be abuse ("prank" abuse reports that are really harassment of a user by the person "reporting" the abuse), and some content cannot be moderated or pre-viewed by the service provider because it comes from malicious hackers or from third-party sites marketing x-rated content (see "Mother-son digital divide bridged" below). The guidelines need to go further in acknowledging that the users themselves are not the only source of inappropriate content in social-networking sites. Increasingly, third parties are finding ways to socially engineer or hack their messages, images, and software code into users' profiles, blogs, bulletins, and IMs in social-networking sites.

"He said, she said." The term "imposter profile" doesn't come up in the guidance, and this is a huge problem for the social sites, which – if responsible enough to take on the task – have to figure out if a profile is fact or fiction (even basic, non-abusive profiles created by people about themselves have plenty of fiction in them) and if the person behind it is real, fictitious, or malicious. How bad it makes its subject look can be one measure, but that sort of analysis is usually pretty subjective, and chasing down facts is at best time-consuming, if not impossible when the site involves millions of profiles. Even in a court of law, when the accused and the victim are physically present, it's hard to distinguish fact from fiction. Society has not even begun to understand the complexities of coping with online harassment.

Privacy not all good. The guidance seems to accept without qualification the premise that privacy in social-networking sites for children is good. To say it isn't always sounds like heresy, when we constantly hear "don't post personal info online," but it's only mostly good, because privacy tools can also be a barrier to parents', researchers', and law enforcement's efforts to monitor children's activities. Moreover, posting personal information online is a fact of life for teenagers, and research released over a year ago suggested a new approach to this subject (see this article in the Archives of Pediatrics).

More on mobiles needed. Best-practice thinking obviously needs to match the fluidity and mobility of young people's socializing in terms of devices, technologies, and location. Under "GPS and Location Services," the guidance says that mobile "customers are very sensitive about giving away their location. Only those services that carefully respect customers’ rights to protect their privacy will be successful." This is not necessarily true of teenage customers. Given where adolescents are in their brain development (acknowledged on p. 15 of the guidance under "US Perspective" but also treated thoroughly in the Byron Review – see this), special care will need to be given to how minors use GPS technology for socializing with their friends.

In the "back office." The guidance is light on addressing what needs to happen in social-networking sites' customer-service departments after abuse reports come in – response time, how various types of reports are responded to, proportion of customer-service staff devoted to youth protection, what gets elevated to law enforcement, etc. This needs to be looked at more closely going forward.