Facebook Junior? The Implications of Expanding Facebook’s Universe to the Under-13 Crowd

According to a recent Wall Street Journal article, Facebook is considering expanding its site to include children under the age of 13 as members. At present, Facebook’s official policy is to exclude children, although numerous reports indicate that children skirt identity and parental-verification measures to use the social-networking platform.

If Facebook does make the move to include children, what will the implications be? In this column, I will outline Facebook’s nascent plans and explain why integrating minors into the site may not be desirable unless (1) there is control over what children can do on Facebook, and (2) there are policies governing what companies can do with the information they store and collect about children via Facebook. Thus, the focus should not be just on parental controls, but also on the controls and restraints that Facebook itself will use to guard against unwarranted use of a child’s personal information.

Facebook’s Plans to Include Under-13 Children At Some Future Time

Facebook CEO Mark Zuckerberg has previously stated that he believes that children under 13 should be allowed to use Facebook. “That will be a fight we take on at some point,” he said, according to news reports.

And now, news reports reveal that Facebook is developing technology that would allow kids who are under the age of 13 to join—albeit with more parental controls and with a lockdown option. But the question remains whether this would mean that even, say, a five-year-old could join Facebook. True, parents would still need to provide consent before a child could join—but should the very same platform that serves as a meeting place for adults also serve the whole under-13 crowd?

Last summer, Facebook was apparently in discussions with identity-verification providers about how to obtain verifiable consent from the parents of children who want to use Facebook—not a segregated Facebook kiddie site. From all accounts, Facebook isn’t working on a separate kids-only site right now. Rather, it seems that Facebook is developing parental controls with the goal of having kids use its regular platform. The tools that are being tested, in this respect, include (1) a means of linking children’s accounts to parents’ accounts, and (2) controls that would allow parents to decide whom their kids can “friend,” and what applications their kids can use.

What’s Prompting Facebook’s Move to Welcome the Under-13 Crowd?

Why is Facebook focusing now on under-13 users? One reason is that there are a significant number of underage users who are already on Facebook. In response to a recent Microsoft study, Facebook acknowledged that “[r]ecent reports have highlighted just how difficult it is to enforce age restrictions on the Internet, especially when parents want their children to access online content and services.”

Moreover, the pressure on Facebook to prevent children from lying about their birthdates to get accounts has increased as researchers have started to quantify how many kids are already using the site surreptitiously. Last year, Consumer Reports said that 7.5 million children under the age of 13 were using the site, including a reported group of more than five million children under the age of 10.

A study sponsored by Microsoft Research and released last fall found that 36% of parents were aware that their children had joined Facebook while under the age of 13, and that a substantial percentage of those parents had helped their kids do so.

The FTC Has Authority to Enforce the Children’s Online Privacy Protection Act (COPPA) Against Social Networking Sites If It So Chooses, and It Has Done So in the Past

The presence of kids on Facebook occurs against the backdrop of federal law, including the Children’s Online Privacy Protection Act (COPPA), which is enforced by the Federal Trade Commission (FTC). Under COPPA, if a site targets under-13 children, allows them to use the site, and collects data about them, then it must first obtain “verifiable” parental consent from a child’s parent or guardian.

To date, Facebook has chosen not to officially allow children onto its site: It prohibits them as a matter of policy. But it still may be subject to FTC scrutiny if it can be shown that, in practice, Facebook knows that minors join its site and/or encourages them to do so. Such a showing would potentially expose the site to penalties and regulatory action. No wonder, then, that Facebook may be reasoning that if it can’t practicably stop under-13 kids from joining, then it should open the gates and allow them to join, with suitable protections.

The FTC has taken action against companies with respect to COPPA violations before. In 2006, the FTC levied the largest COPPA fine to date against the website Xanga for repeatedly allowing under-13 children to use the site and harvesting their personal data without getting parental consent. On November 8, 2011, the FTC announced that the operator of skidekids.com, a social-networking website that markets itself as the “Facebook and Myspace for Kids,” had agreed to settle charges that he collected personal information from approximately 5,600 children without parental consent, also a breach of COPPA.

But is COPPA really the motivator for a change in Facebook policy? The larger reason is probably that Facebook wants to market to a bigger set of users—and to get users hooked on Facebook earlier in their lives. Officially including under-13 children on Facebook would mean that the site could advertise and market a whole new set of product lines—including children’s games, apps targeting children, and other goods for which children are the target market. Ideally, from Facebook’s perspective, the children of today would become loyal patrons of the Facebook of the future, if they were allowed to officially join the site. Children could also become a revenue source: The under-13 features could enable Facebook and its affiliates to have parents buy games and other entertainment that are then accessed by their children.

At a time when Facebook’s share price has plunged after its initial public offering, the company may now be set on expanding its markets and revenue base in order to preserve its value for investors.

Should Parents Opt to Let Their Under-13 Children Join Facebook, If That Indeed Becomes a Possibility?

Parents’ views and opinions will be integral to the issue of whether their teens and younger children can join Facebook. Thus, even if Facebook offers parents a new model for children, with new tools and controls, it will still be up to the parent to decide whether his or her child can join Facebook.

Information about under-13 children’s use of Facebook has exacerbated existing concerns about how Facebook handles its adult users’ data. In November 2011, the company agreed to a 20-year settlement with the FTC over accusations that it misled users about its use of their personal information. The social network agreed to regular privacy audits in response to the FTC investigation.

But here’s the rub: As noted above, Facebook is not creating a Facebook Junior—a separate platform for kids alone. Instead, it is testing ways to integrate kids into the main Facebook platform—the very same place where we, as adults, may disclose our relationship woes, illnesses, partying habits, and other adult topics. Yet a separate platform for kids might have made more sense, following the model of other sites that focus on issues and content tailored to kids or tweens, and that are thus COPPA-compliant.

But let’s assume, instead, that rather than creating a second site, Facebook welcomes children as it is planning to do. Then, some new concerns will arise. First, children may have a less accurate sense than adults do as to when it is appropriate to provide data about their likes, tastes, preferences, family members, and so on. Second, parents will have to figure out when and to what extent Facebook and third-party applications will use the data collected from child members, and to what extent that data can be deleted by parents and by children themselves.

Will photos or images that are posted by kids—but that are about adults—be subject to similar privacy protections? Will a ten-year-old (or his parents) want his posts displayed on his Facebook Timeline—including discussions of, say, whether he hated or liked his fifth-grade teacher? Or, should kids perhaps be able to opt out of the controversial Timeline feature? Will parents have the job of scrubbing their kids’ Facebook walls of objectionable content that has been posted there by their child or other children?

Adult Facebook users already struggle to keep up with changes in privacy practices and in the use of their own data. Will parents always be in the best position to know the implications of their kids’ posting on Facebook? While parents may monitor the posts and friends of their kids, are they going to be well-positioned to vet each and every app their child may want to use? The answers to such questions, for now, may be no. If so, this is one place where technology, company practices, and regulation may need to fill the gap.

The notion of a nearly continuous profile of a person that begins to exist as early as, say, age 5 or 6, will undoubtedly raise numerous privacy concerns. Perhaps one’s under-13 Facebook profile ought to disappear—with the option of its being reinstated or re-created later, once one reaches 13, or some other age, and provides new consent. Moreover, prior to a child’s reaching age 13, who has the final say on whether her information is deleted or withdrawn? Can a child’s wishes ever trump a parent’s?

In addition, a larger question remains: What are the existing privacy protections for children who are currently on Facebook, having joined the site by evading age verification with or without their parent’s help? Should Facebook have a duty to do more than block its own content if it learns that an under-13 minor has become a user?

Third-party Facebook application developers presumably are subject to COPPA and would have their own obligations to remove data from their own databases if a parent comes forward to make a request. But this requires parents to follow trails from Facebook to third-party applications to which their kids may have gained access. Should Facebook have any duty to provide notice to these application developers of a child’s improper access to the site?

There are also third-party data aggregators—who pull information about us from whatever we make “public”—and, for some people, that is everything they post. For kids who have evaded age verification, if they have not locked down their page, then data about them may be collected and stored and possibly used to evaluate them in the future. Should such entities have an obligation to expunge such data and/or refrain from using it if they are notified that a minor has evaded age restrictions when posting on a social media site? The answer is likely no, but the privacy concerns or risks remain.

It may be that COPPA, Facebook policies, and other laws together provide a coherent answer to these questions. But that answer is only going to get more complicated if Facebook welcomes under-13 users, and makes the site nearly a birth-to-death experience for users.

Interestingly, Facebook’s own exploration of site expansion comes at a time when COPPA is under review—after more than a decade.

Some privacy researchers and scholars like danah boyd (one of the co-authors of the Microsoft study I mentioned above) are encouraging policymakers to move away from the notion of verifiable parental consent as the cure-all for concerns about children’s privacy. boyd’s enlightening research helps us understand that parents are often helping their children join Facebook and other sites. That research may be influential in the review of the COPPA rules.

In September 2011, the FTC announced proposed revisions to the COPPA rules that, if accepted, will represent the first significant changes to the Act since its rules were issued in 2000. The proposed rule changes expand the definition of what it means to “collect” data from children.

The new rules would also impose a data-retention and data-deletion requirement, such that data obtained from children could only be kept for the amount of time necessary to achieve the purpose for which it was collected. But when the purpose of the data is to memorialize events on a Timeline, what does it mean to have an obligation to delete content? This is not data processing, but data publishing—and so COPPA, as redrafted, may not address the concerns of a new, kid-friendly Facebook.

Ultimately, it may be the case that COPPA revisions will be an important step in ensuring that Facebook and other companies will have to collect (and delete) the data of children in an appropriate manner. But as of now, this is all still under discussion.

What Should Parents Do in the Interim, Before Facebook’s Planned Under-13 Children’s Access Kicks In?

Meanwhile, what should parents do while waiting for Facebook’s kid-friendly access? The best answer may simply be the obvious one: keep a close eye on what your younger kids are doing online by “friending” or following their posts and their use of sites.

Experts say that parents should be familiar with the sites their kids use, the sites’ rules and privacy policies, and the types of activities the sites host. Moreover, privacy settings can be confusing, so parents should work with their kids to choose appropriate settings. This requires parents to be vigilant about changes in a site’s privacy practices, since those practices are unlikely to remain static.

If we looked into Facebook’s crystal ball, we might see that, in the future, every child will have the right and ability to access social media. But we need to think about the implications of social-networking companies’ claiming even limited property rights in our kids’ personal information. This is a concern in addition to the concern about the potential for others—identity thieves, data brokers, predators, and others—having access to our kids’ profiles.

We do have existing models of how to develop public policy when children access new media. As some commentators have reminded us, in the 1990s, Congress passed the Children’s Television Act by a unanimous vote. The Act was heralded for its bipartisan support for the idea that we have a responsibility to offer our children educational programming, with only limited forms of commercial advertising. Congress was able to set limits, through the Act, on how and when companies could gain access to our children for marketing purposes.

We are now at a similar turning point with social media. Congress could learn from its prior efforts and evaluate the implications of a future Facebook where all generations can post and share, from cradle to grave—and where many companies will likely have access to our children’s data as part of the bargain.

Anita Ramasastry is the UW Law Foundation Professor of Law at the University of Washington School of Law in Seattle, where she also directs the graduate program on Sustainable International Development. She is also a member of the Law, Technology and Arts Group at the Law School. Ramasastry writes on law and technology, consumer and commercial law, and international law and globalization.
