Facebook Playing With Your Feelings Is Legal But 'Creepy,' Say Law Experts

Facebook manipulated the News Feeds of hundreds of thousands of people to see if showing them mostly positive or negative posts affected their emotions. The research ignited anger among users, who accused the company of manipulation in the guise of science.

But did Facebook actually break any laws? Mashable talked to law professors to separate fact from fiction.

Was it illegal?

Several factors have to be considered when judging whether Facebook broke any laws. First, Facebook's terms of service (which the company calls its Data Use Policy) make it clear that, when creating an account, a user consents to his or her data being used for "research" — although what kind of research is left unspecified.

Ryan Calo, a privacy expert and law professor at the University of Washington, told Mashable that the study may be "creepy" but not necessarily in violation of any privacy law.

Whether the Facebook research is subject to federal regulations on human-subject research, however, is a little more complicated. Had the study been authored solely by a Facebook employee, it wouldn't fall under such regulations, which mostly apply to federally funded research institutions. But since the paper was authored both by a Facebook data scientist, Adam Kramer, and by two other researchers — Jamie Guillory and Jeffrey Hancock, who work for Cornell University and the University of California, San Francisco, respectively — the issue is murkier.

Michelle Meyer, Director of Bioethics Policy at the Union Graduate College-Icahn School of Medicine at Mount Sinai Bioethics Program, explained in a widely read blog post that both Cornell and UCSF have to get permission from their ethics boards when conducting research on human subjects.

But were Guillory and Hancock really "engaged in research," as the regulations put it? Meyer, an expert in the intersection of law, science and philosophy, told Mashable that she thought they weren't. According to the footnotes of the paper, Facebook's data scientist was the one who "performed the research" and "analyzed the data." (The other scientists kept their hands clean: apparently they just designed the study and wrote up the results.)

And Cornell agreed that there was no need for the university to review the research.

The ethical issue of "informed consent"

Facebook's research may have been legal, but several experts say it was unethical, and that it was a bad idea for Facebook to manipulate its users' emotions without asking their permission.

"This was a creepy study — it boils down to 'We want to see if we can make you happy or sad,'" said James Grimmelmann, a law professor at the University of Maryland who criticized the research. "Facebook should have gotten informed consent for this experiment. Facebook says it wants to do the right thing — that's the right thing. Facebook says it wants to contribute to science — that's what scientists do."

But Facebook never fully acknowledged why the study was controversial. In its public response, the company focused mostly on privacy issues and the fact that it has "a strong internal review process" — although it offered no details about that process.

Kramer, Facebook's data scientist, not surprisingly defended the research, saying its impact on people was "the minimal amount to statistically detect it." But even he seemed to admit that this whole study was a little creepy.

"In hindsight, the research benefits of the paper may not have justified all of this anxiety," he wrote in a Facebook post.

Can users sue?

Even if you got played, don't bother calling your lawyer: Experts agree that suing Facebook for this study will probably not lead anywhere.

"A lawsuit would be a very long shot, unless an attorney could find somebody who could document specific harm from that period," Grimmelmann said, adding that specific harm could be, for example, someone who committed suicide as a result of seeing only negative posts on his or her News Feed.

"Emotional damage doesn't count," Meyer said. "You would really have to show some tangible damage."