The Electronic Privacy Information Center (EPIC) filed a complaint with the Federal Trade Commission against Facebook alleging that the social network deceived users and violated a 2012 Consent Order after it tampered with users' news feeds for an emotional experiment.

Facebook published research in the Proceedings of the National Academy of Sciences in March detailing how it tinkered with the news feed algorithm of nearly 700,000 users for one week in early 2012. Researchers found that in instances where Facebook showed users more positive posts, they were more likely to share positive statuses. Conversely, when Facebook showed users more negative posts, they were more likely to share negative status messages.

EPIC's complaint, which it filed last week, takes the social network to task over three allegations. The group said Facebook's experiment "purposefully messed with people's minds" and claimed that researchers at Cornell University and the University of California, San Francisco, with whom Facebook conducted the experiment, "failed to follow standard ethical protocols for human subject research."

EPIC also questioned the nonconsensual and secretive nature of Facebook's experiment. "At the time of the experiment, Facebook did not state in the Data Use Policy that user data would be used for research purposes," the complaint said. "Facebook also failed to inform users that their personal information would be shared with researchers."

Lastly, EPIC says that because Facebook misrepresented its data-collection practices and made information accessible to third parties, Facebook violated the FTC's 2012 Consent Order.

"Moreover, at the time of the experiment, Facebook was subject to a consent order with the Federal Trade Commission which required the company to obtain users' affirmative express consent prior to sharing user information with third parties," it said.

EPIC is asking the FTC to open an investigation into Facebook's experiment, enforce the 2012 Consent Order, and require Facebook to make its news feed algorithm public, the complaint said. Other organizations have also filed complaints against Facebook, including the Center for Digital Democracy (CDD) and regulators from the United Kingdom.

While Facebook has not apologized for conducting the experiment, Adam Kramer, coauthor of the research, explained in a Facebook post his reasons for conducting it. He said the goal was not to upset users, but to learn how to provide a better service.

"...I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused," he wrote. "In hindsight, the research benefits of the paper may not have justified all of this anxiety."

Facebook COO Sheryl Sandberg weighed in last week, saying she, too, was sorry about upsetting users. "This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated," she said. "And for that communication we apologize. We never meant to upset you."


Kristin Burnham currently serves as InformationWeek.com's Senior Editor, covering social media, social business, IT leadership and IT careers. Prior to joining InformationWeek in July 2013, she served in a number of roles at CIO magazine and CIO.com.


I can imagine a time when government could use Facebook to get buy-in for a war, for example, or a major change in Social Security, by using FB to funnel its propaganda. Or a corporation could use it in a similar manner, way beyond advertising or marketing, by manipulating user posts. It may seem a bit far-fetched now, but before news of this 'experiment' emerged, it probably seemed even more out of the realm of possibility.

From my understanding of EPIC's complaint, Facebook amended its ToS after it had begun its so-called experiment. I'm no lawyer, but surely one party cannot change the terms of a contract and still expect the contract to stand unless the other party also agrees to the new terms? Sure, users expect ads. But being turned into emotional guinea pigs steps far over the line.

This is a great point to make, as platforms like Facebook have multiple revenue streams. Advertising is one part of it, but another is the data behind user profiles. Anytime you sign up for a free service, there are other costs involved. Privacy might just be one of them.

It's important for people to remember what Facebook really is. On the social network, the users are the product. User data is being utilized to study behavioral aspects of marketing. That's what these studies are all about: Understanding the behavior of people to better sell products and services.

This is what Google does. This is what Twitter does. And this is what Facebook does. These services are not free, they are just structured in a way so that we think that they are. The user data is super valuable to these companies.

Typically you need to get informed consent from the subjects before subjecting them to any kind of psychological testing, and I think Facebook using its terms of service agreement as a catchall is fairly underhanded...
