Facebook’s controversial study that manipulated users’ newsfeeds was not pre-approved by Cornell University’s ethics board, and Facebook may not have had “implied” user permission to conduct the study as researchers previously claimed.

In the study, researchers at Facebook tweaked what hundreds of thousands of users saw in their news feeds, skewing content to be more positive or negative than normal in an attempt to manipulate their mood. Then they checked users’ status updates to see if the content affected what they wrote. They found that, yes, Facebook users’ moods are affected by what they see in their news feeds. Users who saw more negative posts would write more negative things on their own walls, and likewise for positive posts.

As reported by The Post and other news outlets, Princeton University psychology professor Susan Fiske told The Atlantic that an independent ethics committee, Cornell University’s Institutional Review Board (IRB), had approved use of Facebook’s “pre-existing data set” in the experiment. Fiske edited the study, which was published in the June 17 issue of Proceedings of the National Academy of Sciences.

A statement issued Monday by Cornell University clarified that the experiment was conducted before the IRB was consulted. A Cornell professor, Jeffrey Hancock, and doctoral student Jamie Guillory worked with Facebook on the study, but the university made a point of distancing itself from the research. Its statement said:

Professor Hancock and Dr. Guillory did not participate in data collection and did not have access to user data. Their work was limited to initial discussions, analyzing the research results and working with colleagues from Facebook to prepare the peer-reviewed paper “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” published online June 2 in Proceedings of the National Academy of Science-Social Science.

Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.

User consent called into question

Facebook researchers claimed the fine print users agreed to when they signed up was tantamount to “informed consent” to participate in the study. Facebook’s current data use policy says user information can be used for “internal operations” including “research.” However, that’s not what it said in 2012 when the study was conducted. According to Forbes:

In January 2012, the policy did not say anything about users potentially being guinea pigs made to have a crappy day for science, nor that ‘research’ is something that might happen on the platform.

Four months after the study, in May 2012, Facebook made changes to its data use policy, and that’s when it introduced this line about how it might use your information: ‘For internal operations, including troubleshooting, data analysis, testing, research and service improvement.’ Facebook helpfully posted a ‘red-line’ version of the new policy, contrasting it with the prior version from September 2011 — which did not mention anything about user information being used in ‘research.’

“When someone signs up for Facebook, we’ve always asked permission to use their information to provide and enhance the services we offer,” a Facebook spokesman told Forbes. “To suggest we conducted any corporate research without permission is complete fiction. Companies that want to improve their services use the information their customers provide, whether or not their privacy policy uses the word ‘research’ or not.”
