When I first heard that Facebook was messing with our emotions by manipulating the stories some users saw in their News Feed, my first thought was: Duh.

News Feed plays with my heart and mind every day—that’s kind of why I still check it. I want to see the myriad creative ways people have chosen to announce
their engagements and pregnancies. I want to see photos of my far-away family. I could do without the people who constantly complain about their lives or
the ones who seek attention with their romantic escapades, but that’s all part of the messiness of Facebook.

But the network’s week-long psychological experiment feels more sinister. Facebook wanted to know whether emotions can spread from one person to another: if you surround yourself with people who are sad or depressed, for instance, does that make you sad too? So the network manipulated the
News Feeds of nearly 700,000 users, reducing negative emotional posts for one group and positive emotional posts for the other. The experiment
spanned a week’s worth of News Feed posts, from Jan. 11-18, 2012.

The results were published in an academic journal this week, immediately incensing the
Twittersphere and the blogosphere, but not so much the Facebook-sphere. (Not one peep about the experiment in my own News Feed, at least.)

Facebook experiments with News Feed all the time.

Facebook’s future experiments

I couldn’t quite muster the outrage others were feeling. Facebook is a corporation to which I willingly hand over my information. Though I don’t explicitly
consent to every change it makes, the company is free to do what it wants, and I’m free to leave. That doesn’t mean Mark Zuckerberg shouldn’t feel a
guilty twinge over some of the decisions he’s made in the name of improving Facebook—and this experiment is just the latest example. In some respects, like
privacy, Facebook has figured out that enraging its users is just bad PR. But it took many, many, many missteps to get to that point. (Let’s take a moment
to remember
Beacon, Sponsored Stories, and the original News Feed, which many thought was beyond creepy.)

Facebook CEO Mark Zuckerberg has had to apologize an awful lot over the company’s history.

Going forward, the network should make psychological experiments opt-in and transparent, not a secret. Facebook should also weigh carefully whether testing its users this way yields any real benefit.

Facebook data scientist Adam Kramer defended the research in a Sunday Facebook note. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling left out,” he wrote. “At
the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”

The research found the opposite: positivity begets positivity, and negativity begets negativity. But the effects were so small, and the methodology so questionable (it turns out it’s pretty tough to parse someone’s emotional state from a Facebook post), that the experiment hardly seems worth it.

“In hindsight, the research benefits of the paper may not have justified all of this anxiety,” Kramer added.