I'm a privacy pragmatist, writing about the intersection of law, technology, social media and our personal information. If you have story ideas or tips, e-mail me at khill@forbes.com. PGP key here.
These days, I'm a senior online editor at Forbes. I was previously an editor at Above the Law, a legal blog, relying on the legal knowledge gained from two years working for corporate law firm Covington & Burling -- a CliffsNotes version of law school.
In the past, I've been found slaving away as an intern in midtown Manhattan at The Week Magazine, in Hong Kong at the International Herald Tribune, and in D.C. at the Washington Examiner. I also spent a few years traveling the world managing educational programs for international journalists for the National Press Foundation.
I have few illusions about privacy -- feel free to follow me on Twitter: kashhill, subscribe to me on Facebook, Circle me on Google+, or use Google Maps to figure out where the Forbes San Francisco bureau is, and come a-knockin'.

After The Freak-Out Over Facebook's Emotion Manipulation Study, What Happens Now?

The dust is finally starting to settle around the revelation that Facebook manipulated users’ emotions for science. So what now?

Legally, I don’t think much will come of this beyond Facebook’s government liaisons working longer, harder hours for a while. On the other side of the pond, Facebook has to provide its local privacy regulator with an in-depth explanation of its procedures for all of its research. Here in the U.S., at least one privacy group has filed a formal complaint with the Federal Trade Commission saying that what Facebook did was “deceptive” and violated the agency’s existing 20-year consent decree with the site for previous privacy mistakes (which would mean monetary penalties). But I think that’s a dead end, because Facebook’s probation with the FTC didn’t begin until August 2012, seven months after the emotion manipulation study took place. So the FTC wouldn’t be able to hit the company with fines; it could only impose a second, redundant consent decree. And that’s assuming the FTC even thought it was harmful to consumers to have their Facebook streams skew more negative than usual.

“It’s clear that people were upset by this study and we take responsibility for it. We want to do better in the future and are improving our process based on this feedback,” says a Facebook spokesperson. “The study was done with appropriate protections for people’s information and we are happy to answer any questions regulators may have.”

Class-action lawyers are surely sniffing around the case, but it’d be hard to prove that any particular Facebook-using client was among the 100,000+ whose News Feeds turned blue for a week. Facebook says it designed the study so that the test subjects stayed anonymous. And it’d be hard to argue that a week of attempted sad-making caused some kind of financial harm.

“A statistician who lives in Silicon Valley is a ‘data scientist,’” says Paul Ohm, a law professor at the University of Colorado. “Lots of companies — Bitly, OkCupid — have this weird conflation of data research based on what their users are doing and corporate profit-making. The ethics have been begging to be discussed. There’s A/B testing to better deliver a product a customer wants, but it’s another thing for companies to consider users to be a willing and ready pool of lab rats that they can prod however they want.”

Dating site OkCupid’s OkTrends blog was a voyeuristic and often salacious look into what works in a dating profile, what doesn’t, and why people choose the people they choose. It was sadly discontinued after OkCupid was snatched up by IAC’s Match.com for $50 million. Match never said whether it was more worried about the competitive intelligence being leaked or the “ethics” of the blog.

“We haven’t looked at the harms or invasiveness that comes along with these Big Data dives,” says Ohm. Ohm was one of the first people to criticize the widely acclaimed Google Flu Trends project, pointing out that while there may be some benefit to knowing where the flu is going to strike, it came with the harm of the search engine combing through some of our most sensitive searches: those about our medical conditions. “Google breached a wall of trust by dipping into its users’ private search data in ways that went beyond traditional and historically accepted uses for search query data, such as those uses relating to security, fraud detection, and search engine design,” he wrote last year. “While Google’s users likely would have acquiesced had Google asked them to add ‘help avoid pandemics’ or ‘save lives’ to the list of accepted uses, they never had the chance for public conversation. Instead, the privacy debate was held — if at all — within the walls of Google alone.”

Medical ethicist Alta Charo of the University of Wisconsin’s Law School thinks the outcry over the Facebook study is overblown, but that it’s worthy of discussion because of the ubiquity of Facebook and the sheer scale of experiments possible for companies with a billion customers. “As a business practice, companies do research on consumer behavior all the time. Which colors work? Should a mailer start with a happy story about a candidate or an attack on a competitor? This is not novel and not limited to Facebook,” she says. “I think there’s a larger question about how much individualized information we have around each person. As a matter of ethics, it’s not at all hard for a company to simply announce, ‘We constantly test our business practices, let us know if you never want to be part of that.’”

Many observers say this research is important, that they don’t want it to go away, but that they want a clear understanding of corporate research ethics, since companies are not subject to the same oversight that academic researchers are. It’s a question of the technology and the extreme personalization that can happen in the Internet age, writes Zeynep Tufekci. “[I]t is clear that the powerful have increasingly more ways to engineer the public, and this is true for Facebook, this is true for presidential campaigns, this is true for other large actors: big corporations and governments.”

Ryan Calo, an academic at the University of Washington, was writing about corporate lab rats even before it became a hot topic of conversation. “It’s about information asymmetry,” he says. “A company has all this information about the consumer, the ability to design every aspect of the interaction and an economic incentive to extract as much value as possible. And that makes consumers nervous.”

Calo has a concrete ask. Facebook has said that it didn’t have a formal review procedure in place for studies in January 2012, when the emotion manipulation one took place, but that it has one now. But we don’t know anything specific about how that review works at Facebook or at other companies that know a lot about us and can run fascinating tests with that information. “I want all of these companies to treat consumer research like an ethical problem on par with all the other ethical problems they deal with all the time, like deciding whether to block a Chinese dissident in another country. They’re used to making ethical choices but not with their own data,” says Calo. “I want Facebook and others to publish their criteria for greenlighting research.”

An academic at Microsoft Research has a suggestion for what that might look like. It’d also be nice if, after reading those criteria, consumers could decide whether they want to take part in research.


Comments

As you mentioned in this and past posts, the test was done prior to government mandate on FB and just before the penalty. So FB hid the test while under investigation. Then release it after penalty and claim all is well with the FB house! Something smells off about that. Either the fed was asleep at the wheel; a normal act. Or duped by FB; a normal Corp ploy. Or they are both in on it and share the intel under NSA threat of…well, let’s see can one rendition a bunch of scientist and hold them against their will indefinitely? Um right they did that already after WW2. I think FB got off easy by giving the intel to the feds as part of the payment. Just speculation here. But isn’t that what all this is about? Speculation of human habit, adjustment of environment stimuli and resultant fallout. In science one should not rule out any possible X factor without good reason. Has anyone asked the feds about this scheduled release of info to beat the system? Were the feds involved in research? The last question is moot because the fed has a right to lie to the public. There is the X factor.

Excuse me, Ms. Hill: “passing it onto their friends”? “Onto” is a perfectly respectable word, but it doesn’t mean the same thing as “on to”, as in “passing it on to their friends”. The phrasal verb involved is “pass [something] on”.

(Comments have been closed on your other article on this topic, where this solecism occurred, and forbes.com does not tell me how to contact you.)