
Ex-Facebook Data Scientist: Every Facebook User Is Part Of An Experiment At Some Point

Andrew Ledvina used to be a data scientist at Facebook. He recently made the mistake of talking to a reporter about that, telling the Wall Street Journal that when he was there from 2012 to 2014, there was no internal review board that might have had qualms about Facebook’s now-infamous emotion manipulation study; that he and other data scientists were allowed to run any test they wanted as long as it didn’t annoy users; and that people working there get “desensitized” to the number of people included in their experiments, since it’s such a tiny percentage of Facebook’s overall user base. Ledvina, like many a person quoted in the media, didn’t like the way the reporter presented his words, so he took to his blog to defend himself, Facebook, and the Facebook study. But I think he simply dug a deeper hole for the company he quit this April. He facetiously titled the post “10 Ways Facebook Is Actually The Devil” to needle those who see the study as evil, then went on to confirm the WSJ’s report and shed new light on how Facebook’s data science team views its users.

1. There was no internal review board scrutinizing experiments run for Facebook’s internal purposes.

Ledvina: “While I was at Facebook, there was no institutional review board that scrutinized the decision to run an experiment for internal purposes. Once someone had a result that they decided they wanted to submit for publication to a journal, there definitely was a back and forth with PR and legal over what could be published.”

2. If you’re on Facebook, you have definitely been a test subject at some point.

Ledvina: “Experiments are run on every user at some point in their tenure on the site…”

3. But you may have been a test subject in a very boring experiment.

Ledvina: “…Whether that is seeing different size ad copy, or different marketing messages, or different call to action buttons, or having their feeds generated by different ranking algorithms, etc.” [Ed Note: Link on "call to action" is mine, not Ledvina's. For a concrete sense of how this kind of always-on bucketing can work, see the sketch after this list.]

4. This ex-employee of Facebook still doesn’t understand why people are upset that Facebook researchers tried to see if they could upset people.

Ledvina: “The fundamental purpose of most people at Facebook working on data is to influence and alter people’s moods and behaviour. They are doing it all the time to make you like stories more, to click on more ads, to spend more time on the site. This is just how a website works, everyone does this and everyone knows that everyone does this, I don’t see why people are all up in arms over this thing all of a sudden.”

5. Facebook researchers forget that what they’re doing has an effect on the real live people that use Facebook.

Ledvina: “Every data scientist at Facebook that I have ever interacted with has been deeply passionate about making the lives of people using Facebook better, but with the pragmatic understanding that sometimes you need to hurt the experience for a small number of users to help make things better for 1+ billion others. That being said, all of this hubbub over 700k users like it is a large number of people is a bit strange coming from the inside where that is a very tiny fraction of the user base (less than 0.1%), and even that number is likely inflated to include a control group. It truly is easy to get desensitized to the fact that those are nearly 1M real people interacting with the site.”
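
To make Ledvina’s description concrete, here is a minimal Python sketch of how a site might deterministically bucket every user into a variant of every running experiment by hashing a user ID together with an experiment name. This is my illustration of the general A/B-testing technique, not Facebook’s actual system; the function and the experiments shown are hypothetical.

```python
import hashlib

def assign_variant(user_id: int, experiment: str, variants: list) -> str:
    """Deterministically bucket a user into one variant of an experiment.

    Hashing (experiment, user_id) gives every user a stable bucket in
    every running experiment -- which is how "every user is part of an
    experiment at some point" can be literally true.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Hypothetical experiments like the ones Ledvina describes.
print(assign_variant(42, "ad_copy_size", ["small", "large"]))
print(assign_variant(42, "cta_button", ["Sign Up", "Join Now", "Get Started"]))
print(assign_variant(42, "feed_ranking", ["rank_algo_v1", "rank_algo_v2"]))
```

As for that “less than 0.1%” aside, the arithmetic checks out: the published study covered 689,003 users, which against Ledvina’s own “1+ billion” user base works out to well under a tenth of a percent:

```python
# Back-of-the-envelope check of Ledvina's "less than 0.1%" figure.
study_users = 689_003          # subjects in the published emotion study
total_users = 1_000_000_000    # rough user base, per Ledvina's "1+ billion"

print(f"{study_users / total_users:.4%}")  # 0.0689%
```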

Ledvina expressed surprise that the experiment playing with the emotional content of users’ News Feeds was getting so much play in the press while other Facebook research went ignored. He pointed to an event last year where Facebook researchers and academic researchers got together to talk about work done to see how “Facebook and social networks in general can be more compassionate,” and to prevent bullying, particularly among teens. “I am a bit taken aback by the fact that the most recent paper has gotten as much press as it has, when the work done as part of the compassion research days has never been mentioned,” he writes. “Some of these papers are based on experiments that influence people’s behavior in similar ways, but I guess they do not have as much cachet for whatever reason.”

I went ahead and watched the hours of presentations archived by Facebook from the “Compassion Research Day.” There were a few key differences from the January 2012 manipulation study. First, none of the work done by the researchers aimed to make people feel worse. Second, the research on people’s behavior was far less surreptitious. Third, researchers didn’t interfere with people’s “natural” experience of the site beyond obvious (and some not-so-obvious) prompts for feedback.

In one of the videos, about “new tools to understand people,” the presenter isn’t talking about trying to influence people’s emotional state in a negative way and then measuring it by monitoring their status updates. “We asked users, ‘What are you trying to do? Why did you click that button?’” says the presenter. “We learned a lot from asking people for feedback.”

This is a transparent way of “running tests” on users: presenting them with questions and asking for their feedback, a rather traditional approach to experimentation. Another presenter talked about measuring emotion by asking people to put “emoticon” faces on their status updates. Okay, Facebook users may not have realized they got these cute digital stickers so that Facebook could measure their emotional states, but it’s at least a translucent way of taking users’ emotional temperature.

Facebook researchers worked up these digital stickers so they could measure users’ emotions across the world
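
To illustrate why self-applied stickers are a gentler measurement than inferring mood from text, here’s a toy sketch (my own, with made-up data, not Facebook’s pipeline): when users label their own posts, taking an “emotional temperature” is just counting.

```python
from collections import Counter

# Made-up sample of posts that users chose to tag with an emoticon sticker.
tagged_posts = [
    {"user": "alice", "sticker": "happy"},
    {"user": "bob", "sticker": "sad"},
    {"user": "carol", "sticker": "happy"},
    {"user": "dave", "sticker": "excited"},
]

# No sentiment inference from post text is needed; users supplied the label.
temperature = Counter(post["sticker"] for post in tagged_posts)
print(temperature.most_common())  # [('happy', 2), ('sad', 1), ('excited', 1)]
```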

In trying to see how teens deal with stuff they don’t like on the site, Facebook proactively asked the youngish ones how they felt when they flagged something they didn’t like. That’s a far more aboveboard approach than the one taken by the Facebook data scientists in the 2012 emotion study. The compassion researchers were trying to find out why people get offended and how they resolve the issue, and to give them better tools to aid the process, all of which is a far cry from turning the tone of News Feeds negative to see whether it makes someone blue and turns them off using Facebook. The one perhaps unexpected thing the ‘compassion researchers’ did was collect the messages Facebook users send when asking another user to take down an embarrassing photo or offensive post; they did this to try to understand why people object to content on the site.

Facebook asking users how a post made them feel, rather than secretly divining it from their status updates

Facebook’s ‘innovation’ in gauging feedback, as presented at a company event in December 2013

Ledvina is pessimistic about the ‘hubbub’ over the study leading to any change at Facebook, though he makes very clear that he is an ex-employee who knows little about the inner workings at Facebook now. “The only thing I see changing from this is not whether similar experiments will be run, but rather [whether they will] be published,” he writes, confirming the outcome that many commentators, including the New York Times’ Farhad Manjoo, fear. “Similar experiments have been and will continue to be run, but you probably just won’t see a paper about it anymore.”


Comments

“The fundamental purpose of most people at Facebook working on data is to influence and alter people’s moods and behaviour. They are doing it all the time to make you like stories more, to click on more ads, to spend more time on the site. This is just how a website works, everyone does this and everyone knows that everyone does this, I don’t see why people are all up in arms over this thing all of a sudden.”

This is one fairly minor reason why: https://history.state.gov/milestones/1866-1898/yellow-journalism

So you think journalists are making this into a big deal rather than “normal” Facebook users being concerned about what happened? I disagree with you, but would like to know why you think it’s the case.

I.e. It’s one thing for a person to choose to head to the Drudge Report or AlterNet where they expect (and demand!) to be emotionally manipulated, but given the blowback it’s obvious very few expect this sort of thing from a supposedly non-partisan, non-issue-based major media source such as facebook.

The screenshot of the “how did it make you feel” notification box is a joke. Like, when did that box show up? People at FB must really think account holders are very unintelligent. A “how did it make you feel” notification box, LOL… seriously, that’s insult to injury. I will bet not one account holder saw that notification box… it’s just for this article.

One week after the psychological research news, they do the “we did the notification box” routine… it’s damage control. I’m very irate about what fb did to tamper with lives. FB is the biggest bully, using every trusted piece of private personal information shared. I’m sorry fb, but through my observation and my own experiments at this end, there were boundaries crossed that could have potentially harmed a lot of innocent account holders’ minds.

Even if the “notification box” was displayed, through my observation it most likely was shown to a select group, and it seems more like damage control. There are other groups that were experimented on without any such notification. It’s a very personal topic because I’ve realized their methods evoked a lot of feelings that could have been detrimental. They intruded into real life to gather conclusive results, via phone as well. I’ve been an avid fb account holder with several “fun” pages, a business page, and advertising on fb, a very active account. There were lots of things observed. So many things were interjected, and so many privacy violations.

Ms. Hill, educate yourself on the APA Ethics Code before you tarnish your career. I don’t care if you are uneducated or are just peddling smut; asking for ‘feedback’ in no way constitutes an empirical research study that ‘runs tests’ on humans. Consider yourself WARNED: 1.04 Informal Resolution of Ethical Violations.

Why is it that the general public thinks that think-tank scientists don’t get the point of the issue? Try this: lock yourself and other like-minded people in a room and dream up ways to change something and make money; at some point someone will get hurt. If you insulate the researchers from the effects of their research, you have desensitized those doing the research from those being researched. I would think animal rights activists would get involved in this! Oh, right, it does not involve animals, just us humans….