Digital experimentation is a flourishing field. But a Facebook experiment that came to public attention earlier this year has cast a harsh light on the practice.

It's true that many people see the Web as a big laboratory for research. But digital experimentation will not get a free ticket to ride. Ethical issues -- ones that may come to affect the work of data management and analytics professionals -- lurk in the fabric of the new techniques.

Facebook learned that lesson after news emerged about a study it quietly conducted with Cornell University researchers in 2012. The social networking company altered the regular news feeds of some of its users for one week, showing one set of users happy, positive posts while another set saw dreary, negative missives. The results were contagious: Happiness begat happiness, and sadness spawned gloom.

Measuring emotional contagion

Unlike participants in medical clinical trials, though, the users weren't explicitly made aware that they were being studied. Few were any the wiser until the Cornell crew published a paper titled "Experimental evidence of massive-scale emotional contagion through social networks" in the June 17 issue of the journal Proceedings of the National Academy of Sciences. Contagion is catchy: The New York Post news desk could scarcely have come up with a punchier headline.

But the study proved to be a matter of contention. Many people found it troubling that Facebook made guinea pigs of users. Their only warning was some arcana buried in a one-click user agreement.

The Facebook study also stands out because it mixed two usually distinct types of research: a company trying to fine-tune a product and scientists trying to test hypotheses. The blending has helped carry discussion of Facebook's experiment in manipulation beyond social media and news organizations. That was underscored last month at the Conference on Digital Experimentation (CODE) at MIT, where the experiment was among the topics discussed.

Spotlight on digital experimentation

Speakers at the conference discussed advanced data science -- efficient exponential experimentation, crowd self-organization, mobile advertising effectiveness, online experiment design, and the now-perennial favorite of data analytics cognoscenti: causation and correlation. It was clear that digital experimentation currently is something largely done by big Internet companies trying to improve their online offerings. But Web-based medical research was also on the program.

A CODE panel on experimentation and ethical practice included Leslie Meltzer Henry, an associate professor of law at the University of Maryland who, together with a colleague, has written an open letter to Maryland's attorney general urging legal action against Facebook over its experiment.

While acknowledging the potential benefits of digital research, Henry thinks online research like the Facebook study should be held to some of the same standards required in government-sponsored clinical trials. "I start from the position that large-scale digital experimentation is here to stay," she said. "It can be good. That said, I do think we have to be respectful of the subjects." What made the Facebook experiment unethical, in her view, was its failure to explicitly seek the subjects' approval at the time of the study.

While they shared some criticisms, other panelists steered away from the idea of imposing clinical-style research requirements. They looked to put Facebook's activity in the context of "manipulative" advertising -- on the Web and elsewhere -- and news outlets that select stories and write headlines in a way that's designed to exploit emotional responses by readers. An underlying concern was placing strictures on large websites that aggressively mine user activity data.

The Facebook study was "unusual in the way it brought Cornell in," said Esther Dyson, the veteran technology journalist and venture capitalist whose present efforts include HICCup, a Simsbury, Conn., company that seeks to use small U.S. towns as laboratories for health improvements.

The line between scientific research and Web marketing should be clear, Dyson suggested. But she thinks users have a responsibility, too. People must understand when and how they're being manipulated, she told the conference attendees. "The best thing for all of this is a lot more education so people understand what is happening and how they are being manipulated."

Data dilemmas to work through

Emerging medical research alternatives raise data ethics issues of their own. Because big data systems can trawl through masses of records like never before, analysis of historical medical records -- a form of evidence-based research -- is being considered by some doctors looking to better diagnose and treat diseases. But a rheumatologist at Stanford University's Lucile Packard Children's Hospital who searched past records to adjust treatment for a patient with lupus was later warned against pursuing such methods by administrators concerned about HIPAA privacy rules.

The backdrop for all this activity is a general feeling that the methodology of the traditional clinical trial has run its course. Such trials are expensive, and they sometimes rely on very sparse data. Statistical contortions are often used to backfill, but the results frequently aren't reproducible.

Evidence-based research and digital experimentation could play a useful role in moving science, medicine and product development forward. But whether the goal is product improvement or scientific advancement, there will be data dilemmas to sort through -- and data professionals will eventually be called on to join in the sorting efforts.

1 comment

It is an interesting topic. Facebook's experiment didn't harm anyone, but I do wish that they had informed the users and asked for consent to participate.

As for Dyson's claim that users have a responsibility to understand when and how they're being manipulated, well, I pretty much always operate under the assumption that that's what all companies are doing. They're trying to further their own interests, and it's my responsibility to protect my own.