Hello, Privacy

Irina Raicu

Last week, as part of Loyola University’s seventh annual Symposium on Digital Ethics, I gave a brief talk about the “Internet of Toys.” The title came from one of my earlier blog posts—“Et tu, Barbie?”—but in the talk I tried to focus more sharply on the ethical implications of internet-connected toys.

In the process, I described an in-class activity that I came up with a few months ago, when I was invited to guest-teach a Software Engineering Ethics class here at Santa Clara University.

The course is directed at graduate students. This particular class session was to focus on privacy. So, after a brief warm-up exercise intended to clarify my view of informational privacy (that it’s not about secrecy but about having some measure of control over which information we disclose to whom, when, how, in what context, etc.), I asked the students to break up into small groups.

And then I asked them to come up with the most privacy-invasive device or tool they could think of.

Here are the questions I wrote on the board to help them in that effort:

Where does the device “live” or reside?

What kind of information does it collect?

From/about whom?

What does it do with the collected data?

Who has access to the data collected?

Where is the data stored? Is it encrypted?

How long is the data stored for?

What purpose does the device aim (claim) to serve? Why?

(The questions are a work in progress. That last one is designed to get at the benefits vs. harms analysis, but I’m not sure it quite does the job.)

After a while, I had a few of the groups present their ideas. The first group described a product that would be placed in a bathroom. And would collect photographs. Of… financial data! (At that point, I was, um, relieved to hear that.)

Other groups described products that collected information about particularly vulnerable populations. Of course, the data collected would be publicly accessible, or sold to whoever was willing to pay for it. And as for how long it would be stored? Given the exercise’s stated goal, the almost unanimous answer was “forever.”

Then we talked about “Hello, Barbie,” the actual internet-connected toy that is currently on the shelves—and in some children’s bedrooms. We went through the questions one by one. It was an interesting conversation (especially since more than half of the students in the class were women, and some had children of their own).

Needless to say, the point of the exercise is to make future software engineers stop and think if (or when) they find themselves answering such questions in privacy-violating ways as they design actual products and services. And internet-connected toys, targeted as they are at young children, should be designed with even more care than most products. Such toys can undermine children’s privacy, and in the process undermine their developing autonomy and sense of trust. If a child speaks to his/her doll or teddy bear or dinosaur and then finds out that the “conversation” was recorded and transmitted to the child’s parents (not to mention uploaded to the “cloud” to be used by various entities for various purposes, which might be hard to explain to a young child), the child might well feel betrayed. (See reactions cited in a study by researchers from the University of Washington: “Toys that Listen: A Study of Parents, Children, and Internet-Connected Toys.”)

During the talk at Loyola, I pointed out, as well, that the ethical analysis of such products doesn’t end at the design, manufacturing, or even the marketing stage. As my colleague Shannon Vallor puts it, software is “a moving ethical target; features change, users change, platforms change, contexts change. A benign [product] can become ethically problematic with a single bug fix or update.” Vallor, who frequently teaches ethics in classes directed at engineers, also points out that “ethical issues cannot be fully anticipated; ethics is about responsiveness, not just foresight & prediction.”

Foresight and responsiveness: Internet-connected toys require a significant ethical commitment on the part of their creators, if they are to contribute to, rather than undermine, human flourishing.