Cautiously apocalyptic

"We've been waiting for societal readiness," Ian Danforth says, at the end of his list of factors that have kept us waiting for robots. The head of the pet-like prototype machine on the table next to him nods.

"People enjoy teaching pets tricks and find it endearing when they fail; why shouldn't this apply to robots?"

Yet Danforth confidently predicts that in six months "incredible, unexpected new robots you want in your home" will be available; in a year thousands of homes will have them; in two years tens of thousands; and five years will produce the first "true AI". We are less than a mile away from the research lab where John McCarthy labored for 50 years; in 1956 he thought it would take six months.

Danforth's ideas tap into a particular trend, which Ken Goldberg at UC Berkeley calls "cloud robotics". Today's networked computational power means that you can launch a cute pet robot into the market with rather limited abilities and let it improve in the field via the cloud. People enjoy teaching pets tricks and find it endearing when they fail; why shouldn't the same apply to robots? It all coalesces for me when someone asks what kind of data this cute little pet will be collecting, especially in conjunction with other recent events. The answer: video, audio, accelerometer readings, and geolocation from an attached GPS unit, all sent to a central server, from where the data can be shared back out again so that my robot suddenly knows a trick yours has learned. Someone is actually implementing Rupert Sheldrake's morphic resonance.

Danforth claims the data will not be looked at by humans. I'm not impressed: as the ACLU's Jay Stanley has pointed out, what matters is less whether data is examined by humans or read by machines than the way the resulting decisions reverberate through the rest of our lives. Later, Danforth tells me the stream will be encrypted in transit to and from the server, and he hopes that if law enforcement issues a subpoena he'll be able to say he has no data to show them. Now, why does that make me want to say CALEA and the Communications Data Bill?

The notion of robot as intimate data collection device came up at the first We Robot last year, among the many other things lawyers worry about, like liability, but this is less hypothetical. It shows that Charlie Stross was right when he argued, in his talk about the future of Moore's Law, that computational power is yesterday's future, just as increasing transportation speeds were the future of the first half of the 20th century. Today's future is rapidly emerging as data (his meditations on the implications of bandwidth included lifelogging): big data, open data, algorithmic decision-making. Asimov did not, if I remember correctly, consider this aspect of robotics. His robots fought through individual behavioral tangles brought upon them by the Three Laws, but they did not collaborate across vast data networks, and they did not wrestle with deciding whether disclosing their intimate knowledge of you to a hostile interrogator would cause you sufficient harm that they should harm the interrogator, or self-destruct, rather than answer.

"Asimov's robots fought through individual behavioral tangles brought upon them by the Three Laws, but did not collaborate across vast data networks"

Julie Martin saw this as a possibly hopeful thing: "Robotics cases may force people to look at things they should be looking at," she said. "It shouldn't be different because it's robotics." She meant that the world is now full of data collection technologies we shouldn't be taking so casually, and robots provide an opportunity to make that visible enough to engage people in stopping it. In response, Ian Kerr noted that Ryan Calo has said similar things about drones in the past – that they would spark a chance to have and win a privacy debate that should have already taken place – "but I'm cautiously apocalyptic about that now".

One of Martin's examples was Tesla's recent spat with the New York Times, which showed how much data cars can collect about their drivers. Unfortunately, if past discussions are any guide, the argument others will make runs the other way: in a world of CCTV cameras, wiretap-ready telephone services and ISPs, online profiling, and audit trails, "why should robotics be any different" will be the line used to justify the invasion of our most private settings. Cue Bill Steele's 1970s song "The Walls Have Ears".

At this point an evil thought occurs: you sell a cute robot people will fall in love with. You include the kind of subscription service common in software, whereby you push updates and improvements to the robot automatically. Or, in the way of today's world, you offer those services free, contingent on my agreeing to data sharing. When I fail to resubscribe or refuse to provide data, all that stops. With an Internet service, the site stops giving me personalized results (search rankings, targeted ads). A pet robot would seem to stop loving me back. This strikes me as a chilling but perfectly plausible business model, and not at all what we imagine when we long for a robot to do the housecleaning.
