Algorithms Love You and Want You to Be Better!

Last week, I had the pleasure of attending a seminar on disruptive technology given by the director of Singularity University, Salim Ismail, his colleague David Roberts, and Banning Garrett, Senior Fellow at the Atlantic Council.

Heads up — this post is going to be higher-geek-octane than usual. Specifically: robots, 3D printers, gene-hacks, exponential technology growth, pristine-algorithm-theory, self-replication, and godmodding. If those words make you clutch your bleeding ears, Cooking the Books is over here, and there are fine fiction links here and here.

Still with me? Sweet.

What did we talk about? A lot of tech developments from the 2008-2012 era, and their current implications. Mr. Roberts gave a fine list of previous tech disruptors (refrigeration, fountain pens) that warmed the cockles of my pen-loving heart.

Mr. Ismail had some wonderful one-liners guaranteed to make the D.C. wonkerati’s heads snap to attention. To wit (paraphrasing): I grew up hacking my computer; my 18-month-old will be able to hack the family pet. And (invoking Kurzweil): any tech-enabled discipline goes into a doubling pattern that becomes self-sustaining. (Remember that one for me, will you?)

There were great questions from the floor: about morality-backstopping the developers and controllers of disruptive tech; about the disruptions caused by the development curve outpacing GDP growth.

And again and again there was an overwhelming sense of trust for the tech being developed, and the algorithms driving much of the development. Ismail cited biology that can write its own code, and Craig Venter’s self-replicating life form. He assured everyone that it wasn’t going to be like science fiction and Hollywood, though. I, for one, was relieved to hear it, sort of. Well, truthfully, I had questions.

To the presenters’ great credit, there was a ‘whoa nellie, we don’t advise this’ when it came to armed self-navigating, self-replicating drones. Phew! That took care of one question.

But I kept waiting to hear a little of that same whoa nellie about algorithms or self-replicating 3D printers or some of the other tech mentioned, and I didn’t. Instead, what I heard (and this could have just been me) was: Trust your algorithms! They love you and want you to be better!

And it’s this trust that has me writing this post. Because what this trust is built on is a white-room, ‘these things will never talk to each other, much less have to rely on/modify outdated esoteric code to run except this one time as an experiment’, what-could-possibly-go-wrong mentality.

It’s a lot of fun to think about from a science fiction perspective.

It’s a little jarring from a programmer’s perspective.

Is it because we have conveniently forgotten that it’s human nature to make mistakes and (often) to cover those up? Do none of us have a piece of software that’s laggy or buggy because outsourced programmer A forgot to comment their code properly and in-house programmer B decided line 4523 didn’t matter all that much?
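To make that concrete, here is a toy sketch of the A-then-B failure mode above. The checksum function, the magic number, and both programmers are invented for illustration; this isn’t from any real codebase:

```python
# Programmer A's original, uncommented code.  The modulus 251 is the
# largest prime below 256, chosen to spread checksums evenly over the
# range -- but A never wrote that down.
def checksum_a(data):
    total = 0
    for byte in data:
        total = (total + byte) % 251
    return total

# Programmer B, tidying up later, decides 251 "didn't matter all that
# much" and rounds it to 256.  The function still runs; it just hashes
# differently, and anything that stored the old checksums now mismatches.
def checksum_b(data):
    total = 0
    for byte in data:
        total = (total + byte) % 256
    return total

print(checksum_a([200, 100]))  # 49
print(checksum_b([200, 100]))  # 44
```

Nothing crashes, nothing warns. The bug only surfaces later, somewhere far from line 4523.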

Have none of you enjoyed the benefits of the book-and-music recommendation algorithms and Bing searches of late?

Because that’s what we’re looking at. The pristine algorithms, when they get out of the lab and into the real world, meet … well, the real world. Which is messy, not pristine. It has mistakes in it. And sometimes those get written into the self-replicating code.

And then (thank you, Mr. Kurzweil), sometimes that becomes self-sustaining. So here’s hoping that Craig Venter remembers that, and puts a couple of redundancies on the chemical kill switches (and you too, Monsanto — oh, wait, that cat’s out of the bag).
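Since the post leans on that Kurzweil line, here is a toy simulation of the worry: a replicator that copies itself with a small per-character error rate and no repair step. Everything here (the alphabet, the rates, the generation count) is invented for illustration:

```python
import random

random.seed(42)  # deterministic, so the sketch is repeatable

ALPHABET = "ACGT"

def replicate(genome, error_rate=0.01):
    """Copy a genome character by character; each character has a small
    chance of being mis-copied.  Nothing here checks the copy."""
    return "".join(
        random.choice(ALPHABET) if random.random() < error_rate else base
        for base in genome
    )

original = "ACGT" * 25          # a pristine 100-character 'genome'
genome = original
for _ in range(200):            # 200 generations of copying copies
    genome = replicate(genome)

# Every error is inherited by every descendant, so errors pile up;
# a lucky reversion back to the original character is possible but rare.
diffs = sum(a != b for a, b in zip(original, genome))
print(f"characters changed after 200 generations: {diffs}")
```

A one-percent error rate sounds pristine in the lab. Compounded over generations with no kill switch, it rewrites most of the genome.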

Here’s hoping the Big Dog developers do too. Because this? I don’t want a self-replicating swarm of this chasing me.

And you guys with the awesome technology that could make 3D printing in space possible? It was so great to meet you! Please stay away from the grey goo.


12 comments

I program software. The idea of self-piloting cars terrifies me. How many times a year does your phone/computer/game console/etc update itself? Imagine car viruses. Re-imagine one of the recent news stories about a vehicle recall with self-piloting cars instead.

Programs do exactly what we tell them to. Humans are terrible communicators, and we have a long history of refusing to deal with unpleasant realities. We’re all going to be fleeing nanobot-covered grey-goo planets while screaming it wasn’t our fault.
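A tiny sketch of that literalism, invented rather than drawn from any real firmware: the spec says “only update while parked,” but what the program was actually told is “the speed reads zero.”

```python
def ok_to_update(speed_mph, gear):
    # What the spec meant: "only update while the car is parked."
    # What the program was told: "the speed reads zero."
    return speed_mph == 0.0

def ok_to_update_fixed(speed_mph, gear):
    # Closer to the intent: in Park AND not moving.
    return gear == "P" and speed_mph == 0.0

# Stopped at a red light, in Drive: the literal check happily says yes.
print(ok_to_update(0.0, "D"))        # True
print(ok_to_update_fixed(0.0, "D"))  # False
```

The program isn’t malicious; it’s obedient. The gap between what we meant and what we said is where the recalls live.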

What do you think the nanobot equivalent of xeroxing one’s own butt will be?

And all that is ignoring intention. I have met a (very few) people who really could fit that trustingly generous, endlessly kind-hearted, and detail-oriented ideal. Most of our species isn’t like that. We come in a lot of different neuro-types, and even before you layer on experience, not all of those are particularly compatible. We’re often greedy and cruel, we have fits of rage and moral outrage, and as a species we tend to make decisions first and justify them afterward. And we often don’t take responsibility for it: we don’t take it into account beforehand, and we don’t clean up afterward.

The only consolation I have is that these fears are based on a reality that is still some years away. Between now and when that future becomes reality, I hope we’ll have more of these possible problems worked out.