The Singularity Is Clear

Getting down to brass tacks about one of Interactive's favorite buzzwords

As human beings move into an age when we can keep our elderly alive into their hundreds, when fresh fruit is readily available year-round in regions where folks once ate salted fish all winter, and when infant mortality rates are so low you'd think the Geico lizard was working the maternity ward, we've lost touch with one another and with our own humanity.

Technology and innovation are making miracles of literally biblical origin and scale seem a little flat. Think of the Internet. Now think of the Tower of Babel. Now think, what was so great about the Tower of Babel? The Internet is the Library of Alexandria, the Tower of Babel, and every useful aspect of telephony combined. Just think of all the people who go around thanking "the Universe" instead of "God" for good fortune, or praying to that same endless celestial void for the manager's position at the food co-op. Folks today thank the Universe because it makes the idea of infinity and the unknown feel tangible and scientific. As I type on my slender rectangle capable of computing millions of times faster than I ever could, I sit in a silver cylinder, flying across the country while connected – midair, mind you – to a decentralized repository of all the world's knowledge. Looking at miracles like that, it's no wonder that technologists have developed a religiosity in their field. This year at SXSW, some of these modern prophets will come to share their robot gospel.

The high prophet of that faith is a vitamin-guzzling synthesizer magnate named Ray Kurzweil. His highly influential book The Age of Spiritual Machines posits that in the next few decades, artificial intelligence will radically outpace human intellect. More importantly, computers with this type of intelligence – called artificial general intelligence or strong AI – will be able to learn and problem-solve independent of programmers, utilizing the full spectrum of human cognitive abilities. What gets Kurzweil's buns so toasty about the prospect of strong AI isn't how humans will use it, but how machines will. See, the very moment a computer (theoretically) learns to think for itself, it will (hypothetically) be able to improve its own cognitive abilities at an explosive, ever-accelerating rate. That's where the "singularity" part of the equation comes in: The moment strong AI gets a foothold, it will be like dropping the bass on the world's longest, most confusing dubstep track ever. (I am, of course, referring to the history of humankind.) Everything after that moment is up for debate. If Kurzweil's vision comes to fruition, we will see the dawn of a new age where scarcity and mortality are things of the past. Every resource will be easily fabricated using molecular nanotechnology, and the consciousnesses of humans everywhere will be mapped and uploaded to computers. And why not? With an all-knowing strong AI system running the show, no problem will be unsolvable.

What's being lost in the static among singularity devotees is just how irresponsibly we're priming ourselves for the moment of our electronic ascension. At panels like "Unmanned Government: The Autonomous Future" and "From Rosie to Siri: Shifting Robotic Perceptions," SXSW attendees can expect to get slathered with a nice glossy coat of pro-tech exculpation. "Unmanned Government: The Autonomous Future," especially, takes this tack just a short stumble away from the real state of robot affairs: More than anything else, we're building and using autonomous bots on the battlefield or as literal vehicles of commerce, as in the case of Amazon's drone delivery system. Google's driverless cars might save lives on the highway, but as long as we're still using Predator drones to kill people from behind a computer monitor, we've only displaced the body count. This is because many of the people at the forefront of robot and artificial general intelligence design walk to work with Jetsons daydreams and bank accounts loaded up with DARPA dollars.

As debate about the use of drone aircraft continues to boil, one thing is entirely evident: Without a computer god to tell us how to run things, autonomous killing robots are in the wrong hands. Spike Jonze's Her encourages us to think that sentient computers will be excited to hang out with us, fall in love, and eventually leave us because life is suffering. In all likelihood, a computer with ever-unfolding intelligence and insight would have hardly any interest at all in our goings-on. Why would it? I don't follow my three cats' skirmishes; they're cute, but I couldn't care less about their pointless territory wars. Moreover, when I don't feel like springing for Fancy Feast, they eat Meow Mix – so why would a godlike computer treat us any differently?