The Reality of Robot Surrogates

How far are we from sending robots into the world in our stead?

23 September 2009—Imagine a world where you're stronger, younger, better looking, and don't age. Well, you do, but your robot surrogate—which you control with your mind from a recliner at home while it does your bidding in the world—doesn't.

It's a bit like The Matrix, but instead of a computer-generated avatar in a graphics-based illusion, in Surrogates—which opens Friday and stars Bruce Willis—you have a real titanium-and-fluid copy impersonating your flesh and blood and running around under your mental control. Other recent films have used similar concepts to ponder issues like outsourced virtual labor (Sleep Dealer) and incarceration (Gamer).

The real technology behind such fantastical fiction is grounded in both far-out research and practical robotics. So how far away is a world of mind-controlled personal automatons?

"We're getting there, but it will be quite a while before we have anything that looks like Bruce Willis," says Trevor Blackwell, the founder and CEO of Anybots, a robotics company in Mountain View, Calif., that builds "telepresence" robots controlled remotely like the ones in Surrogates.

Telepresence is action at a distance, or the projection of presence where you physically aren't. Technically, phoning in to your weekly staff meeting is a form of telepresence. So is joysticking a robot up to a suspected IED in Iraq so a soldier can investigate the scene while sitting in the (relative) safety of an armored vehicle.

Researchers are testing brain-machine interfaces on rats and monkeys that would let the animals directly control a robot, but so far the telepresence interfaces at work in the real world are physical. Through wireless Internet connections, video cameras, joysticks, and sometimes audio, humans move robots around at the office, in the operating room, underwater, on the battlefield, and on Mars.
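At its core, that kind of physical teleoperation is a command loop: the operator's joystick input is serialized and sent over the network, and the robot integrates the received velocities into its pose. The snippet below is a minimal, hypothetical sketch of that loop (the message format and update step are invented for illustration, not any vendor's actual protocol), with the network hop omitted:

```python
import json

# Hypothetical wire format: the operator packs joystick axes into JSON;
# the robot unpacks them and integrates velocity into its pose.
def pack_command(forward, turn):
    """Operator side: serialize joystick axes for transmission."""
    return json.dumps({"forward": forward, "turn": turn})

def apply_command(pose, message, dt=0.1):
    """Robot side: update (x, heading) from one received command."""
    cmd = json.loads(message)
    x, heading = pose
    return (x + cmd["forward"] * dt, heading + cmd["turn"] * dt)

pose = (0.0, 0.0)
for _ in range(10):  # ten 100-ms control ticks
    pose = apply_command(pose, pack_command(forward=0.5, turn=0.0))
print(pose)  # the robot has rolled roughly half a meter forward
```

A real system layers video feedback, latency compensation, and safety stops on top of this loop, but the structure—sample input, transmit, integrate—is the same.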

A recent study by NextGen Research, a market research firm, projects that in the next five years, telepresence will become a significant feature of the US $1.16 billion personal robotics market, meaning robots for you or your home.

According to the study's project manager, Larry Fisher, telepresence "makes the most sense" for security and surveillance robots that would be used to check up on pets or family members from far away. Such robots could also allow health-care professionals to monitor elderly people taking medication at home to ensure the dosage and routine are correct.

Right now, most commercial teleoperated robots are just mobile webcams with speakers, according to NextGen. They can be programmed to roam a set path, or they can be controlled over the Internet by an operator. iRobot, the maker of the Roomba floor cleaner, canceled its telepresence robot, ConnectR, in January, choosing to wait until such a robot would be easier to use. But plenty of companies, such as Meccano/Erector and WowWee, are marketing personal telepresence bots.

Blackwell's Anybots, for example, has developed an office stand-in called QA. It's a Wi-Fi-enabled, vaguely body-shaped wheeled robot with an ET-looking head that has cameras for eyes and a display in its chest that shows an image of the person it's standing in for. You can slap on virtual-reality goggles, sensor gloves, and a backpack of electronics to link to it over the Internet for an immersive telepresence experience. Or you can just connect to the robot through your laptop's browser.


Blackwell has been using QA to telecommute to his office for the past few weeks so he can enjoy the "sun streaming in" his windows in the morning while his robot sits next to his desk at the office, available for anyone to approach and ask him a question. He can hear and answer through a Bluetooth headset on his end, and he can follow colleagues if they want to show him something. "Or I can go pester them in their cubicles," he says.

Blackwell's real dream is a teleoperated robotic servant to wash the dishes and check the oil in his car while he's watching the kids. That might be creepy, considering it means that someone else—whether from a call-center type of operation run by a large company, an outsourced sea of computers, or someone logged in from three doors down (granted, with your approval)—could be peering into your world from afar (à la Sleep Dealer). But it could free up time for the good things in life if done right, Blackwell says.

Not all of today's telepresence robots are for personal use. Yulun Wang, CEO of InTouch Health, a Santa Barbara, Calif., company that provides medical telepresence robots to hospitals, says his goal is to solve the problem of getting "the right [medical] expertise to the right place at the right time." For urgent health-care needs such as stroke, critical care trauma, or high-risk pregnancy, medical specialists aren't available at every care center, Wang says.

That's where robots can help out. The St. Joseph Mercy Oakland hospital, in Pontiac, Mich., for example, "has the physician expertise and the technology to provide the latest in stroke care," Wang says. But outlying hospitals do not. With InTouch Health's RP-7i robot, controlled by a remote doctor through a joystick and a secure Internet connection, 31 Michigan hospitals take advantage of St. Joseph's doctors to treat patients with symptoms of stroke. According to Wang, only 4 percent of stroke patients across the country get treated with the right drugs. That's partly because drugs need to be administered within three hours of the first symptoms, which often can't happen in a remote location. Within the Michigan RP-7i network, Wang says, 85 percent of patients are now treated appropriately and in time.

RP-7 (RP-7i's predecessor) has also helped train and advise surgeons from more than 8000 kilometers away. In 2007, Dr. Alex Gandsas, head of the Bariatric and Minimally Invasive Surgery division of Sinai Hospital in Baltimore, used the robot to remotely train two Argentine doctors on a procedure to treat morbid obesity. For three months, the doctors in Bahía Blanca, Argentina, logged in to a robot in Baltimore that followed Gandsas on his rounds and in surgery.

When it came time to schedule three surgeries in Argentina, Gandsas reversed the process. He sent the robot to Argentina and logged in himself, becoming the "virtual doctor" observing during the procedures and postoperative rounds. "We concluded that operations can be learned totally by remote presence," Gandsas says. And with the number of surgeons graduating from medical school decreasing while the population increases, Gandsas believes "telemedicine will come to the rescue."

But no one would watch a movie about a doctor—or an action hero, for that matter—joysticking a robot around. So in Surrogates and many other films, the connection to the robot goes straight to the brain. "The keyboard was invented in the 19th century, and we still use it and love it," says Jose Principe, director of the Computational NeuroEngineering Laboratory at the University of Florida. "But that doesn't mean that there aren't better alternatives. The brain has exquisite control of muscles. There's the possibility of finding the code and translating it to machines."


Principe and his collaborator, Justin Sanchez of UF's departments of pediatrics, neuroscience, and biomedical engineering, hope to link computer algorithms with biological systems, "short-circuiting the brain–spinal cord–muscle link" and using the brain's "intent" to control a robotic device.

In the late 1990s, neuroscientists discovered that by implanting electrodes in rats' brains and connecting them to computers operating robotic levers, it was possible to decode the intent of rats to move the levers for a food or drink reward. Then monkeys succeeded in mentally playing video games to get a juice reward. These experiments showed that the animals' motor cortices continued to produce the brain patterns they'd learned when physically moving a robot, even when the animals weren't actually moving their paws or arms.

The latest work from Principe and Sanchez is moving one level further, using the motor cortex for actions that aren't related to movement. Rats in their lab learned to maneuver a robotic gripper outside their cages to move an LED sideways—an action unrelated to any movement the rat can make on its own. Principe says this is a proof of concept that shows rats are using the motor cortex to do something other than move a body part.

The hope is that, eventually, a "high-level dialogue" between brains and computer agents could decide how to move a robot in the world, says Principe.

In the meantime, for lack of a good brain interface to the robotic surrogate of your dreams (and a few other technical obstacles), you'd best go out and face the world yourself.