A clever new system lets paralyzed patients and computers work together to control a robot, connecting locked-in people with the world.

Over recent months, in José del R. Millán’s computer science lab in Switzerland, a little round robot, similar to a Roomba with a laptop mounted on it, bumped its way through an office space filled with furniture and people. Nothing special, except the robot was being controlled from a clinic more than 60 miles away—and not with a joystick or keyboard, but with the brain waves of a paralyzed patient.

The robot’s journey was an experiment in shared control, a type of brain-machine interface that merges conscious thought and algorithms to give disabled patients finer mental control over devices that help them communicate or retrieve objects. If the user experiences a mental misfire, Millán’s software can step in to help. Instead of crashing down the stairs, for instance, the robot would recalculate to find the door.

Such technology is a potential life changer for the tens of thousands of people suffering from locked-in syndrome, a type of paralysis that leaves patients with only the ability to blink. The condition is usually incurable, but Millán’s research could make it more bearable, allowing patients to engage the world through a robotic proxy. “The last 10 years have been like a proof of concept,” says Justin Sanchez, director of the Neuroprosthetics Research Group at the University of Miami, who is also studying shared control. “But the research is moving fast. Now there is a big push to get these devices to people who need them for everyday life.”

Millán’s system, announced in September at Switzerland’s École Polytechnique Fédérale de Lausanne, is a big step in making brain-machine interfaces more useful by splitting the cognitive workload between the patient and the machine. Previously, users had to fully concentrate on one of three commands—turn left, turn right, or do nothing—creating specific brain wave patterns detected by an electrode-studded cap. That system exhausted users by forcing them to think of the command constantly. With shared control, the robot quickly interprets the user’s intention, allowing them to relax mentally. Millán is now developing software that is even better at weeding out unrelated thoughts and determining what the user really wants from the machine.
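The division of labor described above can be sketched in a few lines of code. This is not Millán’s actual software—every function name, threshold, and data structure here is an assumption for illustration—but it shows the basic shared-control idea: act on a decoded brain command only when the classifier is confident, and let the machine override or replan when the command would lead somewhere unsafe.

```python
# Illustrative sketch only (hypothetical names and thresholds), showing how
# shared control might blend a noisy brain-signal command with the robot's
# own sensing.

def decode_intent(probabilities):
    """Accept the decoded command only when the classifier is confident.

    `probabilities` maps each command ("left", "right", "none") to the
    classifier's confidence that the user is thinking it.
    """
    command, confidence = max(probabilities.items(), key=lambda kv: kv[1])
    return command if confidence >= 0.7 else "none"

def shared_control(probabilities, path_blocked):
    """Let the machine step in when the command is uncertain or unsafe.

    `path_blocked` maps a direction to True if the robot's sensors see an
    obstacle that way (e.g., a stairwell).
    """
    intent = decode_intent(probabilities)
    if intent == "none":
        return "continue"   # no confident command: keep the current behavior
    if path_blocked.get(intent):
        return "replan"     # e.g., avoid the stairs and find the door instead
    return intent           # confident, safe command: obey the user

# A confident "turn left" toward a blocked path triggers a replan:
print(shared_control({"left": 0.8, "right": 0.1, "none": 0.1},
                     {"left": True, "right": False}))
```

Because the robot keeps acting sensibly between confident commands, the user no longer has to hold a command in mind constantly, which is what made the earlier interface so exhausting.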

Although the disabled will probably be the first beneficiaries of Millán’s technology, we may all eventually end up under the scanner. Millán and auto manufacturer Nissan recently announced they are collaborating on a shared-control car that will scan the driver’s brain waves and eyes and step in if the mind—and the Altima—begin to wander.