Severely Disabled People Control Robotic Arm Through Thought

Two people who are unable to move their limbs have been able to guide a robot arm to reach and grasp objects using only their brain activity, a paper in Nature reports today [1].

The study participants — known as Cathy and Bob — had had strokes that damaged their brain stems and left them with tetraplegia and unable to speak. Neurosurgeons implanted tiny recording devices containing almost 100 hair-thin electrodes in the motor cortex of their brains, to record the neuronal signals associated with intention to move.

In a trial filmed in April last year and presented with the paper, Cathy, who had her stroke 15 years ago and received the implants in 2005, used her thoughts to steer a robot arm to grasp a bottle of coffee and lift it to her lips. She drank and smiled.

“We’ll never forget that smile,” says Leigh Hochberg, a neuroengineer at Brown University in Providence, Rhode Island, and a co-author of the paper.

The work is part of the BrainGate2 clinical trial, led by John Donoghue, director of the Brown Institute for Brain Science in Providence. His team has previously reported a trial in which two participants were able to move a cursor on a computer screen with their thoughts [2].

“To move from this type of two-dimensional movement to movements involving reaching out for an object, grasping it and then guiding it in three-dimensional space is a huge step for us,” says Donoghue. “It seems like more than one additional dimension in complexity.”

The power of thought

The challenge lies in decoding the neural signals picked up by the participant’s neural interface implant — and then converting those signals to digital commands that the robotic device can follow to execute the exact intended movement. The more complex the movement, the more difficult the decoding task.
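The decoding step can be illustrated with a toy example. The sketch below fits a simple linear filter that maps binned spike counts from a 96-channel array to a three-dimensional velocity command for a robot arm. This is not the study's actual algorithm (the BrainGate work used more sophisticated, Kalman-filter-style decoders); the data here are synthetic, and all names and dimensions are assumptions for illustration only.

```python
import numpy as np

# Hypothetical sketch of linear velocity decoding, NOT the study's method.
# One spike-count channel per electrode on the 96-channel array.
N_UNITS = 96
N_SAMPLES = 2000  # calibration bins

rng = np.random.default_rng(0)

# Synthetic calibration data: spike counts related to velocity by an
# (unknown to the decoder) linear mapping plus noise.
true_weights = rng.normal(size=(N_UNITS, 3))
spike_counts = rng.poisson(lam=5.0, size=(N_SAMPLES, N_UNITS)).astype(float)
velocity = spike_counts @ true_weights + rng.normal(scale=0.5, size=(N_SAMPLES, 3))

# Fit decoder weights by least squares: velocity ≈ spike_counts @ W
W, *_ = np.linalg.lstsq(spike_counts, velocity, rcond=None)

def decode(counts: np.ndarray) -> np.ndarray:
    """Map one bin of spike counts (N_UNITS,) to a 3-D velocity command."""
    return counts @ W

# Each decoded velocity would nudge the robot arm's end effector;
# streaming these commands bin by bin produces a continuous trajectory.
cmd = decode(spike_counts[0])
print(cmd.shape)  # (3,)
```

The more dimensions the device has (reach direction, grasp aperture, wrist orientation), the more signal structure the decoder must extract from the same small neural population, which is why moving from 2-D cursor control to 3-D reach-and-grasp is the hard step Donoghue describes.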

The neuroscientists are working closely with computer scientists and robotics experts. The BrainGate2 trial uses two types of robotic arm: the DEKA Arm System, which is being developed for prosthetic limbs in collaboration with the US military, and a heavier robot arm being developed by the German Aerospace Centre (DLR) as an external assistive device.

In the latest study, the two participants were given 30 seconds to reach and grasp foam balls. Using the DEKA arm, Bob — who had his stroke in 2006 and was given the neural implant five months before the study — was able to grasp the targets 62% of the time. Cathy had a 46% success rate with the DEKA arm and a 21% success rate with the DLR arm. She successfully raised the bottle of coffee to her lips in four out of six trials.

Scientists are euphoric about the results, which show that people who have been paralysed for many years can still be helped to communicate and perform tasks by themselves. Rodrigo Quian Quiroga, a neuroengineer at the University of Leicester, UK, who was not involved in this study, was “amazed” that the brain’s movement intentions could be read so long after a person had been paralysed. “It’s all very promising,” he says.

But Donoghue stresses that there is a long way to go. “Movements right now are too slow and inaccurate — we need to improve decoding algorithms,” he says.

In the meantime, his team is continuing to recruit for the BrainGate2 trial, which is aimed mainly at testing whether the implanting procedure is safe. So far, seven people have received the implants, and none has shown serious adverse effects. The researchers hope to recruit a total of 15 people who have been paralysed by stroke, by neurodegenerative conditions such as amyotrophic lateral sclerosis, or because their spinal cords have been severed.

In the longer term, the scientists want to dispense with the wires that must be attached to a patient’s skull; wireless systems are in development, says Donoghue. Even further in the future, researchers hope to dispense with the robot arms and direct the decoded brain signals straight to the patient’s own muscles.

Paper abstract

Paralysis following spinal cord injury, brainstem stroke, amyotrophic lateral sclerosis and other disorders can disconnect the brain from the body, eliminating the ability to perform volitional movements. A neural interface system [1–5] could restore mobility and independence for people with paralysis by translating neuronal activity directly into control signals for assistive devices. We have previously shown that people with long-standing tetraplegia can use a neural interface system to move and click a computer cursor and to control physical devices [6–8]. Able-bodied monkeys have used a neural interface system to control a robotic arm [9], but it is unknown whether people with profound upper extremity paralysis or limb loss could use cortical neuronal ensemble signals to direct useful arm actions. Here we demonstrate the ability of two people with long-standing tetraplegia to use neural interface system-based control of a robotic arm to perform three-dimensional reach and grasp movements. Participants controlled the arm and hand over a broad space without explicit training, using signals decoded from a small, local population of motor cortex (MI) neurons recorded from a 96-channel microelectrode array. One of the study participants, implanted with the sensor 5 years earlier, also used a robotic arm to drink coffee from a bottle. Although robotic reach and grasp actions were not as fast or accurate as those of an able-bodied person, our results demonstrate the feasibility for people with tetraplegia, years after injury to the central nervous system, to recreate useful multidimensional control of complex devices directly from a small sample of neural signals.