Monkey controls robotic arm with mind

Behind the Headlines

Thursday May 29 2008

A rhesus monkey feeds itself with a thought controlled robotic arm. Courtesy of Andrew Schwartz, University of Pittsburgh

“Monkeys have learnt to feed themselves using a robotic arm controlled by their thoughts”, The Times reported today. It said that this experiment could ultimately help paralysed people and amputees to live more independent lives. Extensive media coverage was given to a study in which two rhesus monkeys were fitted with a brain implant and then trained to control a robotic arm with their thoughts to feed themselves.

A letter to the scientific journal Nature described the study and included a description and videos of the technology known as the "brain-machine interface". Microelectrodes were implanted in the parts of the brain that control movement and the monkeys learned how to generate signals that were used to direct a robotic arm with five types of movement. Complex software allowed the researchers to adjust the speed, direction and end position of the arm so that the electrical impulses from the brain produced a useful movement with which the monkeys fed themselves.

This extensively reported study appears to have been well conducted. Although The Independent referred to this - perhaps justifiably - as a “major breakthrough in the development of robotic prosthetic limbs”, any practical application of this technology is still many years away.

Where did the story come from?

Dr Meel Velliste and colleagues from the University of Pittsburgh and Carnegie Mellon University in Pennsylvania, USA, carried out the research. The study was supported by a grant from the National Institutes of Health and was published in the peer-reviewed scientific journal Nature.

What kind of scientific study was this?

This experimental study was described in a narrative report in which the researchers recounted the methods and results of their experiment, supplemented with video clips of the two monkeys. The researchers reported how previous studies have shown that monkeys could control a cursor on a computer screen using signals from electrodes implanted in the brain. In this study, they aimed to show that these cortical signals could be used to demonstrate “fully embodied control”, that is, direct interaction with the environment.

The monkeys were first taught to operate the robotic arm using a joystick, and were given incentive to use the arm to feed themselves. Once they had mastered this, they progressed to controlling the arm through thought alone. This was achieved by inserting implants in the motor cortex region of the brain, the area that controls movement. By mapping spikes in neural activity in different locations of the motor cortex, the researchers were able to translate this information into movement instructions for the arm.
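The idea of translating spikes in neural activity into movement instructions can be illustrated with a simplified, hypothetical decoder. The sketch below assumes a basic "population vector" scheme, in which each recorded unit is assigned a preferred direction and firing above its baseline rate votes for movement in that direction; the unit counts, rates and decoding rule here are illustrative assumptions, not the study's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each recorded cortical unit is assigned a random
# "preferred direction" in 3-D space. In a real system these would be
# fitted from calibration recordings.
n_units = 20
preferred = rng.normal(size=(n_units, 3))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)
baseline = np.full(n_units, 10.0)  # assumed baseline firing rates (spikes/s)

def decode_velocity(firing_rates, gain=0.01):
    """Population-vector decode: units firing above baseline 'vote' for
    their preferred direction; the weighted sum of those directions
    gives an arm velocity command."""
    weights = firing_rates - baseline
    return gain * weights @ preferred

# Simulated rates in which units aligned with the +x axis fire harder,
# as if the animal intended to move the arm in that direction
rates = baseline + 5.0 * np.clip(preferred @ np.array([1.0, 0.0, 0.0]), 0, None)
v = decode_velocity(rates)
print(v)  # decoded 3-D velocity command
```

In this toy version the decoded command has a positive x-component, because only units whose preferred direction points towards +x fire above baseline. The study's real interface had to control more degrees of freedom (arm position plus the gripper) and was continuously recalibrated, but the underlying principle, weighting each unit's contribution by its activity, is the same.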

The arm could move in multiple directions and had a shoulder, elbow and hand, which meant that the animal had to co-ordinate five separate movements to get the food: three at the shoulder, one at the elbow and a gripping movement of the hand. The researchers observed the interaction between the arm, the food target and the mouth, also recording the target’s three-dimensional location using a positioning device.

Electrical signals from the brain were used for reaching and retrieval movements as well as the loading and unloading of food as it was placed in the mouth. The researchers note that the gripper had to be within about 5–10mm of the target food’s centre position to successfully collect the food but that less accuracy was required for inserting the food into the mouth because the monkey could move its head to meet the gripper.

Two monkeys, called A and P, were tested. Monkey A was tested on two separate days. The researchers improved the methods between these two days but say that these improvements could not be used with monkey P as recordings from the cortical implant had faded by the time of the second round of experiments. In the improved method, the researchers replaced the robotic arm with one that had better mechanical and control properties. They also introduced a new presentation device that recorded the target location and removed the tendency of the human presenter to help the loading by moving their hand to meet the gripper. The gripper control was also improved.

What were the results of the study?

Monkey A performed two days of the continuous self-feeding task with a combined success rate of 61% (67 successes out of 101 attempted trials on the first day, and 115 out of 197 on the second day).
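The combined figure can be checked by pooling the two days' counts, for example:

```python
# Day-by-day results for monkey A, as reported above
successes = [67, 115]
attempts = [101, 197]

combined = sum(successes) / sum(attempts)  # 182 / 298
print(f"{combined:.0%}")  # prints "61%"
```

Note that the two days differed (115/197 is a lower rate than 67/101), which is consistent with the methods having been changed between sessions.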

Monkey P also performed a version of the continuous self-feeding task, this time with an average success rate of 78% (1,064 trials over 13 days). Monkey P typically used just 15–25 cortical units (individual recorded neural signals) for control. The researchers say that monkey P’s success rate was higher than monkey A’s because his task was easier.

What interpretations did the researchers draw from these results?

The researchers say that “this demonstration of multi-degree-of freedom embodied prosthetic control paves the way towards the development of dexterous prosthetic devices that could ultimately achieve arm and hand function at a near-natural level”.

This means that by showing that monkeys are capable of manipulating a robotic arm in several dimensions, the researchers are hopeful that artificial devices capable of skilful hand and arm movements, close to natural human function, will follow.

What does the NHS Knowledge Service make of this study?

This extensively reported study appears to have been well conducted. However, the immediate implications for people with amputated limbs, or those paralysed by accident or neurological disease, may have been overstated. The fact that the researchers were able to improve their software and the robotic control between experiments on the different monkeys suggests that this type of research is improving continuously. Further research in neurobiology and bioengineering is needed to perfect the hardware and software used in these devices before it is known whether they could be implanted in humans.

Sir Muir Gray adds...

The brain is a big electronic control box; now that the electronic energy of the brain can be captured, it can drive a machine just like it can drive a limb.