It’s a big word, proprioception. Think of it as sensory feedback. Your body provides this feedback continuously as you move. If you move your arm, nerves in the muscles and joints send a stream of positioning information to your brain, reporting where the arm is. Proprioception is what lets you move your arm in well-controlled actions with your eyes closed. From a different perspective, proprioception is helpful, even necessary, for people with paralyzed limbs (such as quadriplegics) who are trying to use a brain-machine interface to do things such as control a computer cursor.

This insight concerning the need for proprioception became the basis of research by Nicholas Hatsopoulos and his team at the University of Chicago (Illinois, USA). Earlier experiments in using the brain to control devices (usually a computer cursor) relied on visual cues alone. This worked, but control was often imprecise. The Hatsopoulos team wondered whether a wearable robot could supply proprioceptive information, and whether that feedback would improve control.

The experiments they devised, using monkeys, fitted a robotic ‘sleeve’ over an animal’s arm. The sleeve acted like an exoskeleton that could move the monkey’s arm in synchronization with the movements of a cursor on a computer screen. The cursor, in turn, was driven by the monkey’s own brain activity, recorded by electrode arrays implanted in its motor cortex. As the monkey ‘thought’ the cursor across the screen, the exoskeleton moved the arm to match, and the moving arm sent proprioceptive feedback signals back to the monkey’s brain. This arrangement increased the accuracy of the monkey’s control of the cursor by 40 percent.
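The closed loop just described can be sketched in a few lines: neural activity is decoded into cursor motion, an exoskeleton slaves the arm to the cursor, and the limb's own movement closes the loop. Everything here (the linear decoder, the tracking gain, the toy spike counts) is a hypothetical illustration, not the apparatus from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def decode_velocity(neural_activity, weights):
    """Toy linear decoder: map a neural activity vector to 2-D cursor velocity."""
    return weights @ neural_activity

def step(cursor_pos, arm_pos, neural_activity, weights, gain=0.5):
    """One cycle of the loop: decode activity, move cursor, slave arm to cursor."""
    velocity = decode_velocity(neural_activity, weights)
    cursor_pos = cursor_pos + velocity                  # cursor moves on screen
    arm_pos = arm_pos + gain * (cursor_pos - arm_pos)   # exoskeleton tracks the cursor
    proprioception = cursor_pos - arm_pos               # residual mismatch sensed by the limb
    return cursor_pos, arm_pos, proprioception

cursor = np.zeros(2)
arm = np.zeros(2)
weights = rng.normal(size=(2, 10)) * 0.1                # made-up decoder weights

for _ in range(20):
    activity = rng.poisson(2.0, size=10).astype(float)  # toy spike counts, 10 neurons
    cursor, arm, feedback = step(cursor, arm, activity, weights)

print(np.linalg.norm(cursor - arm))                     # how closely the arm tracks
```

The interesting design point is the `gain` parameter: with a gain of 1 the arm matches the cursor exactly on every cycle, while smaller values model an exoskeleton that lags slightly behind, which is what generates a nonzero proprioceptive error signal.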

As reported in The Journal of Neuroscience (December 15, 2010: “Incorporating feedback from multiple sensory modalities enhances brain-machine interface control”), the researchers observed the brain activity of monkeys using the wearable robotic sleeve for proprioceptive feedback. They saw an increase in information (firing complexity) in the neural activity compared with trials conducted with visual feedback alone.
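The paper’s measure of information in neural firing is more involved, but the basic idea of quantifying how rich a neuron’s activity is can be illustrated with a simple Shannon-entropy calculation over spike counts. This is an illustrative sketch under that simplification, not the method used in the study, and the data below are synthetic.

```python
import numpy as np

def spike_count_entropy(spike_counts, bins=10):
    """Shannon entropy (in bits) of a neuron's spike-count distribution.

    A neuron whose counts spread over many values carries more information,
    in this simple sense, than one that always fires the same amount.
    """
    hist, _ = np.histogram(spike_counts, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                     # drop empty bins (0 * log 0 is taken as 0)
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(1)

# Synthetic stand-ins: a narrow spike-count distribution versus a broader one.
visual_only = rng.poisson(5.0, size=1000)
visual_plus_prop = rng.poisson(5.0, size=1000) + rng.integers(0, 10, size=1000)

print(spike_count_entropy(visual_only))
print(spike_count_entropy(visual_plus_prop))
```

The broader distribution will typically score higher, mirroring the qualitative finding that richer feedback was accompanied by more informative firing.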

The improvement seen from adding proprioceptive feedback may inform the next generation of brain-machine interface devices, Hatsopoulos said. Scientists are already developing various types of “wearable robots” to augment a person’s natural abilities. Combining a decoder of cortical activity with a robotic exoskeleton for the arm or hand can serve a dual purpose: allowing a paralyzed subject to move the limb while also providing sensory feedback.

To benefit from this solution, a paralyzed patient must have retained some residual sensory information from the limbs despite the loss of motor function – a common occurrence, Hatsopoulos said, particularly in patients with ALS, locked-in syndrome, or incomplete spinal cord injury. For patients who have lost both motor and sensory function, direct stimulation of the sensory cortex may be able to simulate the sensation of limb movement. Further research in that direction is currently underway, Hatsopoulos said.

The concept of wearing robotics to augment or extend physical capability is not new. It has been a staple of comic book heroes, in one way or another, for a long time. In reality, though, the linkage between robotic capability and the human brain remains a major challenge. The addition of proprioceptive capability to this kind of robotics is indicative both of the progress being made and of how much there remains to be discovered.