
Imagine: a robotic prosthetic arm that you can not only control with your brain, but actually feel when it touches something. This might sound like science fiction, but a team of researchers has designed and tested just such a system in monkeys. They call their setup a "brain-machine-brain interface," or BMBI, and it has the potential to give amputees closer-to-normal functionality.

Brain-machine interfaces (BMIs) have come a long way in recent years, enabling complex robotic limbs with multiple degrees of freedom, but people rely on tactile feedback for fine control of their limbs. Try to imagine picking up something as simple as a glass without being able to feel when your fingers are around it: awkward and difficult. Unfortunately, tactile feedback is one area where there has been far less progress. One group used vibrational feedback to signal touch, but otherwise most BMI systems have relied on sight alone, until now.

The BMI portion of this new approach is similar to one we reported on a few years ago. The researchers implanted microelectrode wires in the primary motor cortex (also known as M1) of a monkey's brain. M1 is the region of the brain responsible for movement, so by measuring electrical signals at particular locations, the interface can translate that activity directly into control of a robotic limb.
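The actual decoding algorithms can get quite sophisticated; here's a rough sketch of the general idea, using a toy linear decoder in which each recorded neuron contributes to the arm's velocity in proportion to a learned weight. The neuron count and weights below are hypothetical, not the ones used in the study.

```python
import numpy as np

# Toy linear decoder: each recorded M1 neuron contributes to the decoded
# 2D velocity of the virtual arm in proportion to a (here, random) weight.
# Neuron count and weights are hypothetical stand-ins for a trained decoder.

rng = np.random.default_rng(0)

N_NEURONS = 96                       # hypothetical electrode/neuron count
W = rng.normal(size=(2, N_NEURONS))  # learned weights: rates -> (vx, vy)

def decode_velocity(firing_rates):
    """Map per-neuron firing rates (spikes/s) to a 2D arm velocity."""
    return W @ firing_rates

# Example: one 100 ms bin of spike counts converted to rates, then decoded
spike_counts = rng.poisson(lam=3.0, size=N_NEURONS)
vx, vy = decode_velocity(spike_counts / 0.1)
print(f"decoded velocity: ({vx:.2f}, {vy:.2f})")
```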

There weren’t any robotic monkey arms here, though; the monkeys controlled a virtual arm on a computer monitor. Initially, this was done through a joystick (which they learned to use through fruit juice rewards). After it was clear that the monkeys could use the joystick to find virtual objects, the researchers switched the control of the virtual arm to the BMI—the monkeys still moved the joystick, but their brain signals actually moved the arm. The joystick was necessary because monkeys, unlike most humans, don’t respond well to being told to move their arms in a certain way.

Here’s where the new stuff comes in (and the second "B" in BMBI). Through additional microelectrodes implanted in the primary somatosensory cortex (or S1), the team sent what’s known as intracortical microstimulation (ICMS, essentially low-level electrical pulses) to the area of the brain responsible for the sense of touch. This tells the brain that the hand is touching something (even when it isn’t, or the hand is missing entirely).

Together, these two sets of electrodes form the BMBI feedback loop. The M1 electrodes pick up a movement signal, the virtual arm moves, and when it touches something, an ICMS signal is sent back to the S1 region of the brain. Feeling, without actual contact.
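To make the loop concrete, here's a simplified sketch of one cycle. Everything below is a hypothetical stand-in (the real system runs on dedicated neural recording and stimulation hardware), but the structure mirrors the loop just described: record from M1, decode movement, move the arm, and stimulate S1 on contact.

```python
import numpy as np

rng = np.random.default_rng(1)

def record_m1_firing_rates(n_neurons=96):
    """Stand-in for reading motor-cortex activity (random rates here)."""
    return rng.poisson(lam=3.0, size=n_neurons) / 0.1

def decode_velocity(rates):
    """Toy decoder: collapse the rates into a small 2D velocity."""
    return np.array([rates.mean(), -rates.mean()]) * 0.01

def stimulate_s1(texture):
    """Stand-in for delivering an ICMS pulse train to S1."""
    print(f"ICMS feedback: felt {texture!r}")

def bmbi_cycle(hand_pos, objects):
    """One pass through the loop: record, decode, move, touch, stimulate."""
    hand_pos = hand_pos + decode_velocity(record_m1_firing_rates())
    for obj_pos, texture in objects:
        if np.linalg.norm(hand_pos - obj_pos) < 0.5:  # hand over the object?
            stimulate_s1(texture)
    return hand_pos

hand = np.zeros(2)
objects = [(np.array([0.3, -0.3]), "coarse")]
for _ in range(3):
    hand = bmbi_cycle(hand, objects)
```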

The scientists made their system more sophisticated by using slightly different ICMS signals to indicate different virtual textures, and found that the monkeys could distinguish between them. This was fairly straightforward in a virtual environment, but moving the system to the real world would take some work: robotic fingers would need sensors capable of recognizing textures.
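The study's exact stimulation parameters aren't spelled out above, but one plausible encoding, used here purely for illustration, is to give each texture its own pulse rate within a stimulation window:

```python
# Purely illustrative: encode each virtual texture as an ICMS pulse train
# with a different pulse rate. The study's actual stimulation parameters
# (amplitude, frequency, temporal pattern) are not reproduced here.

TEXTURE_PULSE_HZ = {   # hypothetical texture -> pulse-rate mapping
    "smooth": 0,       # no stimulation: nothing to feel
    "fine": 200,
    "coarse": 400,
}

def icms_pulse_times(texture, duration_s=0.1):
    """Return pulse onset times (in seconds) for one stimulation window."""
    rate = TEXTURE_PULSE_HZ[texture]
    if rate == 0:
        return []
    return [i / rate for i in range(int(duration_s * rate))]

print(len(icms_pulse_times("coarse")))  # 40 pulses in a 100 ms window
```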

One tricky issue was interference: the ICMS feedback could disrupt the BMI's detection of the brain signals for movement. The team's solution was to multiplex the two (think of sending multiple phone calls over a single line), alternating between separate 50 ms intervals for recording and stimulation.
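The 50 ms window length comes from the researchers' scheme; the loop below is otherwise a hypothetical simplification showing the time-division idea, where recording and stimulation never share a window, so stimulation artifacts can't contaminate the recorded movement signals.

```python
# The 50 ms windows come from the article; the loop is a simplification.
WINDOW_S = 0.050

def record_window(start, end):
    """Read M1 activity and decode movement (stand-in)."""
    print(f"record    {start * 1000:5.0f}-{end * 1000:5.0f} ms")

def stimulate_window(start, end):
    """Deliver any pending ICMS feedback (stand-in)."""
    print(f"stimulate {start * 1000:5.0f}-{end * 1000:5.0f} ms")

def run_multiplexed(total_s):
    """Alternate recording and stimulation so they never overlap in time."""
    t = 0.0
    while t < total_s:
        record_window(t, t + WINDOW_S)
        t += WINDOW_S
        stimulate_window(t, t + WINDOW_S)
        t += WINDOW_S

run_multiplexed(0.2)  # two record/stimulate pairs in 200 ms
```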

This is truly exciting research, and it might open the door to the next generation of prosthetic limbs. The BMBI approach presented here completely bypassed the body, communicating only with the brain. It's going to take serious work to extend the concept to an actual prosthetic limb, especially equipping robotic fingers with the sensors a real sense of touch would require, but that is definitely the next step.


Kyle Niemeyer
Kyle is a science writer for Ars Technica. He is a postdoctoral scholar at Oregon State University and has a Ph.D. in mechanical engineering from Case Western Reserve University. Kyle's research focuses on combustion modeling. Email: kyleniemeyer.ars@gmail.com / Twitter: @kyle_niemeyer