Paralyzed Patients Can Now Control Android Tablets With Their Minds

Patient T6 was barely middle-aged when she began losing muscle function.

A talented musician with a love for red lipstick, T6 was diagnosed with amyotrophic lateral sclerosis (ALS), a progressive—and so far unstoppable—neurodegenerative disorder that eats away at the motor neurons that control movement. Speech and swallowing are often among the first functions to go. Within a few years, T6 was paralyzed and hooked to a ventilator to breathe.

T6’s story may have ended there. Although modern neurotechnologies have given paralyzed patients mind-controlled robotic limbs or even the ability to walk again, brain-machine interfaces haven’t been able to reopen access to a similarly indispensable world: the digital universe that gives us email, Google, YouTube, and all the associated conveniences.

In 2012, T6 made a decision that changed her story: she had a tiny 96-channel microelectrode array implanted into the motor regions of her brain. The implant was barely the size of a fingerprint, but it became the linchpin that hooked up her thoughts and needs with the online world.

This month, in an open-access study published in PLOS ONE, a team reported the first brain implant system that lets patients use their thoughts to navigate an off-the-shelf Android tablet.

With just her thoughts, T6 was able to send emails, chat with other paralyzed patients in the trial, Google random questions, and even shop on Amazon. For the first time since she became paralyzed, T6 regained access to the entire commercially available Google Play ecosystem and the digital world.

The BrainGate Experiment

A group of three patients including T6 was enrolled in a multiyear trial overseen by BrainGate. A multidisciplinary research consortium with neurologists and engineers from Brown University, Massachusetts General Hospital, Stanford University, and others, BrainGate has worked for decades to give paralyzed patients their independence back.

An initial success came a few years ago, when BrainGate developed and verified a thought-controlled typing system that lets paralyzed and locked-in patients type out their thoughts. For those with no other means of communication, this was life-changing.

Yet the research group wasn’t satisfied. Although efficient, the user interface looked like an early DOS system, with none of the intuitive graphical elements we take for granted. And perhaps more importantly, text processing is just a tiny sliver of our digital world. What if, they pondered, we could give patients the ability to control a virtual mouse, opening up the entire internet and app stores?

To test the hypothesis, they recruited three paralyzed patients, all of whom had microelectrode arrays implanted into their motor cortex. In addition to T6, there was a 51-year-old African-American ALS patient dubbed T9, and a 63-year-old patient, T5, who had lost mobility due to a spinal cord injury.

All patients were fitted with a small rectangular box outside their skulls that held the necessary hardware. A recording system called NeuroPort recorded the electrical cacophony within groups of neurons inside the motor cortex. To decipher intention, the signals were passed on in real time to a computer running custom software for processing and decoding.
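For the technically curious: brain-computer interfaces of this kind are commonly built around linear filters that map binned spike counts to a cursor velocity. The study doesn’t spell out its decoding math here, so the toy sketch below, with invented channel weights and baselines, only illustrates the general idea:

```python
# Toy linear decoder: one time bin of spike counts -> 2-D cursor velocity.
# The weights and baselines below are invented for illustration; in a real
# system they are fit to each patient's neural data during calibration.

def decode_velocity(spike_counts, weights, baseline):
    """Map one bin of per-channel spike counts to an (vx, vy) velocity."""
    vx = sum(w[0] * (c - b) for w, c, b in zip(weights, spike_counts, baseline))
    vy = sum(w[1] * (c - b) for w, c, b in zip(weights, spike_counts, baseline))
    return vx, vy

# Three hypothetical channels, each with (x, y) weights and a baseline rate.
weights = [(0.5, 0.0), (0.0, 0.5), (-0.25, 0.25)]
baseline = [2, 2, 2]

print(decode_velocity([4, 2, 2], weights, baseline))  # channel 0 fires above baseline
print(decode_velocity([2, 2, 2], weights, baseline))  # all channels at baseline: no motion
```

In practice the real decoder runs continuously, turning each fresh bin of activity into a small nudge of the on-screen cursor.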

Finally, the output of the decoding algorithm was sent wirelessly to a Bluetooth interface that was programmed to work like a Bluetooth mouse. Using standard protocols, the researchers connected the Bluetooth interface to a Google Nexus 9 tablet running Android OS 5.1.
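Standard boot-protocol mice report movement as a three-byte packet: a button bitmask followed by signed X and Y deltas. The article doesn’t describe the BrainGate interface’s firmware, but a minimal sketch of packing such a report looks like this:

```python
import struct

def mouse_report(dx, dy, left_click=False):
    """Pack a boot-protocol mouse report: 1 button byte + signed dx, dy."""
    buttons = 0x01 if left_click else 0x00
    # dx and dy are signed 8-bit deltas, clamped to the valid range.
    clamp = lambda v: max(-127, min(127, v))
    return struct.pack("<Bbb", buttons, clamp(dx), clamp(dy))

print(mouse_report(10, -5).hex())                 # move right and up, no click
print(mouse_report(0, 0, left_click=True).hex())  # click in place
```

Because the tablet just sees a standard mouse, no special software is needed on the Android side; any device that speaks the mouse protocol works.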

It’s like reprogramming your smartphone to work as a mouse for your computer or tablet, except with this system, your thoughts alone can control the movement and clicks of the mouse.

Each patient selected his or her own trigger for a mouse click. For example, patient T6 would imagine squeezing her left hand, which generates a specific neural activation pattern that the interface can pick up and decipher to click the mouse. In contrast, T5’s “telekinetic” trigger was flexing his left arm, whereas T9 pictured squeezing his right hand.
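One simple way to turn an imagined squeeze into a click, purely for illustration (the study’s actual decoder is more sophisticated), is to compare the live firing-rate vector against a template recorded during calibration and fire a click when the match crosses a threshold:

```python
def cosine_similarity(a, b):
    """How closely two firing-rate vectors point in the same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def is_click(rates, template, threshold=0.9):
    """Click when live firing rates resemble the calibrated pattern."""
    return cosine_similarity(rates, template) >= threshold

# Hypothetical template recorded while a patient imagined a hand squeeze.
template = [8.0, 1.0, 6.0, 0.5]

print(is_click([8.2, 0.9, 5.8, 0.6], template))  # matching pattern
print(is_click([1.0, 7.0, 0.5, 6.0], template))  # unrelated pattern
```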

On average, this initial calibration period took roughly two weeks, working just a dozen minutes each day. “Methods for further reducing this initial calibration period have been implemented more recently,” the authors noted.

Easy and Natural

Once the BrainGate team finished calibration, it was off to the races.

The patients were each given a tablet initialized to the home screen. First, they were asked to use their new mind-controlled “point and click” superpower to perform some common tasks: they checked and responded to emails using the Gmail app and chatted with each other through Google Hangouts. They checked the weather, browsed through a news aggregator, and spent some quality time on YouTube and Pandora.

T6, for example, looked up “orchid care” on Google and shot a BrainGate researcher a sweet message in celebration of her 1,001st day with the team: “Thank you for all your work and support. This research means a lot to me.”

In all, the patients took under 20 minutes to complete seven tasks using multiple apps. The team encountered no technical issues with decoder calibration or Bluetooth pairing, and no apps crashed during the process.

With the tablets’ autocorrect feature on, the patients could type between 13 and 30 characters per minute, using just their thoughts.
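By the usual convention of five characters per word, that works out to roughly 2.6 to 6 words per minute:

```python
def chars_to_wpm(chars_per_min, chars_per_word=5):
    """Convert a typing rate from characters per minute to words per minute."""
    return chars_per_min / chars_per_word

print(chars_to_wpm(13))  # 2.6 words per minute
print(chars_to_wpm(30))  # 6.0 words per minute
```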

When given “free” time with the tablet, T9 used his mind to Google information and videos explaining his disorder, ALS—while having Pandora streaming his favorite stations in the background. He also opened up a word processor to jot down his thoughts and composed essays for fun. “[The interface is] amazing!” he said, “I have more control over this than what I normally use.”

T5, the oldest of the group, successfully sent his first text message—ever—to his friends and family using only his thoughts. “[I] loved sending the message. Especially because I could inject some humor,” he later told the research staff.

From Clicking to Multi-Gestures

For all that the system allows, it isn’t perfect. Right now, mouse control is limited to point and click, which means that the patients can’t click and drag or use multitouch gestures with their minds. This limits their ability to scroll down a page on the tablet—an obvious shortcoming.

“Some of these limitations would have been overcome by enabling accessibility features found in the Android OS or third-party programs,” the BrainGate team noted, but added that their goal was to check user feasibility using off-the-shelf stock user interfaces.

“Expanding the control repertoire with additional decoded signals, leveraging more optimized keyboard layouts, exploring accessibility features, and controlling other devices and operating systems are subjects of future study,” the team said.

Even with these shortcomings, the study opened up a vast, mature, industry-scale suite of software to the paralyzed patients.

“It was great to see our participants make their way through the tasks we asked them to perform, but the most gratifying and fun part of the study was when they just did what they wanted to do—using the apps that they liked for shopping, watching videos or just chatting with friends,” said lead author Dr. Paul Nuyujukian.

And decades after she was initially diagnosed, T6 could once again play music.

“Providing her with a music keyboard interface on the tablet was as simple as installing an application from the internet,” the team noted.

“One of the participants told us at the beginning of the trial that one of the things she really wanted to do was play music again. So to see her play on a digital keyboard was fantastic,” the team said.

Shelly Xuelai Fan is a neuroscientist-turned-science writer. She completed her PhD in neuroscience at the University of British Columbia, where she developed novel treatments for neurodegeneration. While studying biological brains, she became fascinated with AI and all things biotech. Following graduation, she moved to UCSF to study blood-based factors that rejuvenate aged brains. She is the ...