UNLOCKED

BY KORTNY ROLSTON

FOR PEOPLE WITH AMYOTROPHIC lateral sclerosis (ALS), cerebral palsy or a severe upper spinal cord injury, communicating with the outside world can be nearly impossible.

Many eventually develop “locked-in syndrome,” a condition in which they lose all voluntary muscle control and any ability they might have had to communicate – even blinking their eyes in response to questions. And unlike those who are in a coma, people who suffer from this syndrome are aware of what is going on around them.

“It’s a very difficult syndrome and can be caused by all manner of injuries, neuromuscular diseases and even stroke,” said Patricia Davies, professor of occupational therapy who co-directs Colorado State University’s Brainwaves Laboratory in the College of Health and Human Sciences and specializes in researching severe impairments. “They can’t control any of their muscles and they lose the ability to communicate.”

Davies is partnering with Chuck Anderson, a professor in the Department of Computer Science in the College of Natural Sciences, and William Gavin, a researcher in the Department of Human Development and Family Studies in CHHS, to help those with locked-in syndrome or other limitations communicate with the outside world.

The research team is developing a brain-machine interface that would allow users to activate an electronic device or dictate a message on a computer by changing their mental activity.

“We really envision building a system that would enable people to turn on a television by doing something simple like multiplying 2 times 4 in their head or type an email by completing a series of mental tasks,” Anderson said.

Researchers have long experimented with brain-computer interfaces – or BCIs – which are essentially hardware/software systems that establish a direct pathway from the brain to an external device.

That pathway is created by placing electrodes on the scalp to detect the electrical signals produced by neurons firing in the brain. Those signals are then transmitted via wires to a device that, with the help of a mathematical algorithm, decodes and translates them into a letter, word or action.
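The decode-and-translate step can be sketched in miniature. Everything below is illustrative, not the CSU team's actual method: the signal-energy feature, the per-action "templates," and the nearest-template rule are all stand-ins for whatever the real system learns from training data.

```python
# Minimal, hypothetical sketch of a BCI decoding step: an epoch of scalp
# voltage samples is reduced to one feature, and the feature is matched to
# the nearest action "template" learned during training. All names and
# numbers here are assumptions for illustration only.
import math

def signal_energy(samples):
    """Crude feature: mean squared amplitude of one EEG epoch."""
    return sum(s * s for s in samples) / len(samples)

# Hypothetical templates: the average feature value observed while the
# user performed each mental task during a training session.
TEMPLATES = {
    "turn_on_tv": 2.0,  # e.g., mental arithmetic
    "idle": 0.0,        # resting baseline
}

def decode(samples):
    """Map an epoch of EEG samples to the nearest learned action."""
    f = signal_energy(samples)
    return min(TEMPLATES, key=lambda action: abs(TEMPLATES[action] - f))

# Simulated epochs: a strong oscillation vs. a near-flat resting trace.
arithmetic_epoch = [2.0 * math.sin(0.3 * t) for t in range(256)]
resting_epoch = [0.1 * math.sin(0.3 * t) for t in range(256)]

print(decode(arithmetic_epoch))  # nearest template: turn_on_tv
print(decode(resting_epoch))     # nearest template: idle
```

A real system would use richer features (for example, power in specific frequency bands) and a trained classifier rather than a two-entry lookup, but the pipeline shape – sample, featurize, match, act – is the same.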

But moving these brain-computer interfaces out of the laboratory and into, say, the home has been difficult.

Placing the electrodes over the correct spots is tricky, and with many EEG caps, a conductive gel must be dabbed on the head to improve signal quality. The EEG caps and systems required to make BCI technology work can be expensive. It’s also unclear whether an EEG system will work in a home where there is interference from other electronic devices.

Those are all obstacles Anderson, Davies and Gavin are trying to overcome with their system.

“It needs to be able to work in a home, be easier to use and be more affordable,” Davies said.

During the first phase of the project, the team tested off-the-shelf and higher-end, laboratory-quality EEG systems in the homes of Fort Collins-area people who struggled to communicate with caregivers. One participant, for example, had suffered a traumatic brain injury and could blink his eyes only slightly to communicate.

They collected brain signals and patterns from the patients and also checked for interference from televisions, computers and other devices within the homes.

Their initial results indicate that the brain signal data collected by the lower-cost EEG systems were similar to data from the more costly systems Davies and Gavin use in CSU’s Brainwaves Laboratory. The researchers also found that they were able to block some of the electronic interference.

“It appears these lower-cost systems can work inside homes,” Davies said.

The National Science Foundation-funded project has now entered its second phase, which lies primarily with Anderson and his graduate students.

They are building algorithms that sift through “noise” in the EEG data and quickly classify the different brain signals so they can be interpreted by a computer or sensor as a specific action in real time.
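The "sifting through noise" step can be illustrated with a toy filter. The moving average below is an assumption standing in for whatever denoising the team's algorithms actually perform; it shows only the principle that fast electrical interference can be averaged away while slower brain rhythms survive.

```python
# Toy denoising sketch: a moving-average (low-pass) filter suppresses fast
# interference while preserving a slow underlying rhythm. This is an
# illustrative stand-in, not the CSU team's algorithm.
import math

def moving_average(signal, window=4):
    """Each output sample averages `window` consecutive input samples,
    attenuating components that oscillate faster than the window span."""
    return [sum(signal[i:i + window]) / window
            for i in range(len(signal) - window + 1)]

# Slow "brain rhythm" plus fast alternating interference of amplitude 0.5.
raw = [math.sin(0.05 * t) + (0.5 if t % 2 == 0 else -0.5)
       for t in range(200)]

# With an even window, the alternating interference cancels exactly,
# leaving a close approximation of the slow component.
clean = moving_average(raw, window=4)
```

Real EEG cleanup is more involved – band-pass filters, artifact rejection, spatial filtering across electrodes – but the goal is the same: hand the classifier a signal dominated by brain activity rather than by the room's electronics.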

For example, the complex algorithms need to be able to correctly interpret the signal emitted by someone multiplying 2 times 4 so the system can reliably turn on the television. “We need to know what these signals look like when multiplying or performing a specific action for this system to work,” Anderson said.

To do that, Anderson and his students have incorporated the field data from the initial in-home phase, and they also have people wear an EEG cap in their lab while performing mental tasks so they can continue to classify those brain signals. They have now refined the algorithms enough that participants can direct a robot in the lab to move by completing mental tasks.
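Once the classifier can label a mental task, directing a robot reduces to a lookup from labels to motor commands. The task names and command table below are hypothetical, chosen only to show the shape of that final translation step.

```python
# Hypothetical mapping from classified mental tasks to robot commands.
# The task labels and commands are illustrative assumptions, not the
# actual vocabulary used in the CSU lab.
COMMANDS = {
    "mental_arithmetic": "forward",
    "imagined_left_hand": "turn_left",
    "imagined_right_hand": "turn_right",
    "rest": "stop",
}

def drive(classified_tasks):
    """Translate a stream of classifier outputs into robot commands,
    silently dropping any label the command table does not recognize."""
    return [COMMANDS[task] for task in classified_tasks if task in COMMANDS]

print(drive(["mental_arithmetic", "rest", "imagined_left_hand"]))
# ['forward', 'stop', 'turn_left']
```

In practice this layer would also debounce noisy classifier output – requiring, say, several consecutive matching labels before issuing a command – so a single misclassified epoch does not send the robot off course.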

Anderson’s goal is to build artificial intelligence into the system so it adapts to a specific person’s brain signals and learns from their thoughts. He eventually would like to install it in a robotic caregiver that could be directed to fetch its patient water or help with other tasks.

In the meantime, he continues to refine the algorithms, which are the brains behind the technology.

Once that is complete, the team plans to return to the homes of the patients they visited during the initial field study and test the system. They will ask patients to perform a series of mental tasks to either type a message or control another electronic device.