Unresponsive Wakefulness Syndrome: System to help patients communicate

Researchers at Cluster of Excellence CITEC launch collaborative research project

A new device is meant to help people with severe brain damage communicate with others. The NeuroCommTrainer is designed to interpret brain signals and enable the patient to respond with a “yes” or a “no” using electroencephalogram (EEG) measurements. In addition, the system trains patients to control their brain activity in a targeted way. The NeuroCommTrainer project has now begun and is funded with 1.87 million euros. Neuropsychologist Professor Dr. Johanna Kissler, of Bielefeld University’s Cluster of Excellence Cognitive Interaction Technology (CITEC), is leading the new research. Three universities, two companies, and the v. Bodelschwingh Foundation Bethel are working together on the project.

CITEC researcher Prof. Dr. Johanna Kissler is coordinating the NeuroCommTrainer project. The goal is to create a system that understands brain signals and mediates communication with patients suffering from severe brain damage. Photo: Bielefeld University/CITEC

People fall into unresponsive wakefulness syndrome when they have suffered severe brain damage from, for instance, an accident or a cerebral hemorrhage. Doctors often assume that these patients are in a vegetative state, meaning that they are unconscious. “Yet, in more than a third of cases, this proves to be a misdiagnosis,” says Johanna Kissler. The neuropsychologist wants to enable patients to make themselves understood with simple answers. To do this, she is using electroencephalography. With this technology, brain activity can be measured using electrodes placed on the scalp.

“Today, there are already brain-computer interfaces with which people can communicate via brain signals. However, these are not suitable for patients with disorders of consciousness,” says Kissler. “Our system has the advantage that it adapts to the individual. This is because it recognizes phases of optimal alertness in which the person is most responsive.” In addition, the system trains patients to control their attention, and thus their brain signals. “And it trains language comprehension. This is necessary because brain damage often causes individuals to lose their language abilities, at least in part,” says Kissler.

The basis for the NeuroCommTrainer is a program that recognizes patterns in brain activity. “In order to understand what the patient wants to say, the system has to read, comprehend, and translate the structures within the brain signals, so to speak,” says Johanna Kissler. For the new system, the research project team is developing several components, including tiny EEG sensors. The sensors send the brain signals to a computer, which analyzes them. In order to capture patients’ reactions, the system is also equipped with sensors that measure temperature, contact, force, and strain. With these sensors, weak motor reactions in the fingers and hands can be detected. At the same time, signals are also sent through such sensors to stimulate the patient. Because all sensors are small and unobtrusive, and do not bother the person wearing them, the system is also suitable for long-term measurement and long-term stimulation.

Kissler’s research group is testing the NeuroCommTrainer at Haus Elim, a nursing care facility of the v. Bodelschwingh Foundation Bethel in Bielefeld. “In order to establish contact with patients with unresponsive wakefulness syndrome, we are working with acoustic stimuli, such as their favorite music,” says Kissler’s colleague Dr. Inga Steppacher, who is evaluating the new technology at Haus Elim.

Steppacher will practice with the patients how to answer questions by controlling their thoughts. “For this, as a first step, we train language comprehension,” says Kissler. The researchers take advantage of a special feature of the brain: if the patient perceives a nonsensical sentence (e.g. “The bread is too hot to dog”), the EEG shows a characteristic deflection roughly 400 milliseconds after the anomalous word (the N400 response).
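The principle behind detecting such an N400 response can be sketched in a few lines of Python: EEG epochs time-locked to the critical word are averaged, and the mean amplitude in a window around 400 ms is compared between sensible and nonsensical sentences. This is a minimal illustration on synthetic data; all numbers, names, and the simulated deflection are invented for the example and not taken from the project.

```python
import numpy as np

def erp_window_mean(epochs, times, t_start=0.3, t_end=0.5):
    """Average epochs into an ERP and return its mean amplitude
    in a time window (here roughly 300-500 ms, around the N400)."""
    erp = epochs.mean(axis=0)                      # average over trials
    mask = (times >= t_start) & (times <= t_end)   # select the window
    return erp[mask].mean()

# Synthetic example: 50 trials, 1 s at 100 Hz sampling rate.
rng = np.random.default_rng(0)
times = np.arange(0, 1.0, 0.01)
noise = lambda: rng.normal(0.0, 1.0, size=(50, times.size))

# Simulate a negative deflection around 400 ms for nonsense sentences.
n400 = -3.0 * np.exp(-((times - 0.4) ** 2) / (2 * 0.05 ** 2))
nonsense = noise() + n400
sensible = noise()

diff = erp_window_mean(nonsense, times) - erp_window_mean(sensible, times)
print(f"window-mean difference (nonsense - sensible): {diff:.2f}")
```

A clearly negative difference in the window would indicate that the brain registered the semantic anomaly; in a real system this comparison would be done statistically over many trials.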

“Through this training, we find out whether the patient understands the meaning of a sentence. Only once this works do we practice with the patient how to answer ‘yes’ or ‘no’ with the brain.” For this, the team uses another specific reaction of the brain: the P300 response. This occurs when the patient perceives an acoustic stimulus, such as their partner’s voice. “Using this, we can practice with the patient how to give the P300 response in order to answer ‘yes’ to a question.”
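One simple way such a yes/no decision could be made is a calibrated threshold on the P300-window amplitude: trials recorded while the patient deliberately attends the stimulus yield larger amplitudes than ignored trials, and the midpoint serves as the decision boundary. The following sketch is purely illustrative; the decision rule, numbers, and function names are assumptions, not the project's actual method.

```python
import numpy as np

def classify_answer(epoch_mean_amplitude, threshold):
    """Decide 'yes' if the P300-window amplitude exceeds a
    previously calibrated threshold, otherwise 'no'.
    (Illustrative decision rule, not the project's actual method.)"""
    return "yes" if epoch_mean_amplitude > threshold else "no"

# Hypothetical calibration: amplitudes recorded while the patient
# deliberately attended ('yes') vs. ignored ('no') the stimulus.
yes_calibration = np.array([4.1, 3.8, 4.5, 3.9])   # illustrative values
no_calibration = np.array([1.2, 0.9, 1.5, 1.1])
threshold = (yes_calibration.mean() + no_calibration.mean()) / 2

print(classify_answer(4.2, threshold))  # prints "yes"
print(classify_answer(1.0, threshold))  # prints "no"
```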

The Neuroinformatics research group is working on the automatic analysis of the EEG signals in the NeuroCommTrainer. The group is led by Professor Dr. Helge Ritter, who is also the coordinator of the Cluster of Excellence CITEC. His team is developing a program that filters and analyzes the seeming jumble of data in real time. “This classifier determines from the measured brain signals whether a patient is reacting to a stimulus, such as an emotional sound, or whether their brain is not responding,” says Helge Ritter. “The remarkable thing is that the classifier learns the particular language of the individual brain, and thus understands the person’s brain signals.”
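The idea of a classifier that learns one individual's brain signals can be illustrated with a toy nearest-mean approach: learn an average "response" and "no-response" epoch for that person, then label new epochs by which template they resemble. This is a stand-in sketch on synthetic data, not the project's real-time classifier, whose design is not described in detail here.

```python
import numpy as np

class NearestMeanERPClassifier:
    """Toy per-patient classifier: learns the mean 'response' and
    'no-response' EEG epoch for one individual, then labels new
    epochs by which template they are closer to."""

    def fit(self, epochs, labels):
        epochs, labels = np.asarray(epochs), np.asarray(labels)
        self.templates_ = {c: epochs[labels == c].mean(axis=0)
                           for c in np.unique(labels)}
        return self

    def predict(self, epoch):
        # Choose the class whose template is nearest in Euclidean distance.
        return min(self.templates_,
                   key=lambda c: np.linalg.norm(epoch - self.templates_[c]))

# Synthetic training data: 1 = brain responded, 0 = no response.
rng = np.random.default_rng(1)
signal = np.sin(np.linspace(0, np.pi, 100))      # stylized evoked response
responded = rng.normal(0, 0.3, (40, 100)) + signal
silent = rng.normal(0, 0.3, (40, 100))

clf = NearestMeanERPClassifier().fit(
    np.vstack([responded, silent]), np.array([1] * 40 + [0] * 40))
print(clf.predict(signal + rng.normal(0, 0.3, 100)))  # → 1
```

Because the templates are fit to one person's recordings, the classifier "learns the particular language of the individual brain" in the loose sense Ritter describes, though the real system presumably uses far more sophisticated methods.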

Meanwhile, the CITEC research group Ambient Intelligence, led by Dr. Thomas Hermann, is working to turn the EEG data into sound. “If a caregiver leaves for an hour, for instance, they can hear whether there were any remarkable brain signals during this time,” explains Thomas Hermann.
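Turning EEG data into sound is commonly done by parameter-mapping sonification, e.g. mapping a sequence of summary amplitudes onto audio pitches so that an unusual burst stands out as a high tone. The mapping below is a minimal sketch under that assumption; the actual sonification used in the project is not specified here.

```python
import numpy as np

def sonify(eeg_amplitudes, f_min=220.0, f_max=880.0):
    """Map a sequence of EEG summary amplitudes onto audio
    frequencies (one tone per time step): a basic parameter-mapping
    sonification. Illustrative only."""
    a = np.asarray(eeg_amplitudes, dtype=float)
    norm = (a - a.min()) / (a.max() - a.min())   # scale to 0..1
    return f_min + norm * (f_max - f_min)        # linear pitch mapping

# A quiet stretch with one 'remarkable' burst stands out as a high tone.
amplitudes = [1.0, 1.1, 0.9, 5.0, 1.0]
print(np.round(sonify(amplitudes), 1))
```

A caregiver listening to such a rendering would hear the fourth step as a much higher tone than its neighbors, which is the kind of at-a-glance (at-a-listen) review Hermann describes.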

As part of this project, Hermann’s team is also working on wearable sensors and pulse generators (haptuators), which emit tactile vibrations. “Contrary to long-held belief, coma patients actually do perceive tactile stimuli,” says Kissler. The haptuators are to be woven unobtrusively into clothing and are meant to help people perceive the processes in their brain. A patient is asked a question (“Does your back hurt?”): the patient directs their attention, and in doing so activates a certain part of the brain. The NeuroCommTrainer recognizes the answer (“yes”) and confirms it with two short vibrations. This principle is called biofeedback.
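One cycle of the biofeedback loop described above can be sketched as: ask a question, read the brain's reaction, and confirm a detected "yes" with two short vibration pulses. Everything here, including the function names, the threshold, and the vibration interface, is a hypothetical illustration of the principle, not the project's actual software.

```python
def biofeedback_step(question, p300_amplitude, threshold, vibrate):
    """One cycle of the biofeedback loop: the system asks, reads the
    brain's reaction, and confirms a 'yes' with two short vibrations.
    (Hypothetical interface for illustration.)"""
    answered_yes = p300_amplitude > threshold
    if answered_yes:
        vibrate(pulses=2)   # tactile confirmation: two short buzzes
    return "yes" if answered_yes else "no"

# Record the vibration pulses instead of driving real hardware.
pulses_sent = []
answer = biofeedback_step("Does your back hurt?", 4.2, 2.6,
                          vibrate=lambda pulses: pulses_sent.append(pulses))
print(answer, pulses_sent)  # → yes [2]
```

The feedback closes the loop: the patient not only answers but also feels that the answer was received, which is what makes the training effective as biofeedback.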

The NeuroCommTrainer research project is funded by the Bundesministerium für Bildung und Forschung [Federal Ministry of Education and Research] and will run for three years, until May 2020. In addition to Bielefeld University, project partners include the Carl von Ossietzky University of Oldenburg and the Ludwigsburg Protestant University of Applied Sciences, as well as the measurement equipment manufacturer Easycap (in Herrsching, Bavaria, Germany) and Applied Biosignals (in Weener, Lower Saxony, Germany).