Posted
by
timothy on Wednesday September 08, 2010 @05:10AM
from the brain-tsunamis-too-intense dept.

cortex writes with an excerpt from the L.A. Times: "In a first step toward helping severely paralyzed people communicate more easily, Utah researchers have shown that it is possible to translate recorded brain waves into words, using a grid of electrodes placed directly on the brain. ... The device could benefit people who have been paralyzed by stroke, Lou Gehrig's disease or trauma and are 'locked in' — aware but unable to communicate except, perhaps, by blinking an eyelid or arduously moving a cursor to pick out letters or words from a list. ... Some researchers have been attempting to 'read' speech centers in the brain using electrodes placed on the scalp. But such electrodes 'are so far away from the electrical activity that it gets blurred out,' [University of Utah bioengineer Bradley] Greger said. ... He and his colleagues instead use arrays of tiny microelectrodes that are placed in contact with the brain, but not implanted. In the current study, they used two arrays, each with 16 microelectrodes."

It uses "not new" technology to select words from a short list at 50% accuracy... really. (Okay, it hits 90% accuracy with only two candidate words, such as "yes" and "no", and drops to 48% with ten.)

In other news, you can use P300 responses picked up with a $300 off-the-shelf over-the-hair EEG receiver to select from a grid of visual stimuli at a pretty good rate and with something like 95%+ accuracy (presumably nearly 100% with the sort of training that goes into touchscreen or voice-activated interfaces). Those items can be letters, words, pictures... whatever. Anything quickly recognizable. Congrats guys, you just invented a crappy version of something I can buy for $300, except yours requires cutting open the person's skull and implanting things on the surface of their brain.

FYI, to whoever funded this, please give the lab I work at the grant monies next time. We'll make much better use of it.

Do you think in English, or do you think in abstract thoughts that your brain only later makes you believe were in English all along? I think there's a bit of debate on that, and it's something that's difficult to test.

As somebody who is fluently bilingual (speaking one language at home and another while out with friends), my thoughts tend to be neither English nor Afrikaans but rather concepts which are then translated.
When I think, I generally don't think in words unless I'm thinking about thinking in words.
I'm sure many other bilingual people who use both languages frequently would say something similar.

P300 is typically 300ms (thus the name), and the technique I was referring to uses two responses to generate a match (it flashes rows and columns so you need an X and Y response). 600ms or thereabouts is thus the time to beat. It's not lightning fast - nothing like typing - but a whole hell of a lot better than the reference methods that they're referring to. They're solving a brain-computer interface problem that was solved 10 years ago, and that was made irrelevant several years ago when cheap neural interfaces started hitting the commercial commodity market.
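The row/column scheme described above can be sketched in a few lines. This is a toy simulation (not any real EEG toolkit): the grid, the flash count, the noise level, and the `p300_select` function are all my own illustrative assumptions. The idea is simply that each flash containing the target evokes a stronger (simulated) P300 response, responses are accumulated per row and per column, and the selected item is the cell at the intersection of the best-scoring row and column.

```python
import random

def p300_select(grid, target, noise=0.2, repeats=5, rng=None):
    """Toy model of a P300 row/column speller (illustrative only).

    Each row and column is flashed `repeats` times; flashes containing
    the target evoke a stronger simulated response (amplitude 1.0 plus
    Gaussian noise, vs. noise alone otherwise). Because a selection
    needs both a row response and a column response, and the P300
    peaks ~300ms after each flash, ~600ms is the floor per selection.
    """
    rng = rng or random.Random(0)
    n_rows, n_cols = len(grid), len(grid[0])
    # Locate the target so we know which flashes should evoke a P300.
    tr = tc = None
    for r in range(n_rows):
        for c in range(n_cols):
            if grid[r][c] == target:
                tr, tc = r, c
    row_score = [0.0] * n_rows
    col_score = [0.0] * n_cols
    for _ in range(repeats):
        for r in range(n_rows):
            amp = 1.0 if r == tr else 0.0  # target row evokes a response
            row_score[r] += amp + rng.gauss(0, noise)
        for c in range(n_cols):
            amp = 1.0 if c == tc else 0.0
            col_score[c] += amp + rng.gauss(0, noise)
    # The selection is the intersection of the strongest row and column.
    best_r = max(range(n_rows), key=row_score.__getitem__)
    best_c = max(range(n_cols), key=col_score.__getitem__)
    return grid[best_r][best_c]

grid = [["A", "B", "C"],
        ["D", "E", "F"],
        ["G", "H", "I"]]
print(p300_select(grid, "E"))
```

Averaging over repeated flashes is what buys the high accuracy: each repeat adds a full-amplitude response for the target row/column but only zero-mean noise for the others, so the scores separate cleanly at the cost of more time per selection.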

Of course this is all relying on TFA, which could be completely misrepresenting their research given the general high quality of modern science journalism.

Also, earlier kidding aside, the article is probably completely missing the point. The actual purpose of the research is likely NOT to develop the current prototype's functionality, but to explore the ability to collect, reduce, and analyze data of this type. The fact that you can build (or buy off-the-shelf for peanuts) a BCI whose functionality equals or exceeds their prototype's, using less invasive methods, is probably completely beside the point.