Posted by timothy on Tuesday April 21, 2009 @03:31PM
from the twitter-use-officially-not-all-brainless dept.

An anonymous reader writes "From a University of Wisconsin-Madison announcement: 'In early April, Adam Wilson posted a status update on the social networking Web site Twitter — just by thinking about it. Just 23 characters long, his message, 'using EEG to send tweet,' demonstrates a natural, manageable way in which "locked-in" patients can couple brain-computer interface technologies with modern communication tools. A University of Wisconsin-Madison biomedical engineering doctoral student, Wilson is among a growing group of researchers worldwide who aim to perfect a communication system for users whose bodies do not work, but whose brains function normally.' A brief rundown of the system: Users focus on a monitor displaying a keyboard; the interface measures electrical impulses in the brain to print the chosen letters one by one. Wilson compares the learning curve to texting, calling it 'kind of a slow process at first.' But even practice doesn't bring it quite up to texting speed: 'I've seen people do up to eight characters per minute,' says Wilson. See video of the system in action."

Seems to me I remember reading in Ted Nelson's "Computer Lib and Dream Machines" about a working prototype headband and software where a cursor continually scanned across the alphabet (on a screen), and when the student produced the correct pattern, the letter currently under the cursor would be added to the output. The student could write a sentence this way. Now, of course, what this article describes is more sophisticated, but it's also about 35 years later...

My thoughts on this: why not split the alphabet into multiple 'keyboards' and briefly flash all the letters in each 'keyboard' once at the beginning of each character? Use short-circuiting logic and go for the most common letters first. Seems you should be able to get this up to speed in very little time!
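The scan-order intuition can be sanity-checked with a little arithmetic. The sketch below is my own illustration, not anything from the article; the letter frequencies are approximate English percentages. It compares the expected number of single-letter flashes per character for an alphabetical scan versus a most-common-first scan:

```python
# Approximate English letter frequencies (percent); values are rounded
# and only need to be roughly right for this comparison.
FREQ = {
    'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7,
    's': 6.3, 'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'c': 2.8,
    'u': 2.8, 'm': 2.4, 'w': 2.4, 'f': 2.2, 'g': 2.0, 'y': 2.0,
    'p': 1.9, 'b': 1.5, 'v': 1.0, 'k': 0.8, 'j': 0.15, 'x': 0.15,
    'q': 0.10, 'z': 0.07,
}

def expected_steps(order):
    """Average 1-based position of the chosen letter, weighted by frequency."""
    total = sum(FREQ.values())
    return sum((i + 1) * FREQ[ch] for i, ch in enumerate(order)) / total

alphabetical = sorted(FREQ)                              # a, b, c, ...
by_frequency = sorted(FREQ, key=FREQ.get, reverse=True)  # e, t, a, ...

print(round(expected_steps(alphabetical), 1))  # steps with an a-z scan
print(round(expected_steps(by_frequency), 1))  # steps with frequency-first scan
```

Frequency ordering alone cuts the expected wait per character by roughly a third; splitting into sub-keyboards as suggested would reduce it further.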

This sounds like we're getting that much closer to a human-computer interface. How long till we go a little more invasive and have implants that let us "jack in" a la The Matrix, or Andromeda, or any other sci-fi show or movie, and start interacting merely by thinking? In the next few hundred years we could turn ourselves into fat, unmoving beings plugged from birth to death into computers. Till solar flares overload the system and kill us all, at least... Either way, very interesting to see where this line of research goes.

This sounds like we're getting that much closer to a human computer interface.

Tell him we've already got one.

Monitor + keyboard.

Sure, I know what you mean -- a direct neural interface instead of a kinetic input device (like a keyboard).

I think you're looking at it from the wrong perspective though, in terms of coolness. Instead of implanting a neural interface, wouldn't it be much better if we could just use telekinesis? Then we don't have to deal with extremely messy surgical hardware upgrades.

Similar things have been done. Robot arms can be moved by the mind. Cultured rat neurons have flown flight simulators. EEG sensors placed directly onto the brain (rather than onto the head) produce far more detailed information; it's not a stretch to suggest that some day a sensor layer will be placed on the inside of the skull with a connection to the outside world.

You could, of course, play with EEG technology yourself. The OpenEEG project details the hardware needed and provides some basic software. See if you can

The device on House was just a binary interface, and it was entirely plausible.

The patient could make a cursor go up, or not make it go up, and that was it. He had to make it go up twice for 'no'. And he had a response time measured in seconds just to get the cursor to go up.

I'm pretty certain that's real. It's actually a good deal less complex than technology that already exists; I'm pretty certain they've demonstrated full 2-D control of a cursor using brainwaves. Half an axis vs. two full axes.

Professor Hawking uses a more sophisticated system with word prediction and a micro-switch activated by a slight motion of his shoulder. He can do much better than 8 cpm. I use a similar system, but use eye gaze on a virtual keyboard rather than a sectoring keyboard.

Perhaps he's more accustomed to the sectoring keyboard or no longer has the ocular control for the eye gaze system.

The eyes probably couldn't be steered accurately enough. His muscular control was a mess when I saw him in person in the late 1980s, and it won't have improved since.

On the other hand, if they tune into the neurons that control his arm, they may be able to anticipate what he is going to type. That might help accelerate things for him. It's a bit much to be able to decode the language centres sufficiently to record thoughts directly, but it will eventually get to that point.

Once it is possible to decode his thoughts directly, he would be able to communicate as fast as he can think. Which means that it'll be a babble because he thinks far too fast. On the other hand, it will help him to turn out papers at a fantastic speed.

The screen is used to figure out just WHAT letter the user is thinking of, until the technology is there to lift the letter out of someone's head. When the letter that blinks is the same as the user is thinking about, a signal is triggered in the brain, and that can be read by EEG.

Granted, it might be possible for someone who knows binary ASCII values or Morse code to do without a screen, simply by thinking two different ways that generate a binary signal, but this is probably far more feasible.
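For what it's worth, the binary-ASCII idea is easy to sketch. Assuming the interface could classify each "thought epoch" as a 0 or a 1 (a big assumption), eight epochs per character would spell plain ASCII:

```python
# Toy sketch (not from the article): spelling with only a two-state signal.
# Each "thought epoch" is classified as '0' or '1'; eight epochs encode
# one ASCII character.
def bits_to_text(bits):
    """Decode a string of '0'/'1' epochs, 8 per character, into text."""
    chars = []
    for i in range(0, len(bits) - 7, 8):
        chars.append(chr(int(bits[i:i + 8], 2)))
    return ''.join(chars)

# 'H' is 0x48 and 'i' is 0x69 in ASCII
print(bits_to_text('01001000' + '01101001'))  # Hi
```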

I remember hearing of a study once where they found that people could, on average, send text messages faster using Morse code than with a cellphone keyboard. Although AFAIK this was before things like predictive text. Nevertheless, you just need to find two muscles that a person is able to control, and from there it would be pretty easy to translate that to Morse code and then to text.
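A rough way to compare the two input methods (my own back-of-envelope, not the study being recalled): count the raw input "actions" per phrase, dots and dashes for Morse versus key presses for multi-tap. Action counts ignore timing, so this is only a crude proxy for speed.

```python
# International Morse code for the 26 letters.
MORSE = {
    'a': '.-',   'b': '-...', 'c': '-.-.', 'd': '-..',  'e': '.',
    'f': '..-.', 'g': '--.',  'h': '....', 'i': '..',   'j': '.---',
    'k': '-.-',  'l': '.-..', 'm': '--',   'n': '-.',   'o': '---',
    'p': '.--.', 'q': '--.-', 'r': '.-.',  's': '...',  't': '-',
    'u': '..-',  'v': '...-', 'w': '.--',  'x': '-..-', 'y': '-.--',
    'z': '--..',
}
# Standard 9-key phone pad groups; position in the group = presses needed.
KEYPAD = ['abc', 'def', 'ghi', 'jkl', 'mno', 'pqrs', 'tuv', 'wxyz']

def morse_actions(text):
    """Total dots and dashes needed for the letters in text."""
    return sum(len(MORSE[c]) for c in text if c.isalpha())

def multitap_actions(text):
    """Total key presses needed on a multi-tap phone pad."""
    return sum(next(k.index(c) + 1 for k in KEYPAD if c in k)
               for c in text if c.isalpha())

print(morse_actions("the quick test"), multitap_actions("the quick test"))  # 28 25
```

By raw action count multi-tap often comes out slightly ahead; the speed advantage the study found for Morse would have to come from a dot or dash being much quicker to produce than a key press.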


Be careful what you ask for. You might get to teach someone who can only control his sphincter and prostate...

This particular kind of mind-reading functions by detecting electrical impulses created when the chosen letter flashes blue. If row 1 and column 1 both produce impulses, then the desired letter must be A.
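The row/column intersection logic is trivial once the two responses are detected. A minimal sketch, assuming a 6x6 grid layout (the actual grid in the article may differ):

```python
# Hypothetical 6x6 speller grid; the real layout may differ.
GRID = [
    "ABCDEF",
    "GHIJKL",
    "MNOPQR",
    "STUVWX",
    "YZ1234",
    "56789_",
]

def decode(row_with_response, col_with_response):
    """Intersect the responding row and column to recover the letter."""
    return GRID[row_with_response][col_with_response]

print(decode(0, 0))  # row 1 + column 1 -> 'A', as in the comment above
```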

We have a long way to go before you can just think a letter or word and it shows up.

So, when the letter being focused on flashes, the EEG picks it up and figures out which row and column are desired...

So it wouldn't work very well for the blind, and it's not pulling the letters out of the brain; it's just a more sophisticated eye-tracking device, similar to the goggles in Apache helicopters? Why not just fit patients with those for a faster input method?

Now you're not only assuming that the user can move his eyes, but also blink at will. The whole point is to bring communications to the very worst cases, who currently have no methods of communication because they can't control their body at all.

Eye tracking would be much faster and easier to implement. FTA "I've seen people do up to eight characters per minute." That is ridiculously slow.

Aside from people who cannot move their eyeballs due to some kind of paralysis... so what? It's slow -- now. It's possible that further developments could speed it up a lot. It's also possible that the ceiling on this tech may be higher than the ceiling on eye-tracking... who knows, until there's been a lot more study and advancement?

So it wouldn't work very well for the blind, and it's not pulling the letters out of the brain; it's just a more sophisticated eye-tracking device, similar to the goggles in Apache helicopters? Why not just fit patients with those for a faster input method?

Because Apache helicopters are prohibitively expensive even for patients with the best insurance, aside from being illegal for civilians to own. Duh.

'Because Apache helicopters are prohibitively expensive even for patients with the best insurance, aside from being illegal for civilians to own.'

It would probably be much cheaper to pick up a surplus thought-guided control system from the Soviet Mig-31 project on ebay. The only downside (and this is very important) is that you must think in Russian. You can't think in English and transpose it - you must think in Russian.

Stephen Hawking could probably afford an Apache... he'd kick some ass, too... not sure about our arms trade agreement with England... but I'm sure we could work something out where Hawking could get an Apache, as long as he used it to run a few patrols over pirate-infested waters to protect international trade interests...

I would call this a first step. They still don't know how the brain works; they're just guessing right now. They can figure out what part of the brain deals with, say, language, detect what is firing for a given thought, and then tune the machine: this is what he was thinking about, so this is what the device should do.

The more we understand the brain, the better a device like this will work. Either way, it may never work for a blind person unless you can somehow figure out how to transmit

Because it's not eye tracking. The user sees the letters flash in sequence, and when the correct one is seen to flash, the user changes his/her thoughts in a way detectable by the EEG. The system then inputs that letter. The eyes don't actually have to move (though it can be hard to see a letter, since the usable area of the visual field is quite small).

While technically you are correct: if I have a device that remotely heats my dog until it changes the channel for me via a sophisticated cabinet of lights and pull-levers, I have a whole lot of overkill to achieve the same functionality as an infrared remote control, except it doesn't use the infrared LED or sensors built into standard remotes and TVs.

Instead, this device figures out what you're focusing on (looking at, in 99/100 cases) from flashes, without interfacing with or reading the eye. If the object of

I also wonder why the user is presented with a full alphabet. I would have thought that some form of predictive input, such as T9, would be much faster; it seems pretty much perfect in this scenario, and could easily leverage existing T9 software.
I'm going to go ahead and assume I'm missing something. Can someone who is more informed correct me here?
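For reference, the T9 idea is simple to sketch: each letter costs one key press, and a dictionary lookup turns the digit sequence into ranked candidate words. The word list and counts below are invented for illustration:

```python
# Standard phone-pad letter-to-digit mapping used by T9.
KEYS = {'a': '2', 'b': '2', 'c': '2', 'd': '3', 'e': '3', 'f': '3',
        'g': '4', 'h': '4', 'i': '4', 'j': '5', 'k': '5', 'l': '5',
        'm': '6', 'n': '6', 'o': '6', 'p': '7', 'q': '7', 'r': '7', 's': '7',
        't': '8', 'u': '8', 'v': '8', 'w': '9', 'x': '9', 'y': '9', 'z': '9'}

# Toy dictionary with made-up frequency counts.
WORDS = {'home': 500, 'good': 400, 'gone': 300, 'hoof': 10}

def t9_candidates(digits):
    """Words whose key sequence matches the digits, most frequent first."""
    matches = [w for w in WORDS
               if ''.join(KEYS[c] for c in w) == digits]
    return sorted(matches, key=WORDS.get, reverse=True)

print(t9_candidates('4663'))  # ['home', 'good', 'gone', 'hoof']
```

One press per letter plus an occasional "next candidate" action beats a per-letter flash-and-wait cycle by a wide margin, which is presumably the commenter's point.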

Wilson compares the learning curve to texting, calling it 'kind of a slow process at first.' But even practice doesn't bring it quite up to texting speed: 'I've seen people do up to eight characters per minute,' says Wilson.

I don't know if Wilson has seen how fast people really get with texting, but it's fast. This would have to get a lot faster than 8 characters per minute to even be close to texting.

Instead of flickering one row or column at a time, flicker ALL the letters simultaneously in different patterns. The brainwave trace should follow the one you're watching and the wait for it to be identified and confirmed will be much shorter.
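This scheme is close in spirit to code-modulated VEP spellers, which do exist in the literature. A toy sketch, using idealized 0/1 flicker codes and plain Pearson correlation (real traces would be far noisier and need per-subject calibration):

```python
# Invented per-letter flicker patterns; each letter blinks with its own code.
CODES = {
    'A': [1, 0, 0, 1, 1, 0, 1, 0],
    'B': [0, 1, 1, 0, 0, 1, 0, 1],
    'C': [1, 1, 0, 0, 1, 1, 0, 0],
}

def correlation(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

def best_match(trace):
    """Letter whose flicker code best matches the measured trace."""
    return max(CODES, key=lambda ch: correlation(trace, CODES[ch]))

# A noisy trace that roughly follows A's on/off pattern
print(best_match([0.9, 0.1, 0.2, 0.8, 1.0, 0.0, 0.7, 0.1]))  # A
```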

The new technique works by recognizing, from brainwaves, when a letter on a screen is blinked. I doubt that will work for the blind.

What if you can't move your eyes?

It's not clear to me whether the EEG device is recognizing the brain signal alterations from the blinking of the letter being concentrated on, or the letter being looked at. If the former, it may work for someone whose eyes are paralyzed. If the latter, it certainly won't. (I suspect

What you're suggesting involves much more sophisticated signal processing, with narrow-band spectral detection. This is just a P300 speller hooked up to Twitter. What they're picking out of the EEG is a broad event-related potential known as the P300, which is detectable by averaging traces together. You can find more about how it works here: http://www.gtec.at/products/g.BCIsys/P300_Speller.htm
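The averaging step is the whole trick: a single trial is too noisy, but the event-locked average of the trials following each flash of the target item shows the P300 bump. A toy sketch with synthetic numbers:

```python
# Toy illustration of event-locked averaging; all values are synthetic.
def average_epochs(epochs):
    """Pointwise mean of equal-length trials."""
    n = len(epochs)
    return [sum(vals) / n for vals in zip(*epochs)]

def pick_target(epochs_by_item):
    """Item whose averaged epoch has the largest peak."""
    return max(epochs_by_item,
               key=lambda item: max(average_epochs(epochs_by_item[item])))

# Single-trial epochs per flashed item: 'R' carries a consistent bump
# near sample 2 (the would-be P300), 'S' is just noise.
epochs = {
    'R': [[0.1, 0.3, 1.1, 0.2], [0.0, 0.4, 0.9, 0.1], [0.2, 0.2, 1.0, 0.3]],
    'S': [[0.1, 0.2, 0.2, 0.1], [0.3, 0.1, 0.0, 0.2], [0.0, 0.3, 0.1, 0.1]],
}
print(pick_target(epochs))  # R
```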

And honestly, it's not that much better than an eye tracker. It just uses fancier technology.

"The interface consists, essentially, of a keyboard displayed on a computer screen. "The way this works is that all the letters come up, and each one of them flashes individually," says Williams. "And what your brain does is, if you're looking at the 'R' on the screen and all the other letters are flashing, nothing happens. But when the 'R' flashes, your brain says, 'Hey, wait a minute. Something's different about what I was just paying attention to.' And you see a momentary change in brain activity."

I'm curious what kind of language optimization has been added, if any. Do they use predictive text of some sort?

Also, it seems a waste to limit the input to a static keyboard display (other than ease of use for people who know where to look for certain letters). Why not have a dynamic interface, something along the lines of http://en.wikipedia.org/wiki/Dasher/ [wikipedia.org]?

Speaking of Hawking, they should change this so that it selects full words. It is probably easier to get the computer to recognize the difference between left and right than among A, B, C, D, ... Use the interface Hawking has on his computer, where it just narrows down the word groups.
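The narrowing idea amounts to a binary search over a vocabulary: with a reliable left/right signal, any word in a list of N takes about log2(N) choices. A toy sketch with an invented word list:

```python
# Toy word-group narrowing: each left/right choice halves the sorted list.
def narrow(words, choices):
    """Repeatedly keep one half of the list; 'L' keeps the left half."""
    words = sorted(words)
    for choice in choices:
        mid = len(words) // 2
        words = words[:mid] if choice == 'L' else words[mid:]
    return words

vocab = ['apple', 'brain', 'chair', 'delta', 'eagle', 'flute', 'grape', 'house']
print(narrow(vocab, ['R', 'L', 'R']))  # 8 words narrowed to one in 3 choices
```

At even a few seconds per binary choice, three choices per word easily beats 8 characters per minute of letter-by-letter spelling.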

This system has been around for a while; I've seen it demonstrated live twice, and it didn't work at all either time. In my opinion, even in best conditions (bald patient, shit-tons of electrodes, professional setup, well-trained subject) it doesn't work well enough to fuel science-fiction fantasies, and probably never will. For locked-in patients, who can do nothing but move their eyes, though, it's an awesome technology. They made a movie recently about such a patient who spent years using it to write

I typed this message with brain waves. I did this by making waves that actuated the input mechanism. The input mechanism is a complex chemical-based detector which translates the waves into physical movements, which it then translates into electrical signals using crude switches. In the article, a device is described which uses a mechanism that differs in particulars, but gives the exact same result (though slower).

A Beowulf cluster of those.

I wonder if the on-screen keyboard they are using is Dvorak. The Dvorak keyboard layout is far more efficient. Qwerty was deliberately designed to slow people down. I personally switched to a Dvorak keyboard on my EEG device, and I went from 8 to 12 letters per minute and experience way less eyestrain now.

Think about growing a new arm... or a new anything. It might be an appendage you've never imagined before. "Thought" is not the same as motor control. I don't think my fingers into typing this post at 60-ish wpm. If I had to think about it, it would take forever to type.

Now think about if you had a third arm growing out of your chest. How would you control it? Without the motor control that has been learned over several years of childhood and adolescence, what good will it do you? A good quest

He already knows what the keyboard looks like. He should have made it so that he thinks of the character and it appears, rather than focusing the eyes on a keyboard. The problem with the keyboard approach is that when you think of a key on the keyboard, you really think of an image of the area around that key. Say the H key: in reality you visualize the surrounding keys as well.

Now you know the shapes of letters. Think of an L and that's about it.

I'd have to say that this is kind of the wrong way to go about expressing yourself directly from your brain. The brain doesn't naturally think in language. Language is manmade and is a "higher level protocol" if you will, so something that directly accesses brainwaves (via EEG) and tries to output English or another language is kind of dumb IMHO.

Why not make something that expresses emotion (happy/sad/mad/regretful/passionate) first? It'd probably be way easier to get directly from the ole noggin.

"We originally hooked it to the brain," said Wilson, "but only a very limited selection of messages came out, that appeared to be coming from somewhere else. So we've just gone directly to the penis without the middleman."

Male humans suffer from having functional bodies trapped with almost completely paralysed minds. The penis is an organ used by male hu