
destinyland writes "This article asks, 'Why bother to type a document using a keyboard when you can write it by simply thinking about the letters?' A brain wave study presented at the 2009 annual meeting of the American Epilepsy Society shows that people with electrodes in their brains can 'type' using just their minds. The study involved electrocorticography — a sheet of electrodes laid directly on the surface of the brain after a surgical incision into the skull. ('We were able to consistently predict the desired letters for our patients at or near 100 percent accuracy,' explains one Mayo Clinic neurologist.) And besides typing, there are new brain wave applications that can now turn brain waves into music and even Twitter status updates — by thought alone."

What would happen if a solar power cell was hacked? Would you be able to send energy to the sun? Just because something can send or receive something doesn't mean it can do both. This reads brain waves and sends updates to Twitter. It does not receive updates from Twitter and send brain waves.

I wouldn't be embarrassed, but the sexual harassment lawsuits would be unrelenting.

Why? Every time you think of something sexual in that e-mail to your boss, you'll have to hit backspace. It's a positive feedback system. And I fail to see why this is a problem only for men -- if you knew half the stuff that went through the average woman's brain you'd probably crap a few bricks. Women make up for in detail what men do in quantity in that regard. ^___^ I'm not afraid my boss will find out... I'm worried my mother will.

And I fail to see why this is a problem only for men -- if you knew half the stuff that went through the average woman's brain you'd probably crap a few bricks

Sure, but the double standard would still ensure that the sexual harassment lawsuits against men would vastly outnumber those against women.

The typical man who opens a “whoopsie” e-mail from a female coworker would have several paragraphs worth of questionable material to delete before he had even finished reading her e-mail. The typical woman who received a similar e-mail from a male coworker would have the letter to her lawyer halfway completed...

if you knew half the stuff that went through the average woman's brain you'd probably crap a few bricks. Women make up for in detail what men do in quantity in that regard.

Case in point [blogspot.com]. Except most adult women are smart enough not to write it down.

This probably won't write down every single thought that runs through your head. You still probably have to 'think' of the letters. When I type, I mentally spell out every word and know that I have to hit those keys. When I'm talking to someone or just thinking about that hot waitress, I'm not thinking "W-o-w w-h-a-t a g-r-e-a-t a-s-s".

I wonder what the WPM is. I've reached a point in my keyboarding skills where the limitation seems to be how fast I can compose my thoughts, not how fast I can type them.

Cite? Where's your data? As far as you go, I'd be perfectly willing to accept that's how you roll, but women in general... the indirect evidence doesn't seem to support this.

In my experience, men are far more visual than women are. I think one obvious chunk of evidence for this was the pre-Internet era proliferation of men's picture magazines, while one or two comparable women's magazines (e.g. Playgirl) addressed what they thought might be an equivalent demand.

Men's judgments of women's attractiveness were based primarily around physical features and they rated highly those who looked thin and seductive. Most of the men in the study also rated photographs of women who looked confident as more attractive.

As a group, the women rating men showed some preference for thin, muscular subjects, but disagreed on how attractive many men in the study were. Some women gave high attractiveness ratings to the men other women said were not attractive at all.

The media tell us that men are sex-starved pigs and women are little angels. Since the first part is true, they assume the second is as well. Until, of course, you run across a (usually drunk) woman who doesn't care that you know she's a sex-starved pig, too, and that she's no different from the other women she knows.

There's a double standard here that has stood since antiquity (with the exception of the 1970s) that says that women who like sex are sluts, while men who like sex are just men.

I wonder if they can differentiate between what I mentally “type” (which implies focus of some sort) and every random thought that happens to zip through my mind. I’d expect there to be some sort of difference – if not in the region of the brain involved then at least in the level of activity.

(My workplace categorizes TFA as “entertainment”, so I’m not sure whether this was given mention... so if it was, then no, I didn’t RTFA, but at least I have an excuse.)

My guess is that the system has to be trained. As in, it listens to your thoughts while you type out a preset text by hand, and eventually matches the motor impulses that you would normally act out to the keys you would type.

Of course, if you're one of those freaks who can type in Dvorak and QWERTY it might not work so well.

For those who can't access the article, here is the information about how the system "learns":

Dr. Shih’s patients at the Mayo Clinic were asked to look at a computer screen containing a 6-by-6 matrix with a single alphanumeric character inside each square. Every time the square with a certain letter flashed, the patient focused on it and a computer application recorded the brain's response to the flashing letter. The computer software calibrated the system with the individual patient's specific brain activity.
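
The calibration described above is the classic row/column "speller" paradigm. As a rough illustration (not the Mayo team's actual code), here is a toy Python sketch of the selection step: after the flashes, the system picks the row and column whose flashes evoked the strongest classifier response, and the target letter is their intersection. The grid layout and scores here are made up.

```python
# Toy sketch of the selection step in a row/column flashing speller.
# In a real system, row_scores/col_scores would come from a classifier
# run on the brain's evoked responses to each flash; here they are
# made-up numbers.

GRID = [
    "ABCDEF",
    "GHIJKL",
    "MNOPQR",
    "STUVWX",
    "YZ0123",
    "456789",
]

def predict_letter(row_scores, col_scores):
    """Return the character at the intersection of the row and column
    whose flashes evoked the strongest response."""
    best_row = max(range(6), key=lambda r: row_scores[r])
    best_col = max(range(6), key=lambda c: col_scores[c])
    return GRID[best_row][best_col]

# Responses peak for row 1 ("GHIJKL") and column 2, so the letter is "I".
print(predict_letter([0.1, 0.9, 0.2, 0.1, 0.3, 0.2],
                     [0.2, 0.1, 0.8, 0.3, 0.1, 0.2]))  # -> I
```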

I always figured the final system would require a series of specific thoughts before entering "read" mode, kinda like a login/logoff to the keyboard once your hands no longer serve that purpose.

Not so concerned right now, though. Reading letters only means only the dirty thoughts you literally spell out will be displayed. All those "wish I was playing WoW" thoughts (or other 15-second occurrences) won't make it into the email to your boss, since our brain uses a mesh of stimuli and language to convey thoughts.

"By implanting an electrode into the brain of a person with locked-in syndrome, scientists have demonstrated how to wirelessly transmit neural signals to a speech synthesizer. The "thought-to-speech" process takes about 50 milliseconds - the same amount of time for a non-paralyzed, neurologically intact person to speak their thoughts. The study marks the first successful demonstration of a permanently installed, wireless implant for real-time control of an external device."

My iPhone is a good device for output (I can read nearly all the webpages I want) but is awful for input, taking much more time to, say, make a post on /. than it would on a desktop or laptop. I don't think a miniature keyboard will fix this issue. Making the phone bigger is not an option, nor is carrying around a full-size keyboard (even those roll-up flexible ones).

So this will be good for that. Though I suspect a front-facing camera that can track your eye movements down to the individual on-screen key would work just as well.

Would it really? I realize that it can be difficult for people in some situations to type, but how is this any better than the "look at the letter on the computer screen" method?

That one doesn't require a craniotomy and wires in your brain. In that method (or the blow-in-the-tube method Stephen Hawking uses) you could look at individual words that you may be trying to spell to speed things up (like in texting). In this method, you only get letters. To get anything else, you'd have to train the computer to recognize it.

... it turns out that they used an old AT-style connector, so you're only able to use your thoughts to type on a 386DX2/40 at best. Which is okay, I guess, still runs Linux.

Seriously, tho, combine this with Bluetooth, and we've got ourselves a winner. Connect to your PC, cell phone, PS3, whatever. I'll go in for the surgery as soon as it's availa... wait. Can I also move a mouse with my thoughts? Using a computer with just keys could be harsh these days.

It's a lot easier than you think, unless you're surfing the web with lynx/links. The number of hidden clickables has risen greatly, and you have to tab through them all to get to the link you want to activate.

Whenever I hear about groundbreaking advancements in the neurosciences, I for one automatically think about how they can improve my Twitter feed.

Well, the Internet was a groundbreaking advance in information technology that has allowed both advanced physics research and 4chan posts to exist in the same medium. But that's the case with any technology -- it will be used for both really intelligent, and really stupid, purposes. A car is a wonderful advancement that allows people to get to and from work, and then get drunk and turn living rooms into garages.

It’s a lot easier to learn sign language when you only want to memorize the alphabet.

It’s a lot easier to do speech recognition when you only have to match 26 patterns.

Spelling everything out is slow. Sign language and speech recognition have both developed beyond this limitation, and obviously that would be a major goal in the development of this technology as well.

Amazing. Why, there are no(*) downsides at all! This will sweep the world!

Soon we will all use this, and the keyboard will be dead. Imagine what computers could look like without needing the keyboard. Almost like... tablets of some kind. We'll call them "portable blackboard computers".

Although I haven't RTFA, it seems likely that they will be able to, eventually, move from being able to interpret 26 characters to being able to interpret the top 1000 most commonly used words. Then all the words in the person's vocabulary. Then all the most common basic grammatical constructions (e.g. verb conjugations, etc), then sentence fragments, then sentences.

The craniotomy may be needed for a very long time (at least to get the best accuracy/speed), but if the craniotomy procedure became safe and cheap enough, that could change.

In addition to the ability to “mind read” vowels, consonants, and individual letters, brain wave applications also include algorithms to turn brain waves into music and even “tweeting” (using the popular Twitter Internet application) by thought alone.

Expect to see millions of tweets saying, "I'm tweeting about what I'm thinking of tweeting next!" In succession. For a week. And then there's Music Monday, Thinking Tuesday, and Lord knows what else...

I'm curious as to whether or not this will be able to help patients with locked-in syndrome. Recently in the news there was a story about a man who had been "locked in", unable to communicate with others for nearly 20 years. The Science-Based Medicine blog did a big write-up of this story (http://www.sciencebasedmedicine.org/?p=3122) and some of the inherent problems with the way in which they made contact with the patient, "facilitated communication". If the accuracy rate is truly as good as claimed, this could give such patients a far more reliable channel.

I thought the same thing myself; the "Facilitated Communication" has been slammed in court on more than one occasion because it became rapidly quite clear that the facilitator was the one doing the actual communication, not the poor schmo in the wheelchair. Hell, I know if I had true locked-in syndrome then I'd love to have something like this so I can at least have a hope of communicating with the outside world.

I find this story very perplexing. The Science-Based Medicine article claims that they were getting yes-no answers from him using a toe he could control, but other sources don't seem to mention that part.

If it's true, it should be easy enough to ask, "So, is this facilitated communication actually any good, or just a load of hooey?" and get a direct, unfacilitated answer. If he gives an unambiguous yes, then FC is validated and you get the rest out that way.

I only speak for myself here, but it seems like thinking about letters is actually harder than typing on a keyboard. I don't really think about what letters I'm pressing when I type, I just think of the words and the vast majority of the time, it's just muscle memory doing its thing. Perhaps for novel words or words that I don't quite remember how to spell, I'll think of the letters individually. Sounds like more trouble than it's worth.

Further, it's not entirely clear that our cognitive capacities reside solely in our brain. The rest of our body could have a role to play in cognition. It could be the case that when we're typing, a big part of our typing cognitive process actually depends on our body executing typing actions. For more info, see Embodied Embedded Cognition [wikipedia.org], Enactivism [wikipedia.org], and other related philosophy of mind or AI theories.

That's a good point that I think a lot of people miss when it comes to new interfacing technology. Sure, it takes some getting used to, and at first, it's probably going to suck and you are going to want to go back to the old, 'better' way of doing things. However, given consistent use and a bit of patience, our minds and bodies are remarkable at learning new interfaces. Think about the first time you drove a stick shift. You probably popped the clutch a few times and squealed some tires and killed the engine once or twice. However, after a month or two getting a feel for the clutch every day while driving, you begin to master the motions and, eventually, working the clutch becomes an art form in and of itself.

This is one of the underlying principles of Kung-Fu. Through disciplined, consistent repetition, our bodies develop habits all their own. Martial arts mastery comes when your body has ritualized so many action-reaction combinations that you can start combining them in new, more inventive, more powerful ways. The same thing goes for an editor like vi. Eventually, you master enough keystrokes that you don't even need a mouse anymore. The same thing happened with typing when the keyboard first came about and, now, it is happening again with mobile platform keyboards (I can text with two thumbs as fast as I could type with two hands three years ago).

My bet would be that, as these neuroscience interfaces develop in the future, our 'mental-fu' will start to develop just like the Kung-Fu we practice to learn any number of physical interfaces and actions. Before you know it, we may be living in a world where our very wills could be pitted against one another in mental showdowns. I, for one, welcome the idea of interfaces that force humanity to start mastering and disciplining its own mental habits on a wide scale.

True enough. I made a reply [slashdot.org] to one of your sibling posts that relates to this. If this technology were to be implemented, it would also require something to provide input for your normal brain-body feedback loops in order to be functionally effective. Without those feedback loops, your cognition in typing may be severely impaired, because you're essentially taking away (actually making invisible) the interface through which you communicate with your environment.

It cracks me up that AI people are just getting around to noticing this. I guess they've never ridden a bicycle, threaded a needle, or done any of the myriad other complex tasks that require intelligence in the fingers or other parts of the body: the processing power may be in the brain, but a huge amount of the work is being done via complex multi-sensorial feedback from the whole body.

This actually comes really close to a pretty recent argument against brain-in-a-vat thought experiments. Envatted brain thought experiments try to illustrate that cognition resides solely in the brain. However, if you really think about the experiment carefully, an envatted brain would require something so similar to a body that it could be said to be a surrogate body. This article [utoronto.ca] written by a Philosophy professor at the University of Toronto, Evan Thompson, explains this argument in much greater detail.

I only speak for myself here, but it seems like thinking about letters is actually harder than typing on a keyboard.

This is probably true for anyone who has use of at least one functional limb. Similarly, typing by dictation is easier for anyone who can speak. For people who have neither the use of a limb nor speech (total paralysis for example), typing with brain waves may be an attractive interface.

Though the article's recorded rate of "up to" 8 characters per minute means it will be quite a while before this replaces the keyboard.

I don't know about "normal people" but for me, if I had to think of each letter, I would probably forget what I was thinking in the first place. When I type, I simply think of the words I want to say and they come out through my finger movements. So, if this technique of mind reading becomes more advanced and entire words can be recognized, then we would have something useful.

Though it's great for people with no other means of communication, there are two main obstacles I see for everyday use: speed, and words. Speed: "I've seen people do up to eight characters per minute," Wilson says. Nothing else needs to be said. Words: When I type, I don't think about typing individual letters so much as I think about typing the words in the sentence. I'm no neuroscientist, but I would wager that this doesn't trigger the part of the brain that they're reading the letters from; or if it does, not reliably enough to be read.

For normal people it could be slower than typing. You would have to think of a letter, and for long enough. Isolated letters usually don't have strong associations, which could make it very complex to determine which one you're thinking of.

I think between the time I think of something to type, and the time I use my fingers to put it on the screen, I'm forced to focus a little more to put my thought into a communicable form that will make sense to someone else.

And really, actually having to think of each individual letter (something my brain sends to my fingers in a fairly automatic fashion) seems like more effort to me than just pushing a button and having the letter pop up on the screen.

If the craniotomy doesn't turn you off, the constant risk of infection should. If that STILL doesn't do it, you have to get the whole thing redone after a while because the brain builds scar tissue around the electrodes, eventually cutting them off.

What they've found here is that they can map certain patterns of brainwaves to known facts when they are expecting one of a small set of patterns at a specific time. There are obvious applications for this with people who can't communicate any other way, but beyond that they fall into the same trap AI and speech recognition are already in. Picking out a letter, word, or thought from all the other noise inside a person's head has to be orders of magnitude more difficult than understanding spoken text.

Pair some machine learning up with this to figure out what fires when I'm typing something, perhaps "The quick red fox jumped over the lazy brown dog", or the dictionary.

And, hell, have me look at and read a visual dictionary or encyclopaedia, similar to Leeloo in The Fifth Element; that way, when I think of an image or concept it's typed. Anything that I can't specifically correlate to something I've seen I'd need to think about how to spell out.
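
The calibration idea floated here (record what fires while the user works through a known text, then match new activity against it) could be sketched as a nearest-centroid classifier. This is purely illustrative: the feature vectors below are invented stand-ins for real electrode data, and the letters and patterns are hypothetical.

```python
# Toy nearest-centroid sketch of the "train on a known text" idea:
# average the feature vectors recorded for each letter during
# calibration, then label new vectors by the closest centroid.
from collections import defaultdict
import math
import random

def train_centroids(samples):
    """samples: list of (letter, feature_vector) pairs from calibration."""
    sums, counts = {}, defaultdict(int)
    for letter, vec in samples:
        if letter not in sums:
            sums[letter] = list(vec)
        else:
            sums[letter] = [a + b for a, b in zip(sums[letter], vec)]
        counts[letter] += 1
    return {l: [x / counts[l] for x in s] for l, s in sums.items()}

def classify(centroids, vec):
    """Return the letter whose centroid is nearest (Euclidean distance)."""
    return min(centroids, key=lambda l: math.dist(centroids[l], vec))

# Fake calibration: each letter has a characteristic pattern plus noise.
random.seed(0)
patterns = {"t": [1.0, 0.0], "h": [0.0, 1.0], "e": [1.0, 1.0]}
calib = [(l, [x + random.gauss(0, 0.1) for x in p])
         for l, p in patterns.items() for _ in range(20)]
centroids = train_centroids(calib)
print("".join(classify(centroids, patterns[l]) for l in "the"))  # -> the
```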

Because I can type faster than I can consciously think of all the letters involved, and I'd rather not have the unconscious do the selection of letters, since it nnlk2f0 momsosbsbg 30jmgmgea0kaa kms9oj3f smov amsalk s.

A technology that allows you to type at a blistering eight characters per minute after getting invasive brain surgery isn't for you. A quadriplegic might decide it is worth the effort to them. Especially one that has other issues that prevent mouth-typing or other adaptations. And doubly so if they could control a wheelchair and home assistance systems by thinking specific patterns.

I can think of some other applications for something like this.. communication with someone i

When writing a document, I don't think of individual letters, I think of the word, because I automatically know how to write it. With this method, I'll be thinking of letters and I might lose track of the words, conjugation, pluralization, or even the entire sentence. This method seems unproductive unless you can get it to recognize entire words.

This is an old desire. The amount of electrical noise in a nervous system is very large, compared to the relevant signals. The result is that no matter what you do with all the processing, you have to monitor for roughly 500 msec to detect a real signal. So unless you're content typing fewer than two characters per second, and don't mind doing lots of corrections, it's not worth the effort and expense.
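
A quick back-of-envelope check of the ceiling implied by that 500 ms window (the 5-characters-per-word convention for WPM is an assumption, not something stated above):

```python
# If each reliable detection needs ~500 ms of monitoring, throughput
# is capped at 2 characters per second, or about 24 words per minute
# at the conventional 5 characters per word.
detection_window_s = 0.5
chars_per_sec = 1 / detection_window_s
wpm = chars_per_sec * 60 / 5
print(chars_per_sec, wpm)  # -> 2.0 24.0
```

Even that ceiling is optimistic next to the "up to eight characters per minute" reported in the article.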

It [p0rn] would [p0rn] be [p0rnp0rnp0rn] be [p0rn] interesting [p0rnp0rn] to see [p0rn] how [p0rn] well this would work [p0rnp0rnp0rn] compared to [p0rn] technologies [p0rn] like voice [p0rn] recognition [p0rnp0rn].

I wonder why they didn't try something like Dasher [cam.ac.uk]. This uses simple two-axis control to choose letters as they fly by. I would think this kind of method would be better than having to train for each individual letter.