Have you ever wanted to control your computer with your mind? I have. And come next December, maybe we will. It’s been almost a year since I first talked about Emotiv Systems and the company’s EPOC headset. They’ve stayed somewhat secretive since then (a crypticness and stealth that also extended to my email correspondence with them). But on February 19th they came out of the cave at the GDC’08 conference with a brand-new bone: the latest version of their consumer-based brain-computer interface, which is quite frankly geeking me out. The headset will be marketed to the game industry and is expected to go for $299. Read on for what to expect. The features are, well, pretty unbelievable.
I’ll start off by noting that there were not one but two companies demoing brain-computer interfaces at GDC this year: Emotiv Systems with the EPOC neuroheadset, which we’ll be focusing on here, and NeuroSky. The latter plans to sell its sensors and technologies to partners and will not be developing a specific headset of its own (they had a demo unit at GDC just to show the functionality of their systems).

What Are Brain-Computer Interfaces and the EPOC Neuroheadset?

If you’ve read Think Artificial before, you’re probably somewhat familiar with BCIs: brain-computer interfaces, devices that allow us to control machines using only our minds.

The key technology is called electroencephalography (EEG): a device monitors your brain’s electrical activity via sensors on your scalp. It’s been used for medical purposes for years. The futuristic image on the side here, for example, depicts the setup for a musical brainwave performance at the Deconism Gallery in 2003, where a concert audience was hooked up to EEG devices to affect the music and lighting.

However, monitoring the waves is different from detecting their patterns and using them as reliable “triggers”, which is what Emotiv Systems’ EPOC device and software does. For this to work, two things are essentially required: first, the user has to practice producing a repeatable, recognizable pattern; second, because there’s always noise (it takes practice to visualize the same image or sequence consistently), the software deciphering the electrical activity must learn to recognize the trigger waves.
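To make that two-part loop concrete, here’s a minimal sketch of what such trigger recognition could look like in principle: average a few practice trials into a template, then fire when a new window of signal correlates strongly with it. The function names and the 0.8 threshold are illustrative assumptions on my part, not Emotiv’s actual algorithm.

```python
# A minimal sketch of "trigger wave" recognition: average several
# practice trials into a template, then fire when a new window of
# signal correlates strongly with it. Names and the 0.8 threshold
# are illustrative assumptions, not Emotiv's actual algorithm.

def train_template(trials):
    """Average repeated practice recordings of the same thought
    into a single reference template."""
    n = len(trials[0])
    return [sum(t[i] for t in trials) / len(trials) for i in range(n)]

def correlation(a, b):
    """Pearson correlation between two equal-length signals."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    sd_a = sum((x - mean_a) ** 2 for x in a) ** 0.5
    sd_b = sum((y - mean_b) ** 2 for y in b) ** 0.5
    return cov / (sd_a * sd_b) if sd_a and sd_b else 0.0

def is_trigger(window, template, threshold=0.8):
    """Fire the trigger when the incoming window looks enough
    like the trained template."""
    return correlation(window, template) >= threshold
```

The practice step the text describes corresponds to `train_template`; the software’s “learning to recognize” step is, in this toy version, just a correlation threshold.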

The Emotiv EPOC Neuroheadset uses a set of sensors to tune into electric signals naturally produced by the brain to detect player thoughts, feelings and expression. It connects wirelessly with all game platforms from consoles to PCs. The Emotiv neuroheadset now makes it possible for games to be controlled and influenced by the player’s mind. [link]

Emotiv Systems has spent four years on R&D and come up with a commercially viable BCI, at a remarkably low price considering its capabilities and the fact that this is the first time such technology has hit the general consumer market. Which brings us to its features.

What Emotiv’s EPOC Neuroheadset Can Do

Let’s start off with an easily digestible list of features expected to be bundled with the first release of EPOC:

Wireless headset – 12 hour battery-life (playing time)

Demo Game – Demonstrates and makes use of the headset’s features

Emortal – Access to an online hub that allows users to interact with photos and music using Epoc

The EPOC system comprises three main software components, each of which detects a different kind of brainwave activity.

The Affectiv suite can reportedly measure the emotional states of the user: anger, fear, frustration. Emotiv puts forth the example that a game could use this to increase or decrease its difficulty level depending on the player’s state of mind. The Cognitiv suite is the control mechanism that allows players to manipulate objects, and the Expressiv suite measures and interprets the user’s facial expressions. The descriptions and demos are vivid. For example: you smile, and thus your avatar smiles.
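Emotiv’s difficulty example can be sketched in a few lines: ease off when the player reads as frustrated, push harder when they read as bored. The 0.0–1.0 “frustration” score, the thresholds, and the function name are my assumptions here, not the Affectiv suite’s actual API.

```python
# An illustrative sketch of Emotiv's difficulty-adjustment example:
# nudge game difficulty up or down based on an emotional reading.
# The 0.0-1.0 "frustration" score, the thresholds and the function
# name are assumptions, not the Affectiv suite's real API.

def adjust_difficulty(level, frustration, low=0.3, high=0.7):
    """Lower the difficulty when the player reads as frustrated,
    raise it when they read as bored or under-challenged."""
    if frustration > high:
        return max(1, level - 1)   # ease off a frustrated player
    if frustration < low:
        return level + 1           # push a bored player harder
    return level                   # comfortable: leave it alone
```

A game would presumably call something like this once per level or checkpoint rather than every frame, so the difficulty doesn’t oscillate with every passing mood.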

One of my earliest questions regarding EPOC was: can the system discern many patterns at the same time, without any knowledge beforehand of what you’re trying to accomplish?

Most of what I’ve seen from their demos is task-and-turn based, where the player is moved between “phases”, each of which requires using one and only one specific action at a time. The Stonehenge demo, for example, moves the player from stone to stone, but the player applies only one action to each stone (e.g. “rotate” or “lift”, not both).

Let’s elaborate. A user inside Second Life has created a plain box. My question is: can the system handle rotating the box while the user is smiling (and making the avatar smile)? Or rotating the box, moving it a bit forward, then up, perhaps even rotating and lifting it at the same time? Is all of this possible? Because if it were, I’d be geeked out.

Keymap Your Brainwaves

I got mail yesterday. And I geeked out. The letter was from Emotiv, reporting, amongst other things, more information on EmoKey: their software for mapping mental intentions to keystrokes (yes, meaning the EPOC headset will be connectable to virtually any application).

The descriptions almost sound surreal:

EmoKey Software – Use the Emotiv EPOC with your existing software

In our efforts to enable our users, Emotiv has developed the EmoKey software application in conjunction with the Emotiv EPOC. EmoKey allows you to associate any of the Emotiv EPOC detections with keystrokes on your PC. EmoKey enables all of your existing PC software to be Emotiv EPOC compatible right out of the box! In practice, this means that you can link a “smile” detection to type the “smiley emoticon” in your chat application or link a thought, such as “rotate clockwise” to a series of keystrokes such as “a-w-d-s-a-w-d-s” to rotate your magic wand!

This appears to indicate that you can do basically any action, at any time, anywhere. Right? Well, almost. It’s not clear whether you can only “press” one button at a time (“a, then w, then d, then s…”), or whether you can press many buttons at the same time. There could even be a third case where you can press three buttons at a time, one from each detection suite (unlikely).

However, I’d venture the guess that one EmoKey feature is the ability to define a “virtual button” (if not, please spread the idea to Emotiv!). This would let you compose a series of virtual buttons: a specific thought could then be assigned to a series of them, “ctrl+a, ctrl+w, …”, instead of single physical buttons, thereby enabling you to press two buttons at the same time, like rotating something while smiling. Which brings up the question: how many mappings can there be?
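If something like that exists, the mapping table might be modeled as below: each detection maps to a sequence of keystrokes, and each keystroke may itself be a chord of several keys pressed together. The detection names and the table format are my own illustrative guesses, not EmoKey’s real configuration format.

```python
# A hypothetical model of EmoKey-style mappings, including the
# "virtual button" idea: each detection maps to a sequence of
# keystrokes, and each keystroke may itself be a chord of keys
# pressed together (e.g. ctrl+a). The detection names and table
# format are illustrative guesses, not Emotiv's actual EmoKey.

EMOKEY_MAP = {
    # type ":)" in the chat window, one key after another
    "smile": [(":",), (")",)],
    # the "rotate clockwise" thought from Emotiv's example
    "rotate_clockwise": [("a",), ("w",), ("d",), ("s",)],
    # a "virtual button" sequence: two chords in a row
    "lift": [("ctrl", "a"), ("ctrl", "w")],
}

def keystrokes_for(detection):
    """Render a detection's keystroke sequence as strings,
    joining chords with '+' (so ("ctrl", "a") becomes "ctrl+a")."""
    return ["+".join(chord) for chord in EMOKEY_MAP.get(detection, [])]
```

Under this model, “rotating something while smiling” would just mean two detections firing concurrently, each driving its own entry in the table.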

Now We Know How Santa Will Control His Robotic Reindeer Next Christmas

Regardless of questions and concerns, this is an incredible device that I would love to get my hands on. Granted, it is the first commercial edition, and we can expect that things may not run as smoothly as we hope. This video of someone trying the beta demo is actually the worst example I could find. I say “actually” because I don’t think it’s bad at all. The upload date and the specific demo indicate this was at GDC’08, which means it’s probably a first-time user.

Looking forward to seeing and hearing more about EPOC. In a recently featured article I talked about what great virtual reality gear could be, an important part of which is a non-invasive EEG device like EPOC for movement control.

Imagine making your avatar smile, not via keystrokes but simply by smiling yourself; imagine walking towards that interesting monolith in the distance by seeing it happen in your mind.

Hi Tzipora. Sorry to hear about your boyfriend. The EPOC monitors neural activity in the brain, so given that the brain itself is intact, it’s fairly certain he’ll be able to use the Affectiv and Cognitiv suites.

But it’s hard to say whether it will function perfectly without knowing what can disturb the Epoc sensors. And I’m not sure what the Expressiv suite monitors.

You could try contacting Jonathan Geraci at the Emotiv group on Facebook (his email is on the page). And please share the answer with the rest of us here!

Hello,
I’m Havoc, one of the top gamers on the internet, I’d like to think. I have beta tested games, hardware and new inventions for over 11 years. In August 2006 I had a work-related accident, damaging and crushing six disks from C3 to C7 in my neck. I think this would be something I could use to benefit my gaming and other computer duties. I run a small community of 300+ gamers, but having thousands of gaming friends, this could be used for the next generation of online gaming competitions. I can just imagine never having to touch a keyboard again. I would love to beta test this in the gaming genre to see just what it is, how it works, and how it can be better implemented for gaming.

If you need a beta tester for it, let me know; I’m really interested to see how this would work under the stress of gaming. I will be posting more info about this on the clan site. You have a great system here. Let’s work this baby in to get her into the limelight.

Hello Emotiv!!! Hoooagh!!! I am a telepathic person with a lot of experience being monitored on DARPA’s application for locating and monitoring persons with higher-amplitude brain wave patterns. You guys are no doubt aware of this system. Your system is never gonna have the bandwidth necessary to do the real stuff using Bluetooth, and I guess you know it. For advanced neurofeedback and pure two-way telepathy, you will always have to use a hardwired headset and a much higher resolution A/D converter, like DARPA does in the laboratory for real synthetic telepathy. But the idea of extending the sensitivity down into the “everybody” amplitude is awesome. What I would like is to be able to think directly into a word processor for writing. I understand it’s right around the corner. Kudos to you guys for the work you are doing, and for bravery in the face of the “Remaining Classified” crowd. As you know, some of these people have very weird and threatening ideas concerning telepathy, and will not accept the simple scientific fact that we are, and always have been, a telepathic species.

This sort of reminds me of the movie The Lawnmower Man, with Pierce Brosnan. In the movie, a scientist brought in a mentally disabled man he thought could use this piece of technology; instead the man got what they call a big head from the machine he was using. What I am trying to say is that the technology they were trying to use was way ahead of the learning curve of the subject using it! The movie pointed out the danger that technology can be misused: he didn’t like what he saw in the database and wanted to control everything, and the scientist realized a little too late that he had to destroy what he created. Also, certain people wanted to get their hands on such devices for their own purposes, though by the way, I think that part wasn’t in the movie. Go rent The Lawnmower Man and see for yourself. Signed, Mr. James of Ashland, Ore., out.

So if this device can produce the same brain waves that your brain receives, then why not tweak the waves to use certain commands, like using the full percent of your brain? Hey, I’m only 17 years old, but I’ve already created half of the machine; the rest of the supplies I need cost thousands of dollars. And that device uses the exact same supplies as mine. You could change the world.