Brain-Computer Interfaces: Interactions at the Speed of Thought

Where open source software and brain-computer interfaces intersect, there are opportunities to revolutionize interaction design.

Article No. 1173 | January 17, 2014 | by Hunter Whitney

Think about it: what if many of the interactions you currently have with your devices could be performed using your thoughts alone? No keyboards, remotes, or handheld game controllers would be necessary to tell the machine what you want it to do. Although there are many technical hurdles to overcome, the connections between our brains and the devices we use are increasing. This trend is expanding and redefining the range of possibilities for UX designers.

A current Kickstarter campaign called OpenBCI (brain-computer interface) is an example of the convergence of broadening interest in neuroscience with the ethos of the open source software and maker movements. OpenBCI founders Joel Murphy (pictured above at the OpenBCI/Thoughtworks Hackathon) and Conor Russomanno describe the project on their Kickstarter page this way: “OpenBCI is a low-cost, programmable, open-source EEG platform that gives anybody with a computer access to their brainwaves.”

Tuning in to Our Brains

Some methods for opening up channels of communication between mind and machine are more invasive than others. Although implanting sensors directly into the brain can provide a highly precise and powerful connection, this approach obviously has serious drawbacks. A common non-invasive approach, electroencephalography (EEG), involves putting electrodes on the scalp to detect the brain's electrical activity; from a medical point of view, putting on an electrode cap is not regarded as invasive at all.

EEGs are fairly easy to use for sensing brain activity, but they do have drawbacks. Imagine there’s a raucous party and you are standing in an adjoining room. If you put your ear up to the wall at different places, you may be able to discern a few clear patterns in all the noise. For example, you might know what kind of music is playing and even get a sense of some conversations. If you put a cup against the wall, you could get a clearer sense of what you are hearing. Similarly, our brains are filled with a cacophony of electrical signals, which are a fundamental part of how our brain cells communicate with each other.

Although our brain cells’ myriad activities generate a lot of electrical “noise” and “crosstalk,” some patterns can be relatively easy to discern. There are, for example, typical wave patterns that form in awake and sleep states. Certain kinds of physical actions such as blinking or jaw clenching generate a particular kind of wave pattern. With the right equipment, including items like an electrode cap and signal processor, the signals can be captured, visualized, and ultimately used to perform actions such as switching on and off a device or moving a virtual object in a game.
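The detection step described above can be sketched in a few lines of code. The following is a minimal, illustrative example (not OpenBCI's actual API): it builds a synthetic one-second EEG trace with NumPy, injects two blink-like voltage spikes, and flags them with a simple amplitude threshold. The sampling rate, threshold, and refractory window are assumed values chosen for the demonstration; real blink detection is more involved.

```python
import numpy as np

def detect_blinks(signal, fs, threshold_uv=100.0, refractory_s=0.3):
    """Return sample indices where the signal exceeds an amplitude
    threshold, skipping crossings inside a refractory window.
    Blink artifacts appear in EEG as large, brief voltage spikes."""
    refractory = int(refractory_s * fs)
    blinks, last = [], -refractory
    for i, v in enumerate(signal):
        if abs(v) > threshold_uv and i - last >= refractory:
            blinks.append(i)
            last = i
    return blinks

# One second of synthetic "background" EEG: a 10 Hz alpha rhythm
# plus noise, with two injected blink-like spikes.
fs = 250  # samples per second, a common EEG sampling rate
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
eeg = 20 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 5, fs)
eeg[60:65] += 300    # blink near 0.24 s
eeg[180:185] += 300  # blink near 0.72 s

blinks = detect_blinks(eeg, fs)
device_on = len(blinks) % 2 == 1  # toggle a virtual switch per blink
print(len(blinks), device_on)
```

In a real pipeline, the raw samples would come from the amplifier over a serial or wireless link, and the detected events would drive the "switch a device on and off" interaction described above.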

That's just the beginning; the idea behind opening up the technology to enthusiasts is to enable all kinds of interesting and useful BCIs. The possibilities have both beneficial and disturbing implications. One thing is certain: opening more direct channels between our brains and devices will transform interactions we currently take for granted and create entirely new interactions, including:

Replacing lost physiological functions due to illness or injury—from thought-controlled prosthetic limbs to computer-assisted vision

Applications where split-second decision making is essential for first responders, commercial pilots, or military personnel

Neurogaming, which would enable users to perform actions with their thoughts alone

The concept of BCIs is even making its way into mainstream entertainment: a new TV series called Intelligence features an operative with a microchip implanted in his brain that helps him solve crimes.

The evolution of BCIs in academia, business, government, and the interested public is accelerating. How might these developments change the picture for UX designers? Just think … and share your thoughts in the comments below.

About the Author(s)

Hunter Whitney is a User Experience Designer who has helped create useful and usable interface designs for clients in areas ranging from bioscience and medicine to information technology and marine biology. In addition to his UX work, he has written numerous articles about a range of subjects, including data visualization, for various online and print publications.

Comments

This technology could be so helpful in the car as well. Forget complicated IVI systems that demand user input... this could significantly reduce distracted driving. Unless, of course, our autonomous vehicles get here first.

That is such an amazing topic; I am quite passionate about it.
I wrote a related article that you may find interesting as well, with a demo on a mobile device.
(You need to copy and paste the link.)
http://scn.sap.com/blogs/olivier_mercier/2013/07/16/business-at-the-speed-of-thought