With a combination of brain implants and sophisticated software, the team was able to decode their subjects’ brain signals and predict what the subjects were seeing. Thanks to this study, scientists gained insight into how people translate sensory information from an image into something the mind can comprehend.

In their work, which was published in PLOS Computational Biology, the researchers implanted electrodes for a week in seven epilepsy patients. The electrodes were initially implanted to determine where the seizures were originating, but the scientists saw an opportunity to use them to gather data for research (with patients’ permission, of course).

“We were trying to understand, first, how the human brain perceives objects in the temporal lobe, and second, how one could use a computer to extract and predict what someone is seeing in real time?” explained Rajesh Rao, one of the lead authors, in a press release.

UNDERSTANDING YOUR THOUGHTS

Image source: DOI 10.1371/journal.pcbi.1004660

With the electrodes in place, patients were shown a randomized sequence of images of human faces, houses, and blank gray screens, each displayed for 400 milliseconds. The patients were instructed to watch for an image of an upside-down house.

During this period, data was gathered from the electrodes. It reflected two kinds of signals: “event-related potentials,” the immediate voltage response produced when large populations of neurons fire after an image appears, and “broadband spectral changes,” activity that lingers after the image has been seen.
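As a rough illustration (not the authors’ actual analysis pipeline), the two kinds of signals can be sketched in NumPy on synthetic data: the event-related potential is the trial-averaged voltage deflection time-locked to image onset, while broadband power summarizes activity across a wide frequency range.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for one electrode: 50 trials x 400 ms sampled at 1 kHz,
# each trial containing a small evoked deflection plus noise.
n_trials, n_samples, fs = 50, 400, 1000
t = np.arange(n_samples) / fs
evoked = 5.0 * np.exp(-((t - 0.15) ** 2) / 0.001)  # peak ~150 ms after onset
trials = evoked + rng.normal(0, 2.0, size=(n_trials, n_samples))

# Event-related potential: average the voltage across trials,
# time-locked to image onset.
erp = trials.mean(axis=0)

# Broadband spectral power: total power across a wide frequency band
# (here 30-200 Hz, an illustrative choice), computed per trial from the FFT.
freqs = np.fft.rfftfreq(n_samples, d=1 / fs)
spectra = np.abs(np.fft.rfft(trials, axis=1)) ** 2
band = (freqs >= 30) & (freqs <= 200)
broadband_power = spectra[:, band].mean(axis=1)

print(f"ERP peak at {t[erp.argmax()] * 1000:.0f} ms after image onset")
print(f"mean broadband power per trial: {broadband_power.mean():.1f}")
```

Averaging across trials suppresses noise that is not time-locked to the image, which is why the ERP peak stands out cleanly even though single trials are noisy.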

A computer sampled and digitized the incoming data 1,000 times per second, which enabled it to relate the images the patients saw to the signals at each electrode location. From this, it could determine which locations were more sensitive to images of faces and which to images of houses. This data was used to train the software.
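A minimal sketch of that training-and-decoding idea, assuming per-trial feature vectors (e.g., ERP amplitudes and broadband power across several electrodes) have already been extracted and labeled. This toy nearest-class-mean classifier is a stand-in; the study’s actual decoder was more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-trial feature vectors with class-dependent means plus noise.
n_per_class, n_features = 40, 6
face_mean = np.full(n_features, 1.0)    # assumed "face" feature pattern
house_mean = np.full(n_features, -1.0)  # assumed "house" feature pattern
faces = face_mean + 0.5 * rng.normal(size=(n_per_class, n_features))
houses = house_mean + 0.5 * rng.normal(size=(n_per_class, n_features))

X = np.vstack([faces, houses])
y = np.array([0] * n_per_class + [1] * n_per_class)  # 0 = face, 1 = house

# "Training": store the mean feature vector (centroid) for each class.
centroids = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(x):
    """Classify a trial by its nearest class centroid (Euclidean distance)."""
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

# Decode new, unseen trials drawn from the same distributions.
new_faces = face_mean + 0.5 * rng.normal(size=(20, n_features))
new_houses = house_mean + 0.5 * rng.normal(size=(20, n_features))
preds = np.array([predict(x) for x in np.vstack([new_faces, new_houses])])
truth = np.array([0] * 20 + [1] * 20)
accuracy = (preds == truth).mean()
print(f"held-out accuracy: {accuracy:.0%}")
```

Because the classifier only stores class means, it can label images it has never seen before, as long as they evoke the same characteristic feature pattern, which mirrors how the study’s software generalized to new pictures.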

The patients were then shown a new set of face and house images that had not appeared in the earlier sequences. The computer interpreted the incoming brain signals and determined which image a subject was viewing with 96 percent accuracy, despite having no prior exposure to these new pictures.

It accomplished all that within 20 milliseconds of the data’s arrival, or nearly at the speed of perception itself.

More interesting still, the software could only accomplish this when it analyzed both the event-related potentials and the broadband spectral changes, suggesting that activity across the region, and not only from the neurons in question, is important to how a person perceives an object.

This research is significant, as scientists are currently trying to map our brains. In the end, brain mapping could identify how various neurons, and their locations, relate to how we process information and ultimately improve our understanding of how neurological diseases affect us.
