The research, led by Jay Sanguinetti of the University of Arizona, challenges currently accepted models of how the brain processes visual information.

Sanguinetti, a doctoral candidate in the UA’s department of psychology in the College of Science, showed study participants a series of black silhouettes, some of which contained recognizable, real-world objects hidden in the white spaces along their outside edges.

Working with John Allen, Distinguished Professor of psychology, cognitive science and neuroscience at the University of Arizona, Sanguinetti monitored subjects’ brainwaves with an electroencephalogram, or EEG, while they viewed the objects.

“There’s a brain signature for meaningful processing,” Sanguinetti said. Participants’ EEG data showed that signature, called the N400: a peak in the oscillating brainwaves that occurs about 400 milliseconds after an image is shown.

“The participants in our experiments don’t see those shapes on the outside; nonetheless, the brain signature tells us that they have processed the meaning of those shapes,” said Mary Peterson, a professor in the UA department of psychology and Sanguinetti’s advisor. “But the brain rejects them as interpretations, and if it rejects the shapes from conscious perception, then you won’t have any awareness of them.”

Importantly, the N400 waveform did not appear in subjects’ EEG data when they viewed truly novel silhouettes that contained no hidden real-world objects.
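The comparison described here, a deflection around 400 milliseconds that appears for meaningful silhouettes but not for novel ones, can be illustrated with a toy sketch. This is not the authors’ analysis pipeline; it simply simulates epoched EEG data with NumPy and compares the average amplitude in an assumed 300–500 ms window between two conditions. The sampling rate, window bounds, and component shape are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only -- not the study's actual analysis pipeline.
# Simulates epoched EEG (trials x samples) and compares mean ERP amplitude
# in an assumed N400 window (~300-500 ms post-stimulus) between a
# "meaningful silhouette" condition and a "novel silhouette" condition.

rng = np.random.default_rng(0)
fs = 250                       # sampling rate in Hz (assumed)
t = np.arange(0, 0.8, 1 / fs)  # 0-800 ms post-stimulus

def simulate_epochs(n_trials, n400_amplitude):
    """Toy epochs: Gaussian noise plus a deflection centered at 400 ms."""
    component = n400_amplitude * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))
    noise = rng.normal(0.0, 1.0, size=(n_trials, t.size))
    return noise + component

def n400_mean_amplitude(epochs):
    """Average across trials (the ERP), then average the 300-500 ms window."""
    erp = epochs.mean(axis=0)
    window = (t >= 0.3) & (t <= 0.5)
    return erp[window].mean()

# Negative-going deflection for the "meaningful" condition, none for "novel".
meaningful = simulate_epochs(100, n400_amplitude=-3.0)
novel = simulate_epochs(100, n400_amplitude=0.0)

amp_meaningful = n400_mean_amplitude(meaningful)
amp_novel = n400_mean_amplitude(novel)
print(f"meaningful: {amp_meaningful:.2f}, novel: {amp_novel:.2f}")
```

Averaging across many trials is what makes the small stimulus-locked component visible above the trial-to-trial noise; the window mean then gives a single per-condition number to compare.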

These findings lead to the question of why the brain would process the meaning of a shape when a person is ultimately not going to perceive it, Sanguinetti noted.

“Many, many theorists assume that because it takes a lot of energy for brain processing, that the brain is only going to spend time processing what you’re ultimately going to perceive,” added Peterson. “But in fact the brain is deciding what you’re going to perceive, and it’s processing all of the information and then it’s determining what’s the best interpretation.”

“This is a window into what the brain is doing all the time,” Peterson said. “It’s always sifting through a variety of possibilities and finding the best interpretation for what’s out there. And the best interpretation may vary with the situation.”

Our brains may have evolved to sift through the barrage of visual input in our eyes and identify those things that are most important for us to consciously perceive, such as a threat or resources such as food, Peterson suggested.

“There are a lot of complex processes that happen in the brain to help us interpret all this complexity that hits our eyeballs,” Sanguinetti said. “The brain is able to process and interpret this information very quickly.”

Sanguinetti’s study indicates that ultimately, when we walk down a street, our eyes perceive and our brains recognize meaningful objects, even though we may never be consciously aware of them.

In the future, Peterson and Sanguinetti plan to look for the specific regions in the brain where the processing of meaning occurs. “We’re trying to look at exactly what brain regions are involved,” said Peterson. “The EEG tells us this processing is happening and it tells us when it’s happening, but it doesn’t tell us where it’s occurring in the brain.”