Researchers discovered that keeping the head still but shifting the eyes to one side or the other sparks vibrations in the eardrums, even in the absence of any sound.

Surprisingly, these eardrum vibrations start slightly before the eyes move, indicating that motion in the ears and the eyes is controlled by the same motor commands deep within the brain.

“It’s like the brain is saying, ‘I’m going to move the eyes, I better tell the eardrums, too,’” says Jennifer Groh, a professor in the departments of neurobiology and psychology and neuroscience at Duke University.

The findings, which the researchers replicated in both humans and rhesus monkeys, provide new insight into how the brain coordinates what we see and what we hear. It may also lead to new understanding of hearing disorders, such as difficulty following a conversation in a crowded room.

Sight and sound

It’s no secret that the eyes and ears work together to make sense of the sights and sounds around us. Most people find it easier to understand somebody if they are looking at them and watching their lips move. In a famous illusion called the McGurk Effect, for instance, videos of lip cues dubbed with mismatched audio cause people to hear the wrong sound.

But researchers are still puzzling over where and how the brain combines these two very different types of sensory information.

“Our brains would like to match up what we see and what we hear according to where these stimuli are coming from, but the visual system and the auditory system figure out where stimuli are located in two completely different ways,” Groh says.

“The eyes are giving you a camera-like snapshot of the visual scene, whereas for sounds, you have to calculate where they are coming from based on differences in timing and loudness across the two ears,” she says.
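The timing cue Groh mentions, the interaural time difference, can be sketched with simple geometry: a sound arriving from off to one side travels a slightly longer path to the far ear. The sketch below is a textbook approximation, not part of the study; the head width and the simple sine model are illustrative assumptions.

```python
import math

def interaural_time_difference(angle_deg, head_width_m=0.18, speed_of_sound=343.0):
    """Approximate interaural time difference (ITD), in seconds, for a
    sound source at angle_deg (0 = straight ahead, 90 = directly to one
    side), using a simplified head model.  The path-length difference
    between the ears, divided by the speed of sound, gives the
    arrival-time difference the brain can exploit.
    """
    return (head_width_m / speed_of_sound) * math.sin(math.radians(angle_deg))

# A source 45 degrees to the right reaches the left ear roughly 0.37 ms
# after the right ear.
print(round(interaural_time_difference(45) * 1e6), "microseconds")
```

Differences of a few hundred microseconds are enough for the auditory system to estimate where a sound is coming from, which is exactly the kind of cue the eyes never need.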

Because the eyes are usually darting about within the head, the visual and auditory worlds are constantly in flux with respect to one another, Groh adds.
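One way to picture the mismatch Groh describes: the ears report a sound's direction relative to the head, while the retina reports a visual location relative to where the eyes currently point. A minimal sketch of the remapping the brain must perform, in some form, on every eye movement (the function name and angles here are illustrative, not from the study):

```python
def sound_in_eye_centered_frame(sound_azimuth_head_deg, eye_azimuth_head_deg):
    """Convert a head-centered sound direction into an eye-centered one
    by subtracting the current horizontal eye position.  A positive
    result means the sound is to the right of where the eyes point.
    """
    return sound_azimuth_head_deg - eye_azimuth_head_deg

# A sound fixed 10 degrees to the right of the nose:
print(sound_in_eye_centered_frame(10, 0))   # eyes straight ahead -> 10
print(sound_in_eye_centered_frame(10, 20))  # eyes 20 deg right -> -10
```

The sound itself never moved, yet its position relative to gaze flipped from right to left — which is why the brain needs to know about every eye movement to keep the two maps aligned.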

Working together

In an experiment designed by Kurtis Gruters, a former doctoral student in Groh’s lab and co-first author of the paper, 16 participants were asked to sit in a dark room and follow shifting LED lights with their eyes. Each participant also wore small microphones in their ear canals that were sensitive enough to pick up the slight vibrations created when the eardrum sways back and forth.

Though eardrums vibrate primarily in response to outside sounds, the brain can also move them using the small bones of the middle ear and hair cells in the cochlea. These mechanisms help modulate the volume of sounds that ultimately reach the inner ear and brain, and produce small sounds known as otoacoustic emissions.

Researchers discovered that when the eyes moved, both eardrums moved in sync with one another, one bulging inward as the other bulged outward. They continued to vibrate back and forth together until shortly after the eyes stopped moving. Eye movements in opposite directions produced opposite patterns of vibrations.
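The pattern described above — the two eardrums oscillating in antiphase, with the phase determined by the direction of the eye movement — can be captured in a toy model. Everything numeric here (frequency, amplitude) is purely illustrative, not a measured value from the paper:

```python
import math

def eardrum_displacement(t, eye_move_right=True, freq_hz=30.0, amp=1.0):
    """Toy model of the reported vibration pattern: the left and right
    eardrums move in antiphase (one bulges in as the other bulges out),
    and reversing the eye-movement direction flips the phase of both.
    Returns (left, right) displacement at time t in seconds.
    """
    sign = 1.0 if eye_move_right else -1.0
    left = sign * amp * math.sin(2 * math.pi * freq_hz * t)
    return left, -left  # the two drums always mirror each other

left, right = eardrum_displacement(0.01, eye_move_right=True)
print(left == -right)  # antiphase at every instant -> True
```

Flipping `eye_move_right` negates both traces, which mirrors the finding that opposite eye movements produce opposite vibration patterns.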

“The fact that these eardrum movements are encoding spatial information about eye movements means that they may be useful for helping our brains merge visual and auditory space,” says David Murphy, a doctoral student in Groh’s lab and co-first author on the paper.

“It could also signify a marker of a healthy interaction between the auditory and visual systems,” Murphy adds.

The team, which included Christopher Shera at the University of Southern California and David W. Smith at the University of Florida, is still investigating how these eardrum vibrations impact what we hear, and what role they may play in hearing disorders. In future experiments, they will look at whether up and down eye movements also cause unique signatures in eardrum vibrations.

“The eardrum movements literally contain information about what the eyes are doing,” Groh says. “This demonstrates that these two sensory pathways are coupled, and they are coupled at the earliest points.”