How does your brain process sounds (including language) in a realistic acoustic environment where there are multiple sounds competing for your attention?

A problem:

How do you study neural responses in higher auditory cortical regions when you don't even know what stimuli they will respond to? These neurons fail to respond to the usual battery of synthetic test stimuli, such as pure tones and filtered noise.

Auditory cortical regions of the marmoset. The inset on the right shows the connections of the primary auditory cortex with higher auditory regions. Figure adapted from de la Mothe et al., 2006.

Solution:

Monitor neural activity in the brain while the animal listens to natural sounds, then correlate the neural signal with the stimulus to recover what the neurons are responding to. This technique is known as reverse correlation, illustrated below.
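Reverse correlation is commonly implemented as a spike-triggered average: the stimulus segments preceding each spike are averaged, revealing the feature the neuron prefers. A minimal sketch in Python (the function name and the synthetic threshold "neuron" are illustrative, not the lab's actual analysis pipeline):

```python
import numpy as np

def spike_triggered_average(stimulus, spike_times, window):
    """Estimate a neuron's preferred stimulus feature by averaging
    the `window` samples of stimulus preceding each spike."""
    segments = [stimulus[t - window:t]
                for t in spike_times if t >= window]
    return np.mean(segments, axis=0)

# Synthetic demo: a toy "neuron" that spikes one sample after the
# stimulus exceeds a threshold (purely for illustration).
rng = np.random.default_rng(0)
stim = rng.standard_normal(10_000)
spikes = [t for t in range(20, 10_000) if stim[t - 1] > 2.0]
sta = spike_triggered_average(stim, spikes, window=20)
# The last lag of the STA recovers the triggering feature; the
# earlier lags average out toward zero.
```

With real recordings the same averaging is typically done over the sound's spectrogram rather than the raw waveform, yielding a spectro-temporal receptive field.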

A prototype telemetry system that enables this data collection has been developed and tested (block diagram illustrated below).

Here's a picture of the custom 5.8 GHz transmitter (top and bottom of RF PLL and VCO board):

Here are several spectrograms of marmoset vocalizations that were transmitted over the acoustic channel of the telemetry system. Shown below are a trill-phee, juvenile calls, and tsik calls.
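A spectrogram like those shown is just the magnitude of a short-time Fourier transform of the recorded waveform. A minimal sketch (window and hop sizes, and the chirp used as a stand-in for a vocalization, are illustrative assumptions):

```python
import numpy as np

def spectrogram(x, fs, win=256, hop=128):
    """Magnitude short-time Fourier transform: slide a Hann window
    along the signal and FFT each frame, giving a (freq, time) image."""
    window = np.hanning(win)
    frames = [x[i:i + win] * window
              for i in range(0, len(x) - win + 1, hop)]
    spec = np.abs(np.fft.rfft(frames, axis=1)).T   # rows = frequency
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)       # Hz per row
    times = np.arange(spec.shape[1]) * hop / fs    # s per column
    return freqs, times, spec

# Demo: an upward frequency sweep, loosely mimicking the rising
# pitch contour of a marmoset call (synthetic, for illustration).
fs = 48_000
t = np.arange(0, 0.5, 1 / fs)
x = np.sin(2 * np.pi * (5_000 + 4_000 * t) * t)
freqs, times, spec = spectrogram(x, fs)
```

Plotting `spec` on a log scale against `times` and `freqs` reproduces the familiar time-frequency picture; the peak frequency in each column traces the call's pitch contour.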