Lecture by Anne Keitel: How auditory and motor cortices contribute to speech comprehension

How the human brain makes sense of a continuous speech stream is of interest for neuroscience, linguistics, and research on language disorders. Previous work that examined dynamic brain activity has addressed the issue of comprehension only indirectly, for example by contrasting intelligible with unintelligible speech. Recent work, however, suggests that brain areas can show similar stimulus-driven activity yet contribute differently to perception or comprehension. In this talk, I will focus on our recent study (Keitel, Gross & Kayser, 2018, PLoS Biology), which directly addressed the perceptual relevance of dynamic brain activity for speech encoding by using a straightforward, single-trial comprehension measure. Furthermore, previous work has been vague about the time-scales analysed. We therefore based our analysis directly on the time-scales of phrases, words, syllables, and phonemes in our speech stimuli. By incorporating these two conceptual innovations, we demonstrate that distinct brain areas track acoustic information at the time-scales of words and phrases. Moreover, our results suggest that the motor cortex uses a cross-frequency coupling mechanism to predict the timing of phrases in ongoing speech. I will also elaborate on further findings concerning the motor system's multifaceted involvement in speech processing. To sum up, our recent findings point to spatially and temporally distinct brain mechanisms that directly shape our comprehension.