Neurobiology of Language

Music and language comprehension

What happens in the brain when we understand language and hear music? Richard Kunert investigated whether the structure of music and that of language are partially processed by the same brain systems. His research shows that structurally challenging sentences can alter harmonic judgments about music. Kunert will defend his thesis on 10 February at 12:30 in the aula of Radboud University Nijmegen.


In a behavioral study, Kunert had participants listen to music while reading sentences, and afterwards asked them to judge whether the music had a well-formed harmonic ending. These harmonic judgments were altered by structurally challenging sentences. It is as if this kind of sentence taxed shared brain systems to the point that music processing suffered.

Brain studies showed that the shared music-language system resides in a prefrontal brain region behind the left temple. There is, moreover, a brain region behind the left ear that is involved in language comprehension but not in music comprehension. In summary, language and music are not fully, but nonetheless partially, processed by the same brain regions.

Richard Kunert (1985, Halberstadt, Germany) studied Psychology in Glasgow (UK) and Brain & Cognitive Science in Amsterdam before coming to Nijmegen to complete a PhD in Psycholinguistics. After his PhD he toured the Netherlands with the Ludwig company to share his research with the general public.

What is the neurobiological infrastructure for the uniquely human capacity for language? The focus of the Neurobiology of Language Department is on the study of language production, language comprehension, and language acquisition from a cognitive neuroscience perspective.