Months before their first words, babies' brains rehearse speech mechanics

July 14, 2014

A year-old baby sits in a brain scanner that uses magnetoencephalography, a noninvasive approach to measuring brain activity. The baby listens to speech sounds like "da" and "ta" played over headphones while researchers record her brain responses. Credit: Institute for Learning & Brain Sciences at the University of Washington

Infants can tell the difference between the sounds of all languages until about 8 months of age, when their brains start to focus only on the sounds they hear around them. It's been unclear how this transition occurs, but social interactions and caregivers' use of an exaggerated "parentese" style of speech seem to help.

University of Washington research in 7- and 11-month-old infants shows that speech sounds stimulate areas of the brain that coordinate and plan motor movements for speech.

The study, published July 14 in the Proceedings of the National Academy of Sciences, suggests that baby brains start laying down the groundwork of how to form words long before they actually begin to speak, and this may affect the developmental transition.

"Most babies babble by 7 months, but don't utter their first words until after their first birthdays," said lead author Patricia Kuhl, who is the co-director of the UW's Institute for Learning and Brain Sciences. "Finding activation in motor areas of the brain when infants are simply listening is significant, because it means the baby brain is engaged in trying to talk back right from the start and suggests that 7-month-olds' brains are already trying to figure out how to make the right movements that will produce words."

Kuhl and her research team believe this practice at motor planning contributes to the transition when infants become more sensitive to their native language.

The results emphasize the importance of talking to kids during social interactions even if they aren't talking back yet.

"Hearing us talk exercises the action areas of infants' brains, going beyond what we thought happens when we talk to them," Kuhl said. "Infants' brains are preparing them to act on the world by practicing how to speak before they actually say a word."

In the experiment, infants sat in a brain scanner that measures brain activation through a noninvasive technique called magnetoencephalography. Nicknamed MEG, the brain scanner resembles an egg-shaped vintage hair dryer and is completely safe for infants. The Institute for Learning and Brain Sciences was the first in the world to use such a tool to study babies while they engaged in a task.

Here's a video of one of the babies in the experiment:


The 57 babies, aged either 7 months or 11 to 12 months, each listened to a series of native and foreign language syllables such as "da" and "ta" as researchers recorded brain responses. The native language was English; the foreign language was Spanish.

The researchers observed brain activity in an auditory area of the brain called the superior temporal gyrus, as well as in Broca's area and the cerebellum, brain regions responsible for planning the motor movements required for producing speech.

This pattern of brain activation occurred for sounds in the 7-month-olds' native language (English) as well as in a non-native language (Spanish), showing that at this early age infants are responding to all speech sounds, whether or not they have heard the sounds before.

In the older infants, brain activation was different. By 11 to 12 months, infants' brains showed greater motor activation for non-native speech sounds than for native ones, which the researchers interpret as showing that it takes more effort for the baby brain to predict which movements create non-native speech. This reflects an effect of experience between 7 and 11 months, and suggests that activation in motor brain areas contributes to the transition in early speech perception.

The study has social implications, suggesting that the slow and exaggerated parentese speech – "Hiiiii! How are youuuuu?" – may actually prompt infants to try to synthesize utterances themselves and imitate what they heard, uttering something like "Ahhh bah bah baaah."

"Parentese is very exaggerated, and when infants hear it, their brains may find it easier to model the motor movements necessary to speak," Kuhl said.


2 comments

Infants can tell the difference between sounds of all languages until about 8 months of age when their brains start to focus only on the sounds they hear around them.

The difference between the sounds of all languages was not available as a source from which all infants "can tell the difference between" until about 8 months of age?

What was the focus of all infants until about 8 months of age? All infants can "tell the difference" until about 8 months of age. What was the source of sound giving all infants the ability, acquisition, or capability to "tell the difference" between the "sounds of all languages" in the absence of all the "sounds of all languages" until about 8 months of age?

Reading further..."...infants are responding to all speech sounds, whether or not they have heard the sounds before."

How is a response conditioned to a sound never heard before or for the first time?

Actually all these questions are answered during fetal development - before birth.

A. All infants' basilar membranes at birth are organized tonotopically the same. External auditory stimuli will eventually strengthen the signal pathways to the auditory area of the brain most often used to process the sounds from the surroundings that the infants are exposed to most often after birth.

A pathway's strength (myelination) will eventually determine the sensitivity and selectivity to sounds heard more often externally.