Born to chat: Humans may have innate language instinct

By Bob Holmes

Two blue shoes, or shoes blue two?

(Image: Abrams/lacagnina/Getty)

People instinctively organise a new language according to a logical hierarchy, not simply by learning which words go together, as computer translation programs do. The finding may add further support to the notion that humans possess a “universal grammar”, or innate capacity for language.

The existence of a universal grammar has been in hot dispute among linguists ever since Noam Chomsky first proposed the idea half a century ago. If the theory is correct, this innate structure should leave some trace in the way people learn languages.

To test the idea, Jennifer Culbertson, a linguist at George Mason University in Fairfax, Virginia, and her colleague David Adger of Queen Mary University of London, constructed an artificial “nanolanguage”.

They presented English-speaking volunteers with two-word phrases, such as “shoes blue” and “shoes two”, which were supposed to belong to a new language somewhat like English. They then asked the volunteers to choose whether “shoes two blue” or “shoes blue two” would be the correct three-word phrase.

Semantic hierarchy

In making this choice, the volunteers – who hadn’t been exposed to any three-word phrases – would reveal their innate bias in language-learning. Would they rely on familiarity (“two” usually precedes “blue” in English), or would they follow a semantic hierarchy and put “blue” next to “shoes” (because it modifies the noun more tightly than “two”, which merely counts how many)?

People chose to group the words by semantic hierarchy about three-quarters of the time. They were even more likely to choose phrases like “shoes blue these” over “shoes these blue”, in which the word “these” is even less tightly bound to the noun than the numeral. This suggests that the volunteers were consulting an internal hierarchy, not merely learning to invert the word order, says Culbertson.

The finding suggests that our brains learn language in a more complex way than simply working out which words are likely to go together in sequence, says Jeffrey Lidz, a linguist at the University of Maryland at College Park. This should add fuel to the debate over universal grammar. “For people who don’t believe in the Chomskyan idea, this will be a challenge,” he says.

Not everyone agrees. Our minds tend to group similar objects together across many different domains, says Adele Goldberg, a linguist at Princeton University. In a grocery store, for example, apples are more likely to be found next to the oranges than next to the beer. A tendency to place adjectives close to nouns may reflect this general cognitive tendency, not any property specific to language, she says.

Nonsense syllables

A second study, also released this week, hints at a second apparently innate facet of language. David Gomez, a neuroscientist at the University of Chile in Santiago, and his colleagues measured blood flow in the brains of 24 newborn infants as they listened to recordings of spoken nonsense syllables. The syllables differed in a linguistic property called “sonority”, which determines which sequences of consonants flow most naturally into one another.

Blood-flow changes revealed that the infants could tell the difference between syllables with well-formed sonority, such as “blif”, and poorly formed syllables, such as “lbif”, Gomez found. Since the infants had heard little speech in their brief lives, and certainly had never tried to pronounce the syllables themselves, this suggests an innate sensitivity to sonority, says Gomez.

In response to an enquiry from New Scientist, Noam Chomsky said the papers add little evidence to what is already obvious. It’s like adding a toothpick to a mountain, he said.