Cognitive scientists develop new take on old problem: why human language has so many words with multiple meanings

January 19, 2012
by Emily Finn

Graphic: Christine Daniloff

Why did language evolve? While the answer might seem obvious -- as a way for individuals to exchange information -- linguists and other students of communication have debated this question for years. Many prominent linguists, including MIT's Noam Chomsky, have argued that language is, in fact, poorly designed for communication. Such a use, they say, is merely a byproduct of a system that probably evolved for other reasons -- perhaps for structuring our own private thoughts.

As evidence, these linguists point to the existence of ambiguity: In a system optimized for conveying information between a speaker and a listener, they argue, each word would have just one meaning, eliminating any chance of confusion or misunderstanding. Now, a group of MIT cognitive scientists has turned this idea on its head. In a new theory, they claim that ambiguity actually makes language more efficient, by allowing for the reuse of short, efficient sounds that listeners can easily disambiguate with the help of context.

"Various people have said that ambiguity is a problem for communication," says Ted Gibson, an MIT professor of cognitive science and senior author of a paper describing the research to appear in the journal Cognition. "But once we understand that context disambiguates, then ambiguity is not a problem -- it's something you can take advantage of, because you can reuse easy [words] in different contexts over and over again."

Lead author of the paper is Steven Piantadosi PhD '11; Harry Tily, a postdoc in the Department of Brain and Cognitive Sciences, is another co-author.

What do you mean?

For a somewhat ironic example of ambiguity, consider the word "mean." It can mean, of course, to indicate or signify, but it can also refer to an intention or purpose ("I meant to go to the store"); something offensive or nasty; or the mathematical average of a set of numbers. Adding an "s" introduces even more potential definitions: an instrument or method ("a means to an end"), or financial resources ("to live within one's means").

But virtually no speaker of English gets confused when he or she hears the word "mean." That's because the different senses of the word occur in such different contexts as to allow listeners to infer its meaning nearly automatically.

Given the disambiguating power of context, the researchers hypothesized that languages might harness ambiguity to reuse words -- most likely, the easiest words for language processing systems. Building on observation and previous studies, they posited that words with fewer syllables, high frequency and the simplest pronunciations should have the most meanings.

To test this prediction, Piantadosi, Tily and Gibson carried out corpus studies of English, Dutch and German. (In linguistics, a corpus is a large body of samples of language as it is used naturally, which can be used to search for word frequencies or patterns.) By comparing certain properties of words to their numbers of meanings, the researchers confirmed their suspicion that shorter, more frequent words, as well as those that conform to the language's typical sound patterns, are most likely to be ambiguous -- trends that were statistically significant in all three languages.
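The shape of that comparison can be sketched in a few lines of code. The lexicon below is a toy illustration, not the authors' corpus data -- the frequencies and sense counts are invented for the example -- but it shows the basic test: correlate each word's length and frequency against its number of dictionary senses, and check the predicted signs.

```python
# Toy sketch of the corpus-study logic: record each word's length,
# corpus frequency and number of senses, then test whether shorter
# and more frequent words tend to carry more meanings.
# All numbers below are illustrative, not real corpus counts.

lexicon = [
    # (word, frequency per million words, number of senses)
    ("run",           300, 10),
    ("set",           250, 12),
    ("mean",          200,  6),
    ("light",         180,  7),
    ("table",         120,  4),
    ("window",         80,  3),
    ("syllable",       10,  2),
    ("phenomenology",   1,  1),
]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, stdlib only."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

lengths = [len(word) for word, _, _ in lexicon]
freqs   = [freq for _, freq, _ in lexicon]
senses  = [n_senses for _, _, n_senses in lexicon]

# Prediction: word length correlates negatively with sense count,
# and frequency correlates positively.
print(pearson(lengths, senses))  # negative in this toy data
print(pearson(freqs, senses))    # positive in this toy data
```

The actual study additionally controlled for the interdependence of length, frequency and phonotactic typicality, and worked over full dictionaries in three languages; this sketch only captures the direction of the predicted trends.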

To understand why ambiguity makes a language more efficient rather than less so, think about the competing desires of the speaker and the listener. The speaker is interested in conveying as much as possible with the fewest possible words, while the listener is aiming to get a complete and specific understanding of what the speaker is trying to say. But as the researchers write, it is cognitively cheaper to have the listener infer certain things from the context than to have the speaker spend time on longer and more complicated utterances. The result is a system that skews toward ambiguity, reusing the easiest words. "Once context is considered, it's clear that ambiguity is actually something you would want in the communication system," Piantadosi says.

Tom Wasow, a professor of linguistics and philosophy at Stanford University, calls the paper "important and insightful."

"You would expect that since languages are constantly changing, they would evolve to get rid of ambiguity," Wasow says. "But if you look at natural languages, they are massively ambiguous: Words have multiple meanings, there are multiple ways to parse strings of words. This paper presents a really rigorous argument as to why that kind of ambiguity is actually functional for communicative purposes, rather than dysfunctional."

Implications for computer science

The researchers say the statistical nature of their paper reflects a trend in the field of linguistics, which is coming to rely more heavily on information theory and quantitative methods.

"The influence of computer science in linguistics right now is very high," Gibson says, adding that natural language processing (NLP) is a major goal of those operating at the intersection of the two fields.

Piantadosi points out that ambiguity in natural language poses immense challenges for NLP developers. "Ambiguity is only good for us [as humans] because we have these really sophisticated cognitive mechanisms for disambiguating," he says. "It's really difficult to work out the details of what those are, or even some sort of approximation that you could get a computer to use."

But, as Gibson says, computer scientists have long been aware of this problem. The new study provides a better theoretical and evolutionary explanation of why ambiguity exists, but the same message holds: "Basically, if you have any human language in your input or output, you are stuck with needing context to disambiguate," he says.


8 comments

The bug with this theory is that they looked only at languages with low homophony (English, Dutch and German) and not at languages such as Chinese, in which homophony is very dense (a single pronunciation can carry six different meanings!). It is not clear their findings apply to such languages. Second, if efficient use of pronunciations is so important, why are so many phonetically acceptable "nonword" pronunciations not used?

I do not agree. In Chinese, word order is much more important than in English or German. You could argue that context -- or word order -- has an even larger effect on changing the meaning of a word.

So, even if there is a smaller set of syllables in Chinese than in English, the importance of context in differentiating meanings among words with the same sounds is even greater. That is, the argument that sound ambiguity allows the reuse of efficient sound combinations is even more valid.

Also, if you were to repeat the study with Chinese and look not at single words but at groups of two or three words, I am sure you would find more ambiguity in pairs than in triplets. Shorter combinations rely more on context to get the meaning across.

Another argument: it would surprise me greatly if inefficient tone combinations were used as frequently as efficient ones. A third tone (falling-then-rising) followed by another third tone is -- to my knowledge -- not present at all.

I think the second third tone modifies the first -- I have forgotten exactly how -- since the pronunciation would otherwise be very slow: you would have to perform an additional rising-falling contour instead of simplifying the first third tone to a single rising tone (correct me if I remember incorrectly).

Ambiguity in syntagmatic constructions vitally enriches intended meaning and, paradoxically, contextualises in very relevant ways. Words are illuminated "peaks of meaning" in a linguistic landscape where they stretch down into a shadowed substrate that interconnects them all... The encoding of quantized intentionality in acoustic patterns is as old as human consciousness - it contains primary dynamics which have much in common with music, the latter also a language which accurately reflects an underlying reality.

How long are we going to hear the old funny story that the function of language is communication, and communication is the exchange of information? Can one call oneself a cognitive scientist and seriously believe that this is the case? Or should we refer to such people as "voodoo scientists"? The function of language is the coordination of consensual coordinations of behavior; it has nothing to do with so-called information transfer (after all, information is not a thing). I respect Chomsky as a very influential thinker, but errare humanum est, and Chomsky is only a human being.