These days we expect our sciences to have a practical side. We understand how things work and make use of the knowledge.

Science began as common sense put into theoretical shape by Aristotle. Since then, pretty much every advanced science has begun by showing what common sense missed and Aristotle got wrong. Common sense, for example, says the sun revolves around the earth, and Aristotle developed a theory of physics that took such common-sense observations for granted. Aristotle’s physics, however, was purely theoretical, with no practical benefit.

Copernicus, Galileo and Newton overturned that common sense and introduced a more modern physics. The proof of the new science was that it led to practical applications, first in mechanics and later in space travel.

At the time of Galileo, René Descartes was also introducing a new theory of physics, one that relied solely on logical hypotheses and deduction. Although widely admired at the time, this work has not held up. For one thing, it did not address the common sense of earlier ages; for another, it led to no practical or explanatory work.

Sixty years ago the study of language grew radical without addressing common sense or Aristotle. The common-sense proposition was that language is meaningful, and the Aristotelian theory was that language works by combining sounds with meaning. Reasonable as this definition sounds, nobody ever figured out how to use it, and the practical traditions of rhetoric and composition pay no attention to Aristotle.

The linguistics movement of the late 1950s also ignored Aristotle and common sense. It pursued questions based on the logical hypothesis that language is a computation. Interestingly, the movement was led by a young thinker whose great hero was Descartes, and like Descartes’s work, the movement’s work has led to no practical or explanatory success. It answers none of the traditional questions about language (Why are there so many languages, and how can they be so different? What is meaning? How could it have begun?) and offers no practical clues to using language more effectively, translating texts, improving speech therapy, or overcoming dyslexia.

The problem seems to lie in the assumption that sentences are computations. On its own, the idea has some plausibility: if the brain is a computer, its output must be a computation. In computations, however, the same input produces the same result. In language, the result is not so predictable. If I play in a soccer game and must report what just happened, I might say I kicked the ball, or I sent the ball flying, or The ball really jumped off my toe, or I missed the goal, or Joe was racing for the ball but I beat him to it, and on and on ad infinitum.

This observation brings us back to meaning. Our utterances depend on what we have to say and language seems to communicate meaning. Could Aristotle have been right after all?

No. The proposition that language combines sound with meaning cannot be correct. The problem is that meaning is not a physical thing that we can somehow combine with sound waves. It is a ghost that Aristotle inserted into language back when inserting ghosts was no vice. He also inserted yearning into his list of elements: fire yearned to be high in the sky and rose toward the sun; earth yearned to go to the center of the world, so earthen matter fell and even accelerated as it approached its goal.

Kicking out the ghosts of physics was not easy because the things that Aristotle explained still needed explaining. The solution lay in saying that the rising smoke and falling meteors are effects of gravity.

My work on this blog has likewise persuaded me that meaning is an effect, rather than a cause.

The simplest example might be two people standing together when one of them points toward something. The other looks over and sees a policeman beating a man. The gesture directed the other’s attention. The meaning of the gesture came when the second person redirected attention and saw something new.

Suppose instead that one person tells another, “I saw a cop beating up a guy today.” The meaning is discovered by the same general principle of directing attention; the difference is that instead of directing the listener’s eyes, the speaker directs the listener’s imagination. In both cases, the meaning is the result of the directed attention.

This reversal of meaning changes the task of the speaker or writer. Instead of focusing on inserting meanings, skillful language production lies in producing sentences that the audience can follow. How do we do that? By paying attention to the demands we place on the listeners’ attention.

The old man the boat. Oh, I’m sorry, did I lose you? It is not surprising. A reader first takes “The old man” as a noun phrase and needs a second look to grasp that “man” is a verb. This kind of sentence, known as a garden-path sentence, is well known in linguistics and is strong evidence that listeners construct meaning as they go along. If they go astray, they must retrace their route, looking for the point where they got lost.

The old suffer many indignities. I hope that sentence was easier to follow. Why? Because readers know to shift their attention from the old to suffer. This sentence helps the reader by making it easy to shift attention.
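The retracing that garden-path sentences force can be sketched in code. The following toy Python script is my own illustration, not a model from the linguistics literature; the mini-lexicon and category labels are invented for the demonstration. It reads words left to right, always trying each word’s most common category first, and counts how often it must back up and reanalyze before the sentence contains a verb:

```python
# Hypothetical mini-lexicon: each word lists its possible categories,
# most-frequent reading first. (Invented for this sketch.)
LEXICON = {
    "the": ["Det"],
    "old": ["Adj", "Noun"],    # "the old" can mean old people
    "man": ["Noun", "Verb"],   # "to man" = to crew
    "boat": ["Noun"],
    "suffer": ["Verb"],
    "many": ["Det"],
    "indignities": ["Noun"],
}

def parse(words):
    """Greedy left-to-right category assignment with backtracking.

    Returns (categories, backtracks), where backtracks counts how many
    times a less-preferred reading had to be tried before the sentence
    ended up containing a verb.
    """
    def attempt(i, cats, backtracks):
        if i == len(words):
            # A complete parse needs at least one verb.
            return (cats, backtracks) if "Verb" in cats else None
        for j, cat in enumerate(LEXICON[words[i]]):
            result = attempt(i + 1, cats + [cat],
                             backtracks + (1 if j > 0 else 0))
            if result:
                return result
        return None

    return attempt(0, [], 0)

cats, backtracks = parse("the old man the boat".split())
print(cats, backtracks)     # "man" must be reanalyzed as a verb

cats2, backtracks2 = parse("the old suffer many indignities".split())
print(cats2, backtracks2)   # first readings succeed; no reanalysis
```

The toy ignores real syntax, of course; it only tracks word categories. But it reproduces the contrast above: the garden-path sentence forces one reanalysis of “man,” while the easier sentence parses on the first pass.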

I have published a few papers online (here and here) demonstrating that syntax directs attention, and that oddities proposed to illustrate a universal grammar can be readily explained as devices for directing attention.

I have been a decent writer for many years, but I am a better one now because I understand how to help readers make their way through complex sentences. So there has been a practical benefit to my years of wrestling with how language works. At last, rhetoric may be given a clear, theoretical footing.

This blog takes the position that language, in the sense of two or more people focusing together on a topic, is quite old. Archaeologists, Chomskyites and others tend to place it much more recently in the human lineage, at about 100 thousand years ago or fewer. I put it at approaching 2 million years. My main grounds for thinking so are cooperativeness and the idea that it took a long time to create the verbal environment that we now take for granted.

Slow evolution

I noticed an article from a couple of weeks back about the “truly” bilingual child, and I came across this passage: “Pediatricians routinely advise parents to talk as much as possible to their young children, to read to them and sing to them. Part of the point is to increase their language exposure, a major concern even for children growing up with only one language.”

It is a familiar sentiment, but it sparked me to think about the days when language was really new. At first people probably did not have much to say to one another; talking was an occasional thing, and even today verbal richness is impaired if we are not surrounded by words. When language was new, our ancestors could talk, but they were still linguistically impoverished compared with today’s oral cultures. Their children did not grow up hearing a ceaseless yakety-yak and did not create a rich verbal environment themselves.

We can assume that language was first used to relate news of the here and now: there is a carcass we can scavenge yonder; I just saw a lion; your mother is down at the creek. News of this type is not going to produce chatterboxes. For that you need narratives, strings of two or more sentences: (1) there is a carcass we can scavenge yonder; (2) bring some cutting stones.

It seems unlikely that early talkers went straight to sentences. The pattern we see in children is probably a quick-time recapitulation of the developmental process: words, phrases, basic sentences, richer sentences, strings of sentences. The jump from words to phrases probably came quickly, as a few captive bonobos have managed to join words meaningfully in sign language. I once heard a toddler use a phrase on her first birthday. I was inclined to attribute it to the excitement of a birthday party, but she quickly made phrases a regular part of her speech. Sentences, however, were another matter.

When we imagine early talkers—say, Homo erectus and precursors—we ought to think of their language as being like their tools: simple but persistently part of their lives. And we should try to imagine it staying that simple for perhaps a million years while their brains grew large enough to handle the load.

Full, transitive sentences join two things with an action, e.g., the zebra kicked the lion. Children use a few verbs right away—eat cookie; want juice—but most verbs are late in arriving. Some extra maturation of the brain appears to be required for a person to unite two things through a single action. Simply perceiving what happened requires a feat of attention that may be beyond a two-year-old. Anybody who has watched an unfamiliar sport knows how difficult it is to perceive just what happens in complex, unexpected actions.

Transitive verbs allow for mythological and abstract thinking. Abstract ideas like not fair are probably very old, but the idea of making something fair—as in I will weigh my mischief in the balance with three days labor—requires a very difficult concept. The verb weigh…in the balance is a metaphor that somehow compares apples (my mischief) and oranges (three days labor). We take for granted blind justice holding up scales, but the original person who spoke of such things was a first-class poet.

By 100,000 years ago, sentences, narratives, abstractions and metaphors were probably all there for the chatterboxes to drone on about, and to leave the archaeological clues that indicate cultures steeped in symbolism. But symbols did not spring fully ripened from the first talkers’ tongues.

Cooperation

The other line of reasoning that brings me to the same conclusion is Homo's hyper-sociality. The African savanna promotes togetherness. The grass eaters form herds, and the predators hunt in groups. Loners like rhinoceroses and bull elephants need to be huge so that predators cannot harm them. With the savanna's emergence a few million years ago, the already social primates that stayed on the plain had to become even more dependent on one another. What emerged from the process was a terrifying new species able to stand up to the predators and bring down the herd animals. This success was possible only through regular cooperation and sharing.

Going back as far as Homo habilis, we know that individuals taught other individuals how to make tools. The same tools turn up in many sites, even thousands of miles apart, and persisted unchanged for hundreds of thousands of years. It seems likely that the teaching relied more on demonstration than on telling, although words may have played a part.

Cooperation is not the first solution Darwinian processes attempt, and most living organisms depend on themselves, but super-cooperative species like eusocial insects prosper because they share information. When cooperative sharing appears, evolution has found a trick that pays off. The Homo lineage has probably been pointing and demonstrating since the beginning, meaning we have been motivated to help one another for almost two million years. Work with apes has already established that our ancestors had the brains to use words. If we combine the presence of brains and motivation, it seems strange to insist that words did not come for the first 1.7 million years. Indeed, I doubt anybody who insists language must be new. If they want to persuade me, they should find some evidence that cooperation is new, or that even a properly motivated ape lacks the tools to tell me a story.