Babel's Dawntag:typepad.com,2003:weblog-3679412018-10-14T12:21:03-04:00A blog about the origins of speech: from primate vocalizations to story tellingTypePadDeciphering a Metaphortag:typepad.com,2003:post-6a00d83452aeca69e2022ad3b871bd200b2018-10-14T12:21:03-04:002018-10-14T12:21:03-04:00I have been thinking about my last post and Hubert Haider’s argument that “Natural languages have the properties they have because they reflect the properties which our language-learning and language-using human brain capacities can cope with.” ( See An anthropic...Blair<div xmlns="http://www.w3.org/1999/xhtml"><p><span style="font-weight: 400;"> <a class="asset-img-link" href="http://www.babelsdawn.com/.a/6a00d83452aeca69e2022ad372b2d4200c-pi" style="display: inline;"><img alt="Viruses" border="0" class="asset asset-image at-xid-6a00d83452aeca69e2022ad372b2d4200c image-full img-responsive" src="http://www.babelsdawn.com/.a/6a00d83452aeca69e2022ad372b2d4200c-800wi" title="Viruses"></img></a><br></span></p>
<p><span style="font-weight: 400;">I have been thinking about my last </span><a href="http://www.babelsdawn.com/babels_dawn/2018/09/the-germ-of-language.html"><span style="font-weight: 400;">post</span></a><span style="font-weight: 400;"> and </span><a href="https://www.uni-salzburg.at/index.php?id=24896"><span style="font-weight: 400;">Hubert Haider’</span></a><span style="font-weight: 400;">s argument that “Natural languages have the properties they have because they reflect the properties which our language-learning and language-using human brain capacities can cope with.” (</span><span style="font-weight: 400;">See </span><a href="https://www.researchgate.net/publication/327601113_An_anthropic_principle_in_lieu_of_a_Universal_Grammar"><em><span style="font-weight: 400;">An anthropic principle in lieu of a “Universal Grammar”</span></em></a><span style="font-weight: 400;">.) This principle might seem self-evident, even circular, but it is a direct challenge to the concept of an innate, universal grammar. Indeed, Haider is attacking Chomsky’s innate (a.k.a. nativist) principle directly: “Nobody has ever been able to produce immediate and compelling evidence in favour of the strong nativist hypothesis.” Disagreeing with Chomsky is hardly news, and I would probably let the paper pass by if it weren’t for a second feature, the metaphor of grammar as a virus. (“On the level of cognitive structures, grammars are self-reproductive in the same way as a virus…”) Viruses need a host to multiply and so, goes the metaphor, do grammars.</span></p>
<p><span style="font-weight: 400;">Only, Haider insists he is not speaking metaphorically. “A Grammar is--even literally--a cognitive virus programme. It reproduces itself, but it needs a host that provides a replication environment, just as any virus does. Grammars ‘infect’ human brains as a result of language acquisition. The cognitive virus corresponding to the grammar of our mother tongue governs our language production behaviour.” But Haider is confused. Is he talking about Grammar (as he specifies at the start of the quotation) or about language production behaviour (as he says at the end)? He thinks grammar can be a literal virus, so he starts talking about it, but at the end of the paragraph he describes language productions (utterances) going in and coming out. (“Children acquire their grammar on the basis of being exposed to language productions and they put it to use. Afterwards, their productions become part of the input for the next generation’s acquisition of grammar, and so on.”)</span></p>
<p><span style="font-weight: 400;">Haider’s confusion over whether he is talking about language or grammar comes from his desire to be a reformer rather than a revolutionary. He hopes to replace the notion of an innate universal grammar with a learned mother tongue while leaving the rest of Chomsky in place. Sorry, professor. If utterances go in and utterances come out, it is utterances that are doing the evolving and, if that is so, Chomsky, who focuses exclusively on the non-utterances of internal language, collapses.</span></p>
<p><span style="font-weight: 400;">Is it so? Can we really talk about language as a virus? Yes, but only as a metaphor. Nonetheless, metaphors are extremely useful. You just have to be careful about where and when they apply and you can never argue ‘</span><em><span style="font-weight: 400;">since X is part of the metaphor, X is part of language</span></em><span style="font-weight: 400;">.<em> Since Y is not part of the metaphor, Y cannot be part of language.</em>’</span></p>
<p><span style="font-weight: 400;">One clear break in the metaphor is the process of replication. Viruses use a cell but replicate themselves. Utterances do not literally replicate themselves, yet they do change over time.</span></p>
<p><span style="font-weight: 400;">Viruses evolve via natural selection. Do utterances? Happily, the metaphor holds in this instance. We can say the brain that perceives the utterance is the equivalent of the cell that supports the virus. The brain’s cognitive functions include memory and perception. To be a candidate for selection, the brain must be able to perceive at least part of an utterance and remember it. Anybody who has tried to learn a new language has noticed how hard it is to perceive the details of the sounds rushing by the ear. “They speak so fast,” is a common complaint about any population of native speakers whose words overwhelm the newcomer. Perceiving an utterance is not a simple task. So we have at least two features that can play a role in selection. Can a speaker perceive an utterance and remember it?</span></p>
<p><span style="font-weight: 400;">There is also a feature of language that has no counterpart in a virus (or any other biological product for that matter): the topic. A conversation requires two or more people to pay attention to a shared topic. Without the topic and joint attention, language is impossible. This dramatic difference between language and evolving DNA strands explains many differences between a language and a virus. Viruses vary but only slightly, while language output varies greatly but not completely. We can produce never-before-uttered sentences, but the words are familiar. We can produce never-before-uttered words, but their syllables are familiar. We can produce never-before-uttered syllables, but their context had better be danged familiar. Utterances bear a family resemblance, suggesting they are part of a system, and utterances can be quite unusual, suggesting a creative or divine source for their content.</span></p>
<p><span style="font-weight: 400;">Haider offers no hint as to how novel syntax structures (e.g., phrases, clauses, sentences) emerge from utterances. I have been working for some time on what I call attention-based syntax (see </span><a href="http://www.mind-consciousness-language.com/articles%20bolles2.htm"><span style="font-weight: 400;">paper 1</span></a><span style="font-weight: 400;"> and </span><a href="http://www.mind-consciousness-language.com/articles%20bolles3.htm"><span style="font-weight: 400;">paper 2</span></a><span style="font-weight: 400;">) and can therefore propose a way to use attention that has nothing to do with Haider’s reliance on Chomskyan computations. Basically, a speaker uses language to direct attention from point to point, forming a gestalt (or whole). The task of the listener is to follow the utterance from point to point, and the task of the speaker is to shape the utterance so that it leads from point to point. </span></p>
<p><span style="font-weight: 400;">To sum up: using a language as a listener requires the ability to perceive an utterance in detail, remember what is being said, and follow the topic from point to point. Meanwhile, speaking requires the ability to form an utterance in detail while directing the topic from point to point. Selection of utterances thus depends on how much of an utterance a listener can perceive and a speaker can reproduce; how much of an utterance a listener can recognize and a speaker can recall; how well a listener can follow an utterance and how skilled a speaker is at directing a listener’s attention from point to point in a topic; and whether a listener cares enough to pay attention.</span></p>
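<p>The selection criteria above can be put into a toy simulation (my own illustration, not anything from Haider’s paper; the variant names and probabilities are invented): each utterance variant has some chance of being perceived, remembered, and followed, and only utterances that clear all three hurdles get passed on.</p>

```python
import random

random.seed(0)

# Invented utterance variants: each carries three chances of surviving
# transmission -- being perceived, remembered, and followed by a listener.
variants = {
    "clear_short":   {"perceive": 0.9, "remember": 0.9, "follow": 0.9},
    "clear_long":    {"perceive": 0.9, "remember": 0.5, "follow": 0.7},
    "mumbled_short": {"perceive": 0.4, "remember": 0.9, "follow": 0.8},
    "mumbled_long":  {"perceive": 0.4, "remember": 0.5, "follow": 0.5},
}

def transmitted(v, trials=10_000):
    """Fraction of trials in which a listener perceives, remembers,
    AND follows the utterance -- its chance of being passed on."""
    ok = 0
    for _ in range(trials):
        if (random.random() < v["perceive"]
                and random.random() < v["remember"]
                and random.random() < v["follow"]):
            ok += 1
    return ok / trials

rates = {name: transmitted(v) for name, v in variants.items()}
best = max(rates, key=rates.get)  # the variant selection favors
```

Because survival requires clearing every hurdle, small differences in perceivability compound quickly: the clear, short variant dominates even though the mumbled variants are not much worse on any single criterion.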
<p><span style="font-weight: 400;">Is that all there is to language? It sounds too simple to credit. After all, the brain grew enormously over the past 2 million years. Was none of that an adaptation to language? I have thought so, but I may have been wrong. </span></p>
<p><span style="font-weight: 400;">Right now there are three competing theories: (1) language is entirely the product of cognitive operations (most famously supported by Chomsky, but there are many other versions of the idea); (2) language is the product of co-evolutionary adaptations by both the brain and language (most strikingly proposed by Terrence Deacon who has inspired many variations on his theme); and (3) language is entirely learned (most notoriously argued by B.F. Skinner who was beaten so badly that this very old idea has been recalled to life only by adopting radically different premises about how learning can proceed). </span></p>
<p><span style="font-weight: 400;">This blog has been pretty solidly on the side of theory #2, but now I want to consider #3 a bit. The idea of language as a kind of artificial virus that has been selected and replicated by the brain appeals to me, largely because I can see that, if true, the fantastic freedom of language is readily explained. People can speak with strange accents and disobey many of the formal rules insisted upon in school and still be understood. How is that possible? Theory #1 sees the brain as a rule-based set of behaviors, and is hard put to explain why we can follow a great deal of rule-violating speech. Computers certainly cannot stray from the rules. Theory #2 is a bit less fixed, but still has trouble with speech’s rhetorical freedom. Theory #3, however, tosses aside rules (although there are habits), so the freedom is far less of a problem.</span></p>
<p><span style="font-weight: 400;">Thus, even with #3’s dubious simplicity, I think I will give it a closer look. I may find some undeniable roadblock, but it seems worth investigating. Next post, let’s take a look at language acquisition and the poverty of the stimulus. That was the issue that ruined Skinner and gave Chomsky staying power. So I might as well face that one head on.</span></p></div><div class="feedflare">
</div>The Germ of Languagetag:typepad.com,2003:post-6a00d83452aeca69e2022ad36d44d3200c2018-09-23T09:11:39-04:002018-09-23T09:11:39-04:00Louis Pasteur (1822-1895) Let’s begin with a ridiculous question: there are many possible grammars, most of which are too complicated for humans to speak, and yet all the thousands of grammars used by humans just happen to be grammars that...Blair<div xmlns="http://www.w3.org/1999/xhtml"><p><a class="asset-img-link" href="http://www.babelsdawn.com/.a/6a00d83452aeca69e2022ad3936746200d-pi" style="display: inline;"><img alt="Pasteur" border="0" class="asset asset-image at-xid-6a00d83452aeca69e2022ad3936746200d image-full img-responsive" src="http://www.babelsdawn.com/.a/6a00d83452aeca69e2022ad3936746200d-800wi" title="Pasteur"></img></a></p>
<p><span style="font-size: 10pt;"><strong>Louis Pasteur</strong> (1822-1895)</span></p>
<p>Let’s begin with a ridiculous question: there are many possible grammars, most of which are too complicated for humans to speak, and yet all the thousands of grammars used by humans just happen to be grammars that we can learn to speak, and learn quite readily. How did that improbable truth come about?</p>
<p>Presumably everybody can see immediately that only learnable grammars will be used and passed down through the generations, so of course we all use learnable grammars. There is, however, a hidden assumption in this explanation. Grammar itself must have been selected to fit the human cognitive environment. It could not be innate. If it were somehow imposed on humans from the outside, the odds that the grammar built into our brain would be usable would indeed be so small as to be miraculous.</p>
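<p>This filtering argument can be made concrete with a minimal iterated-learning sketch (my own illustration; the capacity limit and complexity scores are invented stand-ins): grammars too complex for learners simply fail to be acquired, so after a few generations only learnable grammars remain in circulation.</p>

```python
import random

random.seed(1)

CAPACITY = 5  # invented cognitive limit on grammar complexity

def next_generation(grammars, size=30):
    """Learners acquire only grammars they can cope with; the survivors
    are re-learned (replicated) with slight random variation."""
    survivors = [g for g in grammars if g <= CAPACITY]
    if not survivors:
        return []
    return [max(1, random.choice(survivors) + random.choice([-1, 0, 1]))
            for _ in range(size)]

# Start with many grammars, most too complex for any learner.
grammars = [random.randint(1, 20) for _ in range(30)]
for _ in range(10):
    grammars = next_generation(grammars)

# Everything still in circulation fits the learners (give or take one
# generation of drift past the limit, which dies off the next round).
learnable = all(g <= CAPACITY + 1 for g in grammars)
```

No grammar was ever designed to be learnable; transmission itself does the selecting, which is exactly the point of the "ridiculous question" above.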
<p>I have begun this report so strangely in response to a paper posted on the internet by <a href="https://www.uni-salzburg.at/index.php?id=24896">Hubert Haider</a> and titled <em><a href="https://www.researchgate.net/publication/327601113_An_anthropic_principle_in_lieu_of_a_Universal_Grammar">An anthropic principle in lieu of a “Universal Grammar”</a></em>. The ‘anthropic principle’ of the title is a proposed answer to a puzzle: there are many conceivable universes in which the arbitrary values of the fundamental constants of physics differ, and in most of those universes people are impossible. How did the improbable come about, so that of all the possible universes we got one in which human life is possible?</p>
<p>At this point it is tempting to soar off on a tangent about the anthropic principle and the nature of the universe, but frankly I consider Dr. Haider’s introduction of the anthropic principle a red herring. In his notes, Haider says he had worked out his evolutionary argument and then a friend suggested it related to some outré ideas among physicists. I think on close reading, the ideas are fundamentally different and so I’m skipping the physics tangent.</p>
<p>Haider’s basic idea builds on Michael Arbib’s notion of the language-ready brain and Terrence Deacon’s notion of the co-evolution of language and brain, only Haider comes at them from a different angle. Arbib’s language-ready brain is ready to generate speech according to rules while Haider sees the brain as ready to host language as a kind of parasite that adapts ever more precisely to the brain’s operations. Deacon’s co-evolution involves a brain adapting to language and vice versa, whereas Haider focuses almost exclusively on the language adapting to the brain part. The difference between the older notions and the newer one is ultimately as stark as the difference between Louis Pasteur and the naturalist doctors who preceded him.</p>
<p>The naturalist sees language as a thing generated by the body and thus, as Chomsky has directly stated, a kind of organ that grows and becomes part of us. Like any other bodily organ, it can be studied by itself and found to possess all kinds of properties and structures. Meanwhile, the Pasteurite sees language as a kind of germ that enters the body and takes over some part of it. The naturalist’s evolutionary history of language studies the evolution of the organ. The Pasteurite’s evolutionary history explores how the germ changes in order to survive and grow.</p>
<p>One might object that Pasteur’s germs are real and have an existence of their own, while language has no separate existence, but is that so? Language is truly shared by contemporaries and through generations. Language can be preserved in writing and even recovered, as the Mayan, Egyptian and Linear B languages were recovered. This argument may seem more confounding than persuasive. Darn it; we surely know that language has no separate existence apart from the speakers, listeners, and readers who use it. Talk of germs merely muddies the water.</p>
<p>Perhaps we can gain a little ground by specifying what kind of germ language is. It is a virus, and like any virus it can only come to life when it enters a host. Once stirred to life, it goes about its own tasks and has an identity separate from its host. We are used to thinking of viruses as bad things, but that is a prejudice. I would not be surprised to learn that there are helpful viruses, and it seems certain that the future will bring doctors who can cure disease through the introduction of artificial viruses built for medical purposes.</p>
<p>So let’s say that language is an artificial virus introduced by people for their own communicative needs, but once introduced it took on a life of its own, and over many thousands of years it has evolved quite a bit.</p>
<p>One benefit of this approach is that it immediately frees the theorist from having to find selective advantages for humans who introduce a change in language. What is the selective advantage for people who add adverbs to speech? There seems no explanation more convincing than a just-so story. But flip it around. From the virus's point of view, an adverb may just be a mutation that adapts itself to the human cognitive system and survives. Just-so stories go out the window.</p>
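<p>A minimal sketch of that flipped perspective (my own illustration; the word categories are invented stand-ins): a new form spreads simply because listeners can cope with it, while a form brains cannot process drops out in retelling, and no selective advantage to the speakers is needed anywhere.</p>

```python
# What host brains can process; "adverb" is the new, learnable mutation,
# while "clicks_mid_word" stands in for a mutation brains cannot cope with.
COPES_WITH = {"noun", "verb", "adverb"}

def transmit(utterance):
    """A listener re-uses only the parts their brain can process."""
    return [w for w in utterance if w in COPES_WITH]

utterance = ["noun", "verb"]
utterance += ["adverb", "clicks_mid_word"]  # two mutations arise together

for _ in range(5):  # five rounds of retelling
    utterance = transmit(utterance)

# The learnable mutation persists; the unlearnable one vanishes:
# utterance is now ["noun", "verb", "adverb"]
```

Nothing in the loop asks whether the adverb helps the speakers; it survives only because it fits the host, which is the whole of the virus-eye explanation.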
<p>So the virus concept could actually be helpful in thinking about language origins and development. In future posts I will explore this idea to see how far it can take us before collapsing under the weight of its own metaphor.</p></div><div class="feedflare">
</div>Twixt Brain and Tonguetag:typepad.com,2003:post-6a00d83452aeca69e2022ad3a73384200b2018-08-09T07:11:03-04:002018-08-09T08:13:24-04:00I have been pretty busy of late, but I have an urge to say something if only to prove I’m still alive so I thought I’d summarize what I know about the brain’s evolution and language. The main thing we...Blair<div xmlns="http://www.w3.org/1999/xhtml"><p>I have been pretty busy of late, but I have an urge to say something if only to prove I’m still alive so I thought I’d summarize what I know about the brain’s evolution and language. The main thing we know for an absolute certainty is that the brain suddenly got a lot bigger, about 3.3 times larger than the ape brains of Australopithecus of 2.5 million years ago.</p>
<p>Brains are expensive body parts. They put heavy metabolic demands on the system and they do not forgive malnutrition, especially in childhood. So the expansion of the brain size was only possible if food was reliable throughout the year. There are some possible trade-offs. We can sacrifice other metabolic demands. Chimpanzees and gorillas are much stronger than humans, strong enough to be able to tear humans apart. So we might have sacrificed some serious muscle strength in order to support our brains, but that only works if the ancestors found some substitute for muscle power. Stone tools were part of the kit of the earliest <em>Homo</em> and perhaps there were other wooden and vegetable tools that did not survive. Fire is another solution. Cooked food provides many more calories than raw stuff, but when did cooked food come into style? A quarter million years ago? Half a million? A million? A million and a half? There are advocates for each of these answers. I lean toward old dates, but who can say? It seems fairly clear that the brain was already growing when fire came along, but cooked food probably made further growth possible.</p>
<p>Another source of reliable food is cooperation and moral duties. In many societies, especially poor ones, people with food have a duty to share it with people (especially kin) who don’t have it. How far back do those customs go? I’m willing to say that sharing food probably goes back 2 million years to the earliest <em>Homo habilis</em>. Presumably, their manner of sharing was simple and without moral complexity, but sharing is a way to get everybody through the rough patches. One of the peculiarities of humans of every society is the shared feast. Mealtime is a social time. People eat together and pass food between themselves. Thus, the sick, the weak, the young, and the unlucky do not starve with anything like the frequency that starvation plagues animal societies.</p>
<p>Shared food is my own candidate for the earliest source of sufficient calories to support a bigger brain. Tools are the other candidate. Fire seems to imply sharing already existed (the idea of everybody making their own fire to cook their own hard won bit of meat is too nutso to tolerate). When fire came along, people were already first-class cooperators.</p>
<p>Meanwhile language may have been tagging along for the ride. I am a proponent of very old language, 2 million years old, but that first language was very simple, perhaps no more complex or rich than the speech of today’s eighteen-month-old. Language grew more complicated as the brain grew big enough to support more elaborate speech. That brings us to Deacon’s theory of the co-evolution of brain and language. It seems clear that the brain evolves more slowly than language does, so there was probably no co-evolution that favored a particular set of syntactical rules (sorry, Noam) but a more general co-evolution is quite possible. Clearer enunciation, increased vocabulary size, the ability to understand complete sentences as a unitary image, the use of abstractions and metaphors… all these things are part of every language but were probably not part of the bah-bah speech of M. and Mme. Habilis.</p>
<p>This raises another question. What did we do with all our new smarts? We did not just scale up the chimpanzee brain. The overwhelming growth was in the neocortical region with the cerebellum following not that far behind. One brain part that was unable to keep up with the increase was the corpus callosum, the wiring that links the two halves of the brain. Ours allows for proportionally less synchronization between the right and left portions of the brain. Thus, the mere fact of getting a much bigger brain meant that significant localization of functions, particularly of new functions, was likely. A function would evolve on one side of the brain and not the other.</p>
<p>That localization suggests a role for consciousness. If the left side of the brain organizes language syntactically and the right side organizes it rhythmically, perhaps it takes consciousness (or even talking out loud) to put it all together. Years ago, on this blog, I cited work on Nicaraguan sign language that found rhythm was essential to creating language. Perhaps here is a reason. It gets the two halves of the overstuffed brain working together.</p>
<p>That may be fanciful, but even if not, the main key to organizing language has been the rise of many circuits in the brain linking the many parts together electrically. You can wonder how valuable all this brain imagery really is, but it has definitely established that there is no region of the brain controlling language the way there is a visual region and an auditory one. Language construction connects all these different parts to create a whole. When we think in words, we think of ways to connect the previously unconnected. Listening to a great story gets the whole brain chugging along. (While listening to a great liar (as we do these days) puts different parts of the brain in conflict.) That connecting seems to account for much of what our brain has been up to for the past two million years. It has been running wires from a billion point As to another billion point Bs and we are still working out the consequences of the change.</p>
<p>In other animals, the neocortex is largely used to handle sensory input and perception. In humans, it might seem that we have not used our brains to improve our senses, but language lets us know what is going on in other heads. Talking as sharing sounds too much like a 12-step program’s concept to appeal to me but it seems to have much in its favor. There seems no question that the main ways people have gotten so much more powerful is that they learn from history and each other. Without our big brains we’d still be babbling, but without language our big brains would just be a few trillion more cells to feed.</p></div><div class="feedflare">
</div>Sprechen sie Neanderthal?tag:typepad.com,2003:post-6a00d83452aeca69e201bb09f46d53970d2018-02-18T18:16:01-05:002018-02-18T18:14:29-05:00Is language very, very old or just really old? By “just really old” I mean 80 to 100 thousand years. Our own line of Homo sapiens was until recently dated at about 200 thousand years and now seems to be...Blair<div xmlns="http://www.w3.org/1999/xhtml"><p> </p>
<p class="asset-video"><iframe allow="autoplay; encrypted-media" allowfullscreen="" frameborder="0" height="281" src="https://www.youtube.com/embed/DzuMDjsXL9M?feature=oembed" width="500"></iframe></p>
<p>Is language very, very old or just really old? By “just really old” I mean 80 to 100 thousand years. Our own line of <em>Homo sapiens</em> was until recently dated at about 200 thousand years and now seems to be 300 thousand years old. The thought that for two-thirds of its history even <em>Homo sapiens</em> was without language is startling, since so much of our species seems built for language. Our vocal system includes features like a windpipe that opens into the mouth, increasing the risk of food going down the airway to the lungs by mistake. We lost the air sacs in our chests (still present in other apes), a change that reduced our ability to yell impressively while improving our ability to speak clearly. A whole host of muscles and timing mechanisms permits us to articulate a wider range of sounds more precisely and fluently. So, if language is only 100 thousand years old, these features must have evolved for reasons other than language. No one has much of an idea what those reasons might be.</p>
<p>Or language might be very, very old, that is to say 1.5 to 1.8 million years old. That would make it anywhere from 15 to 22.5 times older than common estimates. This figure would provide the time necessary to evolve all the physical traits we enjoy that support language, but it raises the question of what took us so long to conquer the world. If the <em>Homo erectus</em> of 1 million years ago already had half a million years of speech behind it, why was the culture still so crude and the tool box so simple?</p>
<p>The objection to the very, very old hypothesis does not seem insurmountable. The original speech may well have been much simpler than modern languages and it could have taken us a long time to evolve the ability to speak in complete sentences, use metaphors, remember the references in long sentences, etc.</p>
<p>One attractive test for the two views of language’s age is the Neanderthal. Did those guys speak, or not? The very, very old hypothesis predicts they did. The just really old hypothesis predicts they did not. So which is it? The question was once thought unanswerable, due to the lack of witnesses; however, evidence has been growing to indicate the Neanderthals did speak. The latest review of the data comes in the journal <a href="http://www.mdpi.com/journal/behavsci"><em>Behavioral Sciences</em></a> in a paper by <a href="http://www.mpi.nl/people/dediu-dan">Dan Dediu</a> and <a href="http://www.mpi.nl/people/levinson-stephen">Stephen C. Levinson</a> titled “<a href="http://pubman.mpdl.mpg.de/pubman/item/escidoc:2521815:7/component/escidoc:2538918/Dediu_Levinson_2018.pdf">Neanderthal Language Revisited: not only us.</a>”</p>
<p>Of particular interest is the genetic evidence. In recent years we have studied the genetic makeup of as many as 20 Neanderthals, telling us things we thought beyond knowing. For one thing they have put an end to wondering whether there are any remaining descendants of Neanderthals. There are. Interbreeding occurred at least a few times. The authors don’t have much to say about genes and language beyond noting that “unfortunately, linking molecular genetics to language and speech is an extremely complex endeavor.” Nothing definitive here.</p>
<p>More promising is the evidence of how widespread the Neanderthals were, stretching from Siberia to Gibraltar. They adapted to their habitats: in the northern climates they were mostly carnivorous and were skilled leather workers, with stone and bone awls probably used to stitch leather pieces together to create warm clothing. Further south, the diets were more mixed. Fossils also indicate some dental work. They appear to have had the technology to apply handles to cutting edges a quarter of a million years ago and by 50,000 years ago could produce fire on demand. It used to be said that Neanderthals did not use symbolic decorations, but we now know they used shell and teeth necklaces and buried infants with things like antler horns that could have had no literal reason for being included with the corpse, and as much as 170 thousand years ago they made “circular constructions from broken stalagmites more than 300 meters deep in the Bruniquel cave [located in southern France], for which it is hard to imagine any reason other than ceremonial.”</p>
<p>The arguments in favor of Neanderthal speech are based on a preponderance of the evidence rather than established beyond a reasonable doubt. A preponderance of the evidence means that it is more likely to be true than false. The issue is not a coin toss, but one in which the odds of Neanderthal speech appear to be greater than 50%.</p>
<p>The main argument: “<strong>Language affords culture-carrying capacity</strong>,” that is to say you cannot have any but the simplest cultures if you don’t have language. The difference between simple cultures and advanced ones is that (1) advanced cultures include behaviors that serve no biological function. Chimpanzee cultures have tools that help them get food (a biological need) while all human cultures include ceremonies that serve no biological need (e.g., funerals, coming of age rituals, welcome-newborns ceremonies like baptisms and brises). The use of necklaces, burial rites and deep-cave ceremonies indicate that Neanderthals had advanced cultures that could only be sustained verbally; (2) advanced cultures allow for full adaptation to a new ecological niche. Chimpanzees, gorillas and orangutans are clever, but confined to a fairly narrow niche. They have not spread across the globe. Meanwhile, Neanderthal ability to adapt to radically different environments also argues for more advanced adaptive powers, most likely arising from verbal skills that allow a population to share insights and adapt as a group to new conditions; and (3) advanced cultures have advanced tools that cannot be learned just by watching. Neanderthals were attaching cutting blades to handles perhaps 250 thousand years ago. I don’t know how to do that today. Probably, one master worker had to show and tell apprentices how to do the work.</p>
<p>These findings do not settle the matter but they do indicate that Neanderthals are more likely to have had some language (not necessarily identical to our own) than not. Theories about language then are more likely to be correct if they allow for a long evolutionary history than a short or, in Chomsky’s case, an instantaneous origin.</p></div><div class="feedflare">
</div>A Blog for Internet Neutralitytag:typepad.com,2003:post-6a00d83452aeca69e201b7c9363e92970b2017-11-21T16:18:01-05:002017-11-21T16:18:01-05:00I try to keep my political opinions to myself on this blog, but I must speak up when the blog itself is under threat. It has been possible for me to maintain this blog (now in its 11th year) because...Blair<div xmlns="http://www.w3.org/1999/xhtml"><p>I try to keep my political opinions to myself on this blog, but I must speak up when the blog itself is under threat. It has been possible for me to maintain this blog (now in its 11th year) because the Internet infrastructure plays no favorites. Google, Facebook and Amazon are big but not so big that they block out access to all the tiny voices that the Internet makes possible.</p>
<p>Internet neutrality, simply put, forbids Internet Service Providers from favoring certain content providers. In effect, it keeps rich Internet sites from paying to have poorer sites slowed down or blocked entirely. In other words, a no-money site like this one can still find its audience. My audience is small but surprisingly loyal. Some people have been with me for years.</p>
<p>If you believe that small sites are a valuable part of the Internet, please take the time to get a little informed and strive to make your voice heard in opposition to the proposed rule changes. The New York Times predicts there will be a huge lobbying effort both for and against neutrality. The big money will be on the side of changing the Internet, but even in the age of Trump the little guy is not without hope of an even break. Let your voice be heard now, lest it be the last time your voice can be heard at all.</p>
</div>Chimpanzees’ Warning Calls -- How Close to Language?tag:typepad.com,2003:post-6a00d83452aeca69e201b7c934af13970b2017-11-16T19:49:23-05:002017-11-16T19:55:10-05:00The New York Times has a story in today's Science section about chimpanzees changing their warning call if they think other chimps already know about the danger: The significance of the finding, Dr. Crockford said, is that it challenges the...Blair<div xmlns="http://www.w3.org/1999/xhtml"><p>The New York Times has a <a href="https://nyti.ms/2hC7BYH">story</a> in today's Science section about chimpanzees changing their warning call if they think other chimps already know about the danger:</p>
<p style="padding-left: 30px;"><em>The significance of the finding, Dr. Crockford said, is that it challenges the view that only humans keep track of what others know and change their communication to match. “This experiment shows they are monitoring their audience,” she said of the chimps.</em></p>
<p>That part did not interest me much. Chimps are smart and know something of what their fellows think. This is the kind of finding that gets a reaction when the finder (and Times reporter) have no theory about what matters.</p>
<p>But I have a theory and something else in the story struck me as quite important:</p>
<p style="padding-left: 30px;"><em>...chimps that thought their fellows were unaware of the road hazard made more alert hoo calls. They also stayed longer to look back and forth from the snake to where they thought their companions were. That’s the way chimps try to show their friends where a danger is.</em></p>
<p>Why do I think that's a big deal? Because the chimpanzees are drawing attention to something.</p>
<p>It sounds like they are drawing attention to their own location rather than to the snake itself. It is not quite joint attention. The signaler focuses attention on another chimp, and the listener looks at the signaler rather than trying to make out the snake. But they have a topic (a snake) and wouldn't have to change much to have a true speech triangle. Keep your eye on chimp behavior during warning signals.</p>
</div><div class="feedflare">
</div>Speech's Side Effectstag:typepad.com,2003:post-6a00d83452aeca69e201b8d2bdb827970c2017-11-12T21:30:38-05:002017-11-12T21:30:39-05:00Language, at its core and as presented on this blog, is a tool for sharing joint attention in contemplation of a topic. By now it has other functions as well, but the definition I just offered is the sine qua...Blair<div xmlns="http://www.w3.org/1999/xhtml"><p><a class="asset-img-link" href="http://www.babelsdawn.com/.a/6a00d83452aeca69e201bb09d68431970d-pi" style="display: inline;"><img alt="Typing Chimp" border="0" class="asset asset-image at-xid-6a00d83452aeca69e201bb09d68431970d image-full img-responsive" src="http://www.babelsdawn.com/.a/6a00d83452aeca69e201bb09d68431970d-800wi" title="Typing Chimp"></img></a></p>
<p>Language, at its core and as presented on this blog, is a tool for sharing joint attention in contemplation of a topic. By now it has other functions as well, but the definition I just offered is the <em>sine qua non</em> of the phenomenon. When language appeared, it suddenly became possible to discuss or at least report matters of mutual interest. Most definitions ignore the business about joint attention and say something like language is a tool for communicating with symbols. But I have become persuaded that focusing on symbols misses language’s key feature, the harnessing of attention. Symbol-based theories of language origins look for the introduction of words, but a better question asks how the human lineage managed to bring attention under control.</p>
<p>Attention itself is very old and reflexive. Animals do not control it; it controls them. Any of the senses can be startled, and reflexively an animal directs attention to the surprise. Chimpanzees have figured out how to use that reflex. They have been observed slapping the ground and then, when a troop-mate turns its head, the slapper begs for food. Presumably, the apes of 6 million years ago did the same, but joint attention is something else. If a chimpanzee slapped the ground and then, upon catching another’s attention, pointed toward a third thing, perhaps a pineapple bush, we would have an example of harnessed attention producing joint attention. It turns out, however, that chimpanzees do not harness attention to point elsewhere. Their attention-claiming is very much a <em>look-at-me-dammit</em> kind of action. Joint attention is a double phenomenon. A person pays attention to something out there in the world, but is also aware of the other attender.</p>
<p>Joint attention is more complicated than simply paying attention to the same thing. Two strangers can pay attention to the same thing just by standing at the corner and watching for the green light. Joint attention allows one person to say to another, “Boy, it is a long time coming,” and the listener replies, “Will it ever change?” In this case, their common attention to the light signal is complicated by their mutual awareness of the other’s focus on the same thing. That’s joint attention: focus on one thing along with shared awareness of each other.</p>
<p>Joint attention might have begun with a sound and a pointer. <em>Ork</em> and point toward a rival band of hunters on the horizon; <em>ork</em> and point toward vultures circling and landing off toward the horizon. <em>Ork</em> may have been just an attention getter, but once attention was combined with pointing, language became inevitable, assuming our ancestors had world enough and time. The cooperative benefits were just too great for evolution to ignore. But what happened to make our ancestors willing to share attention?</p>
<p>If speech is a side effect of joint attention, speech has several astonishing side effects of its own. First, talkers live much more of a conscious life than non-verbal species. Attention requires awareness. An animal is startled by a sound or a movement or odor and focuses attention on it, becoming aware of sensations and perceptions. Awareness is a total mystery, but I see no reason to suppose that an elephant at attention is any less aware than a human. However, humans have become such chatterboxes, paying joint attention to one thing after another, that we live in our consciousness much more than any other animal type does. Sure, we have plenty of unconscious reflexes and associations shaping our behavior as well, but we can have conscious purposes too. Apes, especially orangutans, are clever and surely have conscious purposes at times, but human civilization is amazingly shaped by conscious purposes. Many people attribute these talents to language, but computers can use language (in a way) and yet process it purely on the symbolic level; joint attention has no role in computer processing. Meanwhile, people use language to direct their attention and have prolonged conscious experiences. It is the joint-attention part of language, not the symbolic part, that keeps us conscious, allowing us to have novel purposes, pleasures, and powers.</p>
<p>Conscious attention has another strange side effect. It moves us out of the here and now. All the world’s other animals live in the moment. Their senses alert them to their present condition. From time to time they focus attention on something, but that is to understand the present more clearly. Suppose for some random neurological reason a chimpanzee’s brain flashes a picture of its mother’s face. Maybe some smell or sound has called up an association. The chimpanzee may be surprised, but the moment passes and the chimpanzee is right back in the here and now. Now let’s suppose that an aged human is suddenly reminded of his mother. He has a name for the unexpected image (mother) and may use that term to start recalling other things about his mother. Suddenly thirty seconds have gone by in which the human was engaged with the past instead of the now. Is that good? Many would say no, but it is part of being human and has created a strange fact about human societies everywhere. They are engaged in a world very much of their own making. Every human community is full of symbols, laws, and beliefs that must be learned by its members. Is that good? Romantics say no, but it does not matter. We cannot escape living in a cultural world as well as the physical one. Today’s world is full of stories, religions, dramas, entertainments, concerts, and rituals that take us out of the immediate setting around us. We have harnessed attention and focused it on something other than the physical present.</p>
<p>Breaking with the present also allows us to harness our thoughts. Thinking in language means directing our attention from one thing to another without losing the thread. When I was 11 years old, for example, I lived in Paris and thought about how I had learned English from my parents while my schoolmates had learned French from theirs. It was a random observation, but I was able to imagine back to the Stone Age when cave men first came up with language. I then imagined the Neanderthals meeting to agree on what to call things. My head jerked as I realized such a gathering was impossible without language already existing. Attention kept me focused on a topic long enough to imagine a series of incidents and understand something new. That kind of ability to have and recognize unexpected ideas is probably not confined to the <em>Homo</em> line, but language certainly makes it a lot easier to stay conscious and imagine a series of related associations until, pop, we think of something unexpected. I am pretty sure that every so often a chimpanzee has a good idea, but it is likely harder for them to push their imaginations along without a reliable means of harnessing attention.</p>
<p>And then when the chimpanzee has a good idea, so what? Maybe the smart chimp benefits, but chimpanzeedom as a whole is none the wiser. Meanwhile, among the bipeds, another side effect of language is that we can have second-hand knowledge. By now, very little of what any of us knows is what we figured out for ourselves. The Royal Society was founded by scientists determined to take no man’s word for anything, but the scientific learning they promoted is probably the greatest, most hard-sought collection of second-hand knowing in history. That’s what makes science so powerful. People in many settings, with many varied points of curiosity, set out not just to learn things but to share their discoveries. At this point, it does not matter whether the average chimp is as smart as the average human. The great stockpile of intellectual capital made possible by sharing our knowledge of every topic long ago outpaced whatever advantage apes might once have had in brains and brawn.</p>
<p>All of these side effects of language—consciousness, life beyond the present moment, thinking, and shared knowledge—have transformed our existence far more than would have been possible if we just processed symbols while an irrelevant awareness looked on. Just as startling may be that these side effects seem to be free or mostly free from the chains of Darwinian logic that rule the rest of the biological world. Language-based communities are far more able to cooperate and prosper than are the non-verbal societies of gorillas, chimpanzees and bonobos. That Darwinian edge allowed the <em>Homo </em>line to spread far and wide, but the other side effects—consciousness, life in an imaginary world of culture and thought, and the amassing of second-hand knowledge—all seem to have just come along for the ride without Darwinian selection voting on whether it is good or not to have those features. Of course, in the end we may fry ourselves in an intolerably hot climate or blow ourselves to bits in a series of nuclear explosions. Then Darwinian logic will have the last laugh. In the meantime, however, it seems to be sitting on its hands.</p></div><div class="feedflare">
</div>Language's First Usetag:typepad.com,2003:post-6a00d83452aeca69e201b7c92e7654970b2017-10-28T16:23:47-04:002017-10-28T16:23:47-04:00Why do people talk? That is the central question of this blog: what was the purpose of the utterance, the first time somebody said something? I have been taking it for granted that the first intention was informative, as in...Blair<div xmlns="http://www.w3.org/1999/xhtml"><p><a class="asset-img-link" href="http://www.babelsdawn.com/.a/6a00d83452aeca69e201b8d2b8e755970c-pi" style="display: inline;"><img alt="Lincoln" border="0" class="asset asset-image at-xid-6a00d83452aeca69e201b8d2b8e755970c image-full img-responsive" src="http://www.babelsdawn.com/.a/6a00d83452aeca69e201b8d2b8e755970c-800wi" title="Lincoln"></img></a></p>
<p>Why do people talk? That is the central question of this blog: what was the purpose of the utterance, the first time somebody said something? I have been taking it for granted that the first intention was <span style="text-decoration: underline;"><strong>informative</strong></span>, as in <em>enemy</em> or <em>carcass thataway</em>. But other ambitions are possible. Maybe language began with a <span style="text-decoration: underline;"><strong>curse</strong></span> or a <span style="text-decoration: underline;"><strong>prayer</strong></span>. I seem to recall reading in Steven Pinker that cursing uses a different part of the brain, so perhaps we can toss that purpose aside. But was the first utterance a prayer?</p>
<p>That doesn’t look impossible. Imagine <em>Homo earlymus</em> on a vast, grassy plain surrounded by barking hyenas. It looks like a good time for a prayer. But prayers require at least the concept of a higher power, and such a concept seems unlikely to arise without there already being a language with which to <span style="text-decoration: underline;"><strong>work out</strong></span> the notion of some kind of power to pray to. It seems a secondary reason to speak, that is, a reason to be discovered by a person already endowed with speech.</p>
<p>Actually, it seems like a tertiary reason. You have language (for whatever purpose) and then you develop the ability to work or reason out such things as there must be a god of the hyenas, and then you start praying to said god to call off his earthly manifestations. But if prayer is too advanced a reason for using language, we cannot assume our ancestor trapped on the African savanna was forced into silence. He/She might have cried out with some sort of <span style="text-decoration: underline;"><strong>magical</strong></span> purpose – say <em>abracadabra</em> and the hyenas will leave. Yet even that seems a bit too advanced for the first use of language. Ancestors surrounded by yelping hyenas may have cried in despair or shrieked in horror, but these sorts of <span style="text-decoration: underline;"><strong>emotional </strong><strong>ejaculations</strong></span> are too primitive to be called language. It’s more of a joke than anything else to propose that the first linguistic utterance was <em>Oh no!</em></p>
<p>Magic, by the way, may have led to the whole range of <span style="text-decoration: underline;"><strong>speech acts</strong></span> in which people do accomplish effects by using words, as in marrying someone or promising to do something. I don’t think I can rule out on first principles that the first word was something like <em>Selah</em>, said to seal a new relationship.</p>
<p>Another use of language that requires pre-existing speech is <span style="text-decoration: underline;"><strong>signaling attention</strong></span>. One person may be telling a story (using language to <span style="text-decoration: underline;"><strong>amuse</strong></span>) while a listener periodically says <em>un hunh</em> or <em>wow</em> or <em>I see</em>. These interjections are socially important, but by definition require speech to have already existed before they were introduced into human communications.</p>
<p>Some people have suggested, tongue a bit in cheek, that language began as a method of <span style="text-decoration: underline;"><strong>deceiving</strong></span> others. Ogg said <em>carcass thataway</em>, when really it was <em>t’otherway</em>, so Ogg could have the whole feast to his greedy self. The argument against deception as the original purpose is that language would never have survived if it had been lies from the beginning. For it to become an essential part of our lives, it had to be useful enough that we kept it even as we recognized that speech meant we would be surrounded by liars. This same argument can be used to dismiss a variety of anti-social purposes behind speech. Donald Trump often uses language to <span style="text-decoration: underline;"><strong>confuse</strong></span> people and situations, but if the first speaker had been a prehistoric Trump, language would have died aborning.</p>
<p>Trump also uses language to <span style="text-decoration: underline;"><strong>splinter a group</strong></span>, as happened in his announcement of his candidacy, when he denounced Mexican immigrants, costing him the support of one group but winning the support of anti-immigrant voters. Might the first word have been the prehistoric equivalent of <em>wetback</em>? It would have ousted one group while <span style="text-decoration: underline;"><strong>increasing the solidarity</strong></span> of another.</p>
<p>Language does not have to divide if it is to solidify. Many politicians are able to increase solidarity without splintering. The finest example is probably Lincoln’s Gettysburg Address, which does a splendid job of giving friends of the Union cause some principles to rally around, yet the speech never attacks the Confederacy head-on. But Lincoln’s use of language was quite sophisticated and depended upon earlier language. It seems unlikely that the first words were so independently noble. A more ordinary way of increasing solidarity is through social customs such as saying <em>thank you</em> or <em>hello</em>. I can't rule out <em>Thanks </em>as the first word, though if it was, it took a second reason for people to realize how useful language could be.</p>
<p>Other possible first uses might be as a <span style="text-decoration: underline;"><strong>command</strong></span> such as <em>go</em>, with finger pointed toward the horizon, or as a <span style="text-decoration: underline;"><strong>request</strong></span>, this time with the finger pointed toward a tabletop while the pointer utters <em>salt</em>. These kinds of usages, however, remind me of the old bow-wow theories of language origins that got the inquiry into such ill repute to begin with. It’s not that these usages are impossible, although they are impossible to prove or disprove, but they offer no clue as to how language got from such an unpromising start to the wonder that it is today.</p>
<p>Tom Wolfe wrote a book a year or so ago in which he had the unusual suggestion that language began as a way of improving one’s memory, and there is no doubt that <span style="text-decoration: underline;"><strong>naming</strong></span> aids one’s memory. If you want to describe the route from New York City to Boston, it helps if you have names to remind you of the places in between. But that explanation is based on the out-of-date belief that language is naming. It is more than that. When Adam named the giraffe a giraffe, he still needed verbs to tell us something about the giraffe and prepositions to locate it. When trying to understand where language came from, it is best to recall what language does in the first place.</p>
<p>So where does that leave us? Promising first uses may have been to inform, or to perform a speech act, or to splinter a group. The other uses seem to depend on language already existing, or point to dead ends.</p>
<p>*****</p>
<p>Note: I have put in bold-underline the various uses I see for language: inform/deceive; amuse/confuse; pray/curse; increase solidarity/splinter a group; perform magic; perform a speech act; give names; signal attention; request; work or reason out a notion; emotional ejaculation; command. What have I left out?</p></div><div class="feedflare">
</div>Rejecting the Axioms of Oldetag:typepad.com,2003:post-6a00d83452aeca69e201bb09c7bcfd970d2017-09-27T18:19:46-04:002017-09-27T18:19:19-04:00When I began this blog, I assumed the big step in developing language was the creation of the first word. I took it for granted that this was accomplished by yoking a sound and a meaning together to give us...Blair<div xmlns="http://www.w3.org/1999/xhtml">
<p><a class="asset-img-link" href="http://www.babelsdawn.com/.a/6a00d83452aeca69e201bb09c7bce0970d-pi" style="display: inline;"><img alt="Speech Triangle" border="0" class="asset asset-image at-xid-6a00d83452aeca69e201bb09c7bce0970d image-full img-responsive" src="http://www.babelsdawn.com/.a/6a00d83452aeca69e201bb09c7bce0970d-800wi" title="Speech Triangle"></img></a><br>When I began this blog, I assumed the big step in developing language was the creation of the first word. I took it for granted that this was accomplished by yoking a sound and a meaning together to give us something like <em>chair</em>. I no longer believe either of those things.</p>
<p>Today I believe that the big step towards language came when our ancestors were willing to share their knowledge, and that language began when we started pointing things out to one another.</p>
<p>The change in my thinking resulted from a doodle I created early in the blog’s history: the speech triangle. Its corners mark a speaker and a listener who focus joint attention on the third corner, a topic. It might seem that we could eliminate the topic and treat it as merely something shared by speaker and listener, but the role of joint attention forces listener and speaker to focus on the topic rather than on each other. If you try to eliminate the topic and redirect attention to the speech itself, you get pointless remarks—e.g., <em>this sentence is six words long</em>—or paradoxes such as: <em>This sentence is false.</em> The way out of this jumble is to realize that language works by directing attention away from the fact of communication to some other topic out there in the universe or in imagination. The topic is a distinct part of the speech triangle.</p>
<p>Embracing the speech triangle puts an end to the search for relevance in communication and information theory. Claude Shannon’s information theory presents a pair, speaker and receiver, and proposes that the function of communications is for the speaker to control a receiver at a distance. There is no role for either meaning or topic in such a definition. The theory is enough to explain computer networks, heredity, and the hormonal, immune, and nervous systems, but it is not rich enough to tell us anything about language. Efforts to calculate the information content of a sentence mix apples and oranges.</p>
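To see why such calculations say nothing about topics or meaning, here is a minimal sketch (my own illustration, not anything from Shannon's paper or this blog) of a Shannon-style measure of a message's information content. Everything in it is symbol statistics; meaning never enters the computation.

```python
import math
from collections import Counter

def information_content(message: str) -> float:
    """Shannon information of a message, in bits, treating each character
    as a symbol drawn from the message's own frequency distribution.
    Note that the calculation consults only symbol counts, never meaning."""
    counts = Counter(message)
    total = len(message)
    return -sum(c * math.log2(c / total) for c in counts.values())

# Two claims with opposite meanings score almost identically,
# because only the statistics of the characters matter.
a = information_content("carcass thataway")
b = information_content("carcass t'otherway")
```

A message of one repeated symbol scores zero bits no matter what the symbol refers to, which is one concrete way of seeing the apples-and-oranges mismatch the paragraph above describes.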
<p>The speech triangle also implies that generative grammarians are on the wrong track. Traditional approaches to language impose no function on verbal interactions; hence, grammar is not asked to contribute to any task. The speech triangle, however, locks in a function. Speaker and listener are paying joint attention to a topic. Words must be organized in a way that directs attention from one point to another so that the shifts become meaningful. Generative grammar’s search for an underlying, common set of rules has been oblivious to the universal task of shifting attention.</p>
<p>Another benefit of the speech triangle doodle is that it gives us something to look for in other animals when we ponder whether they are using language. Take vervet monkeys. They make one warning cry if they see a snake and another cry if they see a leopard. Is that a precursor to language? Like symbols, the cries have arbitrary meanings, so it might seem a step toward language. On the other hand, it is nothing like a discussion of a topic. One vervet yells the equivalent of <em>leopard</em>. Other monkeys look around and, when they see the leopard, join in making the same warning cry. Soon the trees are filled with the chaotic racket of the jungle. Signals, yes. Speech triangle, no. Elephants, crows, parrots, dolphins… there may be another hypersocial species somewhere that pays joint attention to a topic. Or maybe not. But at least we have something concrete to test.</p>
<p>Meanwhile, I have been forced to notice that chimpanzees do not have a speech triangle. I had always thought of chimps as very social animals. They live in groups, know one another as individuals, engage in some cooperative activities, and (Jane Goodall discovered) keep up family bonds. The absence of a speech triangle draws attention, however, to something they lack. They do not share information. Back in the days when captive apes were taught sign language, they could tell humans of their needs and would sign something’s name when asked. But they did not volunteer non-manipulative information to humans and did not ask their fellow apes to do something like pass the salt. They do not even have visible whites in their eyes, making it harder to see where they are looking. It turns out that for all their sociability, chimpanzees are not given to sharing what they know. So there you have something even more fundamental to language than the words themselves—the urge to blab one’s secrets.</p>
<p>This approach also reduces the importance of several other matters. Symbols, for example, become secondary. Sure, words are symbols, but that is less important than their role in directing attention. </p>
<p>Again, this altered definition has radical implications. Much of the archaeology of language has focused on symbols, and many people argue that if there were no symbols there could be no language. There could be no Shakespeare, that’s for sure, but how about the ability to say, while pointing, “carcass yonder”? <em>Homo</em> groups could have been using words to direct attention to concrete things for a million and more years before they ever got around to inventing names for airy nothings.</p>
<p>I am getting on in years now, past the age at which many a whippersnapper says a person can embrace new ideas. So it is particularly refreshing to have found that I can still toss out long-held axioms and make use of unexpected ones. Join me in the fun.</p></div><div class="feedflare">
</div>How Does Language Work?tag:typepad.com,2003:post-6a00d83452aeca69e201bb09b988e6970d2017-08-16T12:14:55-04:002017-08-16T12:14:42-04:00These days we expect our sciences to have a practical side. We understand how things work and make use of the knowledge. Science began as common sense put into theoretical shape by Aristotle. Thus, pretty much every advanced science has...Blair<div xmlns="http://www.w3.org/1999/xhtml"><p><a class="asset-img-link" href="http://www.babelsdawn.com/.a/6a00d83452aeca69e201b7c9164c93970b-pi" style="display: inline;"><img alt="Trail pointer" border="0" class="asset asset-image at-xid-6a00d83452aeca69e201b7c9164c93970b image-full img-responsive" src="http://www.babelsdawn.com/.a/6a00d83452aeca69e201b7c9164c93970b-800wi" title="Trail pointer"></img></a></p>
<p>These days we expect our sciences to have a practical side. We understand how things work and make use of the knowledge.</p>
<p>Science began as common sense put into theoretical shape by Aristotle; thus, pretty much every advanced science has begun by showing what common sense missed and Aristotle got wrong. Common sense says the sun revolves around the earth, and Aristotle developed a theory of physics that took such common-sense observations for granted. Aristotle’s physics, however, was purely theoretical, without practical benefit.</p>
<p>Copernicus, Galileo and Newton overturned that common sense and introduced a more modern physics. The proof of the new science was that it led to practical applications, first in mechanics and later in space travel.</p>
<p>At the time of Galileo, René Descartes was also introducing a new theory of physics, one that relied solely on logical hypotheses and deduction. Although widely admired at the time, this work has not held up. For one thing, it did not address the common sense of earlier ages; for another, it led to no practical or explanatory results.</p>
<p>Sixty years ago the study of language grew radical without addressing common sense or Aristotle. The common-sense proposition was that language is meaningful, and the Aristotelean theory was that language works by combining sounds with meaning. Reasonable as this definition sounds, nobody ever figured out how to use it and the practical traditions of rhetoric and composition pay no attention to Aristotle.</p>
<p>The linguistics movement of the late 1950s also ignored Aristotle and common sense. It pursued questions based on the logical hypothesis that language is a computation. Interestingly, the movement was led by a young thinker whose great hero was Descartes, and like Descartes, the movement’s work has led to no practical or explanatory success. It answers none of the traditional questions about language—e.g., Why are there so many languages, and how can they be so different? What is meaning? How could it have begun?—and offers no practical clues to using language more effectively, or translating texts, or improving speech therapy, or overcoming dyslexia.</p>
<p>The problem seems to lie in the assumption that sentences are computations. On its own, the idea has some plausibility. If the brain is a computer, its output must be a computation. In computations, however, the same input produces the same result. In language, the result is not so predictable. If I participate in a soccer game and must report what just happened, I might say <em>I kicked the ball </em>or <em>I sent the ball flying</em> or <em>The ball really jumped off my toe</em> or <em>I missed the goal</em> or <em>Joe was racing for the ball but I beat him to it</em> or … and on and on <em>ad infinitum</em>.</p>
<p>This observation brings us back to meaning. Our utterances depend on what we have to say and language seems to communicate meaning. Could Aristotle have been right after all?</p>
<p>No. The proposition that language combines sound with meaning cannot be correct. The problem is that meaning is not a physical thing that we can somehow combine with sound waves. It is a ghost that Aristotle inserted into language back when inserting ghosts was no vice. He also inserted yearning into his list of elements: fire yearned to be high in the sky and rose toward the sun; earth yearned to go to the center of the world, so earthen matter fell and even accelerated as it approached its goal.</p>
<p>Kicking out the ghosts of physics was not easy because the things that Aristotle explained still needed explaining. The solution lay in saying that the rising smoke and falling meteors are effects of gravity.</p>
<p>My work on this blog has likewise persuaded me that meaning is an effect, rather than a cause.</p>
<p>The simplest example might be two people standing together when one of them points toward something. The other looks over and sees a policeman beating a man. The gesture directed the other’s attention. The meaning of the gesture came when the second person redirected attention and saw something new.</p>
<p>Suppose instead, one person tells another, “I saw a cop beating up a guy today.” The meaning is discovered by the same general principle of directing attention, the difference being that instead of directing a person’s eyes, the speaker directs the listener’s imagination. In both cases, the meaning is the result of the directed attention.</p>
<p>This reversal of meaning changes the task of the speaker or writer. Instead of focusing on inserting meanings, the key to skillful language production lies in producing sentences that the audience can follow. How do we do that? By paying attention to the demands we place on the listeners’ attention.</p>
<p>The old man the boat. Oh, I’m sorry, did I lose you? It is not surprising. A reader first takes “The old man” as a noun phrase and needs a second look to grasp that “man” is a verb. This kind of sentence, known as a garden-path sentence, is well known in linguistics and provides strong evidence that listeners construct meaning as they go along. If they go astray, they must retrace their route, looking for the point where they got lost.</p>
<p>The old suffer many indignities. I hope that sentence was easier to follow. Why was it so? Because readers know to shift their attention from <em>the old</em> to <em>suffer</em>. This sentence helps the reader by making it easy to shift attention.</p>
<p>I have published a few papers online (<a href="http://www.mind-consciousness-language.com/articles%20bolles2.htm">here</a> and <a href="http://www.mind-consciousness-language.com/articles%20bolles3.htm">here</a>) demonstrating that syntax directs attention, and that oddities proposed to illustrate a universal grammar can be readily explained as devices for directing attention.</p>
<p>I have been a decent writer for many years, but I am a better one now because I understand how to help readers make their way through complex sentences. So there has been a practical benefit to my years of wrestling with how language works. At last, rhetoric may be given a clear, theoretical footing.</p></div><div class="feedflare">
</div>How Old is Speech?tag:typepad.com,2003:post-6a00d83452aeca69e201b7c911e8ec970b2017-08-02T15:40:55-04:002017-08-02T15:40:55-04:00This blog takes the position that language, in the sense of two or more people focusing together on a topic, is quite old. Archaeologists, Chomskyites and others tend to put it as a more recent in the human lineage, about...Blair<div xmlns="http://www.w3.org/1999/xhtml"><p><a class="asset-img-link" href="http://www.babelsdawn.com/.a/6a00d83452aeca69e201b7c911e66d970b-pi" style="display: inline;"><img alt="Scales of Justice" border="0" class="asset asset-image at-xid-6a00d83452aeca69e201b7c911e66d970b image-full img-responsive" src="http://www.babelsdawn.com/.a/6a00d83452aeca69e201b7c911e66d970b-800wi" title="Scales of Justice"></img></a></p>
<p>This blog takes the position that language, in the sense of two or more people focusing together on a topic, is quite old. Archaeologists, Chomskyites and others tend to place it more recently in the human lineage, at about 100 thousand years ago or fewer. I put it at approaching 2 million years. My main grounds for thinking so are cooperativeness and the idea that it took a long time to create the verbal environment that we now take for granted.</p>
<p><span style="font-size: 14pt;"><strong>Slow evolution</strong></span></p>
<p>I noticed an <a href="https://nyti.ms/2tZNwQ5">article</a> from a couple of weeks back about the “truly” bilingual child, and I came across this passage, “Pediatricians routinely advise parents to talk as much as possible to their young children, to read to them and sing to them. Part of the point is to increase their language exposure, a major concern even for children growing up with only one language.”</p>
<p>It is a familiar sentiment, but it sparked me to think about the days when language was really new. At first people probably did not have too much to say to one another; talking was an occasional thing, and even today verbal richness is impaired if we are not surrounded by words. When language was new our ancestors could talk, but they were still linguistically impoverished when compared to today’s oral cultures. Their children did not grow up hearing a ceaseless yakety-yak and did not create a rich verbal environment themselves.</p>
<p>We can assume that language was first used to relate news of the here and now: <em>there is a carcass we can scavenge yonder</em>; <em>I just saw a lion</em>; <em>your mother is down at the creek</em>. News of this type is not going to produce chatterboxes. For that you need narratives, strings of two or more sentences: (1) <em>there is a carcass we can scavenge yonder</em>; (2) <em>bring some cutting stones</em>.</p>
<p>It seems unlikely that early talkers went straight to sentences. The pattern we see in children is probably a quick-time recapitulation of the developmental process—words, phrases, basic sentences; richer sentences; strings of sentences. The jump from words to phrases probably came quickly as a few captive bonobos have managed to join words meaningfully in sign language. I once heard a toddler use a phrase on her first birthday. I was inclined to attribute it to the excitement of a birthday party, but she quickly made phrases a regular part of her speech. Sentences, however, were another matter.</p>
<p>When we imagine early talkers—say, <em>Homo erectus</em> and precursors—we ought to think of their language as being like their tools: simple but a persistent part of their lives. And we should try to imagine it staying that simple for perhaps a million years while their brains grew large enough to handle the load.</p>
<p>Full, transitive sentences join two things with an action, e.g., <em>the zebra kicked the lion</em>. Children use a few verbs right away—<em>eat cookie</em>; <em>want juice</em>—but most verbs are late in arriving. Some extra maturation of the brain appears to be required for a person to unite two things through a single action. Simply perceiving what happened requires a feat of attention that may be beyond a two-year-old. Anybody who has watched an unfamiliar sport knows how difficult it is to perceive just what happens in complex, unexpected actions.</p>
<p>Transitive verbs allow for mythological and abstract thinking. Abstract ideas like <em>not fair</em> are probably very old, but the idea of making something fair—as in <em>I will weigh my mischief in the balance with three days labor</em>—requires a very difficult concept. The verb <em>weigh…in the balance</em> is a metaphor that somehow compares apples (my mischief) and oranges (three days labor). We take for granted blind justice holding up scales, but the original person who spoke of such things was a first-class poet.</p>
<p>By 100,000 years ago, sentences, narratives, abstractions and metaphors were probably all there for the chatterboxes to drone on about, and to leave the archaeological clues that indicate cultures steeped in symbolism. But symbols did not spring fully ripened from the first talkers’ tongues.</p>
<p><span style="font-size: 14pt;"><strong>Cooperation</strong></span></p>
<p>The other line of reasoning that brings me to the same conclusion is <em>Homo's</em> hyper-sociality. The African savanna promotes togetherness. The grass eaters form herds and the predators hunt in groups. Loners like rhinoceroses and bull elephants need to be huge so the predators cannot harm them. With the savanna's emergence a few million years ago, the already social primates that stayed on the plain had to become even more dependent on one another. What emerged from the process was a terrifying new species able to stand up to the predators and bring down the herd animals. The only way this success was possible was by regular cooperation and sharing.</p>
<p>Going back as far as <em>Homo habilis</em> we know that individuals taught other individuals how to make tools. The same tools turn up in many sites, even thousands of miles apart, and persisted unchanged for hundreds of thousands of years. It seems likely that the teaching relied more on demonstration than on telling, although words may have played a part.</p>
<p>Cooperation is not the first solution Darwinian processes attempt, and most living organisms depend on themselves, but super-cooperative species like the eusocial insects prosper because they share information. When cooperative sharing appears, evolution has found a trick that pays off. The <em>Homo</em> lineage has probably been pointing and demonstrating since the beginning, meaning we have been motivated to help one another for almost two million years. Work with apes has already established that our ancestors had the brains to use words. If we combine the presence of brains and motivation, it seems strange to insist that words did not come for the first 1.7 million years. Indeed, I doubt anybody who insists language must be new. If they want to persuade me, they should find some evidence that cooperation is new, or show that a properly motivated ape would still lack the tools to tell me a story.</p>
</div>Language Among the Topsy-Turvytag:typepad.com,2003:post-6a00d83452aeca69e201bb09afe139970d2017-07-17T19:23:46-04:002017-07-17T19:23:46-04:00In the last post I commented on the paper “Wild Voices” by Chris Knight and Jerome Lewis in Current Anthropology. The article focuses on the social changes that were required to make language possible. The changes should be generally familiar...Blair<div xmlns="http://www.w3.org/1999/xhtml"><p><a class="asset-img-link" href="http://www.babelsdawn.com/.a/6a00d83452aeca69e201b7c90ca943970b-pi" style="display: inline;"><img alt="Topsy Turvy Chimp" border="0" class="asset asset-image at-xid-6a00d83452aeca69e201b7c90ca943970b image-full img-responsive" src="http://www.babelsdawn.com/.a/6a00d83452aeca69e201b7c90ca943970b-800wi" title="Topsy Turvy Chimp"></img></a></p>
<p>In the last post I commented on the paper “<a href="http://www.journals.uchicago.edu/doi/pdfplus/10.1086/692905">Wild Voices</a>” by <a href="http://www.chrisknight.co.uk/">Chris Knight</a> and <a href="http://ucl.academia.edu/JeromeLewis">Jerome Lewis</a> in <em>Current Anthropology</em>. The article focuses on the social changes that were required to make language possible. The changes should be generally familiar to regulars on this blog.</p>
<p>The main one is the switch from a society based on dominance and submission to a community held together by trust and a willingness to cooperate.</p>
<p>These behavioral changes have been accompanied by several biological changes as well. One, mentioned before on this blog, is the switch from black to white eyes, which makes it easy to see where a person’s attention is focused. A couple of important reflexive changes have occurred as well. For example, apes respond to threats from others with a reflexive “fear grin” that indicates nervous submission. That reflex has been transformed into the human smile, which signals relaxed good humor in friendly company.</p>
<p>And laughter provides a weird combination of friendliness and aggression. An example of that not mentioned in the paper is late-night TV anti-Trump satire, which bonds the laughing audience while humiliating its target.</p>
<p>The authors speak of a “principle of reversal,” i.e., a series of steps that result in a reversal of the old ape standard to something new. The change of the grin to a smile turned a signal of submissive fear into one of confident trust.</p>
<p>Other reversals saw mothers who never let anyone else touch their infant become mothers who let many others help with the care and even delivery of infants.</p>
<p>Another reversal necessary for people using modern languages is the signaling of non-physical facts through ritual. A wedding ritual, for example, changes the way the entire community understands the relationship between the marrying people. In many contemporary societies this ritual includes vows to love one another, so that language is part of the ritual. And many groups include verbal prayers in their rituals, but more is claimed for the ritual than physical actions. Identities and spiritual natures are said to change.</p>
<p>Once introduced, these changes cannot be undone. A shift from black eyes to white eyes is one small shift, but as part of a series of changes that cannot be taken back, something novel and lasting appears.</p>
<p>A particularly important change was the new relationship between males and females. Studies of animal behavior typically find the males are dangerous and irresponsible. Male mammals fight for the right to spread their seed and then leave the females to raise any offspring. Particularly bad actors kill rival offspring and mate with the grieving mothers. Somehow humans have developed an enormous variety of cultures in which men help raise the children and keep the brawling over women to a minimum.</p>
<p>These changes combine to create a species that is motivated to help one another when trouble strikes, is routinely cooperative, and engages in a series of rituals and actions that cement trust. It might sound as though the authors have strayed pretty far afield from the question of how language emerged in human history, but their point is that without trust it would be foolish for speakers to risk revealing what they think, and it would be equally foolish for listeners to believe what they are told.</p>
<p>Trust is not easily found and maintained. It requires simple signals like smiles, bonding like shared laughter, and a series of reassuring ceremonies and actions.</p>
<p>This need for a trusting, helpful and cooperative species stands, no matter how you think language arose. Even if you accept Chomsky’s idea that language began as a way of thinking, it could only be externalized and become a means of communication once trust was established.</p></div><div class="feedflare">
</div>Hey Interesting Topic, What’s Your Name?tag:typepad.com,2003:post-6a00d83452aeca69e201bb09adafe5970d2017-07-10T17:53:45-04:002017-07-10T17:53:45-04:00I want to propose the embrace of an ugly word: logogenology (low-go-jen-ahl-oh-gee). It comes from three Greek words, logos [word], gennesi [birth], and logia [study of], and it names the study of language origins. In other words, it refers to...Blair<div xmlns="http://www.w3.org/1999/xhtml"><p><a class="asset-img-link" href="http://www.babelsdawn.com/.a/6a00d83452aeca69e201bb09adafde970d-pi" style="display: inline;"><img alt="Word Generator" border="0" class="asset asset-image at-xid-6a00d83452aeca69e201bb09adafde970d image-full img-responsive" src="http://www.babelsdawn.com/.a/6a00d83452aeca69e201bb09adafde970d-800wi" title="Word Generator"></img></a></p>
<p>I want to propose the embrace of an ugly word: <em><strong>logogenology</strong></em> (low-go-jen-ahl-oh-gee). It comes from three Greek words, <em>logos</em> [word], <em>gennesi</em> [birth], and <em>logia</em> [study of], and it names the study of language origins. In other words, it refers to this blog’s beat.</p>
<p>Normally I dislike academic coinages, but in this case I think we need to recognize that there is a community of scholars who began in many fields—e.g., linguistics, literature, biology, psychology, archaeology, and anthropology—who share common questions and are interested in one another’s results. Thus a biologist might learn from a linguist and come to a conclusion that is of more interest to that biologist than to most linguists. Instead of identifying themselves as biologists and linguists, it might be better to focus on their shared community and say, “I’m a logogenologist,” even if one has to add, “That’s somebody who studies language origins.”</p>
<p>I have come to this position after reading an interesting paper by two people calling themselves anthropologists, Chris Knight and Jerome Lewis. The paper is titled “<a href="http://www.journals.uchicago.edu/doi/pdfplus/10.1086/692905">Wild Voices</a>” and is published in <em>Current Anthropology</em>. They begin their essay, “Anthropology is the study of what it means to be human. So it must be at least part of our job to explain why it is that out of 220 primate species, only humans talk.” The authors seem to be claiming that explaining speech is a part of anthropology, but they concede immediately that their account of language origins requires taking the work from many other fields of study.</p>
<p>The third paragraph says: “A word of warning. The way we have constructed this article is novel, and we ask the reader not to be surprised that we conjoin a wide range of previously unconnected fields. Our basic idea is simple: using language is so closely bound up with everything else humans do—singing, ritual, kinship, economics, and religion—that no separate, isolable theory of its origins is likely to work.” While the authors seem to be writing for anthropologists, they acknowledge that their data comes from many other fields.</p>
<p>Members of the language-origins community will find nothing startling in the connections the authors make. So why not just admit that there is a community of scholars who use data originally developed in a variety of other fields to answer questions that are peculiar to the new community? The main logogenological question is <em>how did language begin</em>, and there are a variety of sub-questions too, such as <em>when did it begin</em>, <em>what bodily and cognitive changes were required</em>, <em>how did it become universal to the species</em>, etc. The first section heading in the Knight/Lewis paper poses a common sub-question of the field, “Why Do Only Humans Talk?”</p>
<p>The authors give a shockingly brief answer: “Since language is not a system for navigating within the physical or biological world, it follows that nonhuman primates—creatures whose existence is confined to the realm of brute facts, not institutional ones—will have no need for either words or grammar.”</p>
<p>What? Where did that premise come from? It seems to be based on an anthropological dictum that “words and grammar are means of navigating within a shared virtual world.” Here we see the circular trap that comes from acting as though one of logogenology’s contributory fields is able to answer logogenological questions. Anthropology is the study of the various virtual worlds (cultures and institutions) created by humanity. Thus, the element of language that interests anthropologists is how language helps members of a group navigate that virtual world. This foundation forces the answer to at least two sub-questions: (1) why do only humans talk? Other animals have no need for speech, and (2) when did speech begin? After humans had begun to create a virtual world rich enough to require help in navigating it.</p>
<p>Knight and Lewis might respond that it just happens that anthropology alone is sufficient to answer these questions. But Chomskyan linguists offer different answers. (1) Only humans talk because they alone are able to organize words according to a recursive syntax, and (2) speech began after a number of humans had developed the ability to think using that recursive syntax. The result of these rival answers is that anthropologists and Chomskyans quarrel a great deal and the work of science—drawing conclusions from empirical data—bogs down. Indeed, the claim to be a science looks laughable.</p>
<p>Let’s come at the questions from a logogenological perspective. (1) Why do only humans talk? The abstract answer is short enough: only the human lineage went through the series of evolutionary changes necessary to make language possible. What were those concrete changes? That is for logogenologists to determine. Anthropologists and Chomskyans both, if they want to work out these changes, must leave their field of training and work as members of the field studying language origins. (2) When did language begin? Before answering that we have to draw up a list of changes necessary for speech to be possible and discover when each of them appeared. The result will be a series of empirically validated answers, not a list of deductions based on a field’s a priori definitions.</p>
<p>The Knight/Lewis paper asks logogenological questions and takes its data from many fields but then tries to fit the answers into anthropology-shaped boxes. The authors need to recognize that they are no longer working as anthropologists and come at their conclusions from the same direction they asked their questions.</p>
<p>I am going to post a second report on the Knight/Lewis paper in a few days.</p></div><div class="feedflare">
</div>Attention and Languagetag:typepad.com,2003:post-6a00d83452aeca69e201bb09a4a091970d2017-06-11T23:19:54-04:002017-06-11T23:18:58-04:00The most important thing I have learned in working on this blog has been the relationship between language and attention. Language, I have concluded, works by sharing and directing attention to a topic. It is really that simple, yet it...Blair<div xmlns="http://www.w3.org/1999/xhtml"><p><a class="asset-img-link" href="http://www.babelsdawn.com/.a/6a00d83452aeca69e201bb09a4a08a970d-pi" style="display: inline;"><img alt="Attention" border="0" class="asset asset-image at-xid-6a00d83452aeca69e201bb09a4a08a970d image-full img-responsive" src="http://www.babelsdawn.com/.a/6a00d83452aeca69e201bb09a4a08a970d-800wi" title="Attention"></img></a></p>
<p>The most important thing I have learned in working on this blog has been the relationship between language and attention. Language, I have concluded, works by sharing and directing attention to a topic. It is really that simple, yet it is rich in implications.</p>
<p><strong>Evolvability</strong></p>
<p>Attention is widespread in the animal world and all primates, certainly all apes, are well endowed with the ability to direct their attention to different points in their environment and stay focused on a task for an undefined length of time. Thus, any special human attention tasks such as joint-attention, interactive attention, etc. that language might demand only call for tweaks of the system, not wholly new mechanisms. Anybody interested in language origins should find this approach to language simplifies the evolutionary puzzles.</p>
<p><strong>Demystifies meaning</strong></p>
<p>Meaning has always been a mysterious concept, rather like that of the soul, except that meaning is the soul of the word rather than of the body. How does meaning get into a word or sentence in the first place? Is it in the speaker’s head? Does the sound carry meaning to the listener’s head? Or is the meaning outside the body altogether?</p>
<p>These questions, which come up when considering thought experiments like the <a href="https://plato.stanford.edu/entries/chinese-room/">Chinese Room</a>, carry their own alarm bells. Where is the meaning? That question can only make sense if meaning is a thing. We can get rid of the confusion if we say meaning is not a thing but a response. Words pilot attention. All the many mysteries about where meaning is, how it is communicated, what changes it, etc. begin to look ridiculous as we see that all such questions assume that meaning has some kind of presence.</p>
<p>People can be said to understand a language when their attention is directed by the words and sentences of that language. Computers may be able to translate languages perfectly decently, but we can still maintain that they don’t know the meaning of what they are doing because their processing never involved directing attention. Any philosopher of language should appreciate the firmer basis on which to consider meaning. (Personal note: It was my recognition of the demystification that persuaded me to grab the attention idea and see how far I could run with it.)</p>
<p><strong>Grounds language in perception</strong></p>
<p>Attention is a function of perception, so it should not be surprising if language has many of the features of a perception.</p>
<ol>
<li>Perceptions are always perceptions of something and language is always about something.</li>
<li>Perceptions always have a point of view and speech does too.</li>
<li>Perceptions organize sensations into a foreground and background, and language can do the same. The foreground of an utterance is the focal point of attention. For example, if a person focuses auditory attention on a honking goose while only being vaguely aware of other sounds, a speaker can restrict an utterance to the focal point—<em>A goose honked angrily</em>—or include background details—<em>A goose honked angrily over the hens’ clucking sounds.</em></li>
</ol>
<p>There is enormous room for exploration here and this grounding in perception should provide much fodder for critics, gestalt psychologists, and psychologists of the newer, embodied-mind school.</p>
<p><strong>Explains syntactic structure</strong></p>
<p>Perception redirects attention and syntax works by controlling shifts in the listener’s attention.</p>
<p>I argue this case in detail elsewhere and am confident that attention-based syntax can explain even the strongest observations made in favor of a Universal Grammar, and it has the extra benefit of making sense. Syntactic structure reflects the limits of attention and memory and is not merely an arbitrary set of rules. Linguists with an interest in syntax should appreciate the approach, and composition teachers should like the way it provides students with a way to use grammar as a help rather than a stumbling block to clear writing.</p>
<p>(See <a href="http://www.mind-consciousness-language.com/Attention%20Based%20Syntax.pdf">Attention-Based Syntax</a> and <a href="http://www.mind-consciousness-language.com/Reflexive%20Anaphors.pdf">Reflexive Anaphora in Attention-Based Syntax</a>.)</p>
<p><strong>Learnability</strong></p>
<p>The fact that children master speech so easily has long been a mystery. Is it inborn or learned? It turns out the innate part comes from our ability to attend, to shift attention, and to remember. Anybody interested in children’s acquisition of language should find that the approach simplifies the task to be explained.</p>
<p>The greatest objection to this approach is likely to be that it depends on conscious rather than mechanical or computational processes. Attempts to model attention on computers generally treat attention as a passive filter of input, whereas attention here is seen as an active power that selects elements for conscious contemplation. But the dogma that the mind is the brain and the brain is a computer is only an assumption. When a different approach can make sense of so many aspects of a problem, it should take more than stubborn dogma to defeat it.</p>
</div>