Yesterday, I discussed Joan Acocella's strange misreading of two essays introducing the fifth edition of the American Heritage Dictionary ("Rules and 'rules'", 5/11/2012).

John Rickford wrote that "the patterns of variation and change … are regular rather than random, governed by unconscious, language-internal rules and restrictions" — but Ms. Acocella took this defense of "vernaculars that are commonly regarded as lacking rules", from a scholar known for his defense of "Ebonics", as a stalwart affirmation of prescriptive standards.

Steven Pinker tried to explain how false beliefs about standard usage, like No Split Verbs or No Final Prepositions, can become widespread — but Ms. Acocella took this attempt to distinguish between true and false beliefs, from the author of a popular book on "Words and Rules: The Ingredients of Language", as promoting the idea that "there are no rules", other than the false "old wives' tales" he debunked.

If you've read Acocella's review, you will have noticed something else about this hallucinated debate: she's really angry about it. In particular, she doesn't care for Hallucinated Steve Pinker at all.

You can see this animus in her gratuitous comment about his choice of terminology:

There are no rules, he declares. Or they’re there, but they’re just old wives’ tales—“bubbe-meises,” as he puts it, in Yiddish, presumably to show us what a regular fellow he is.

Pinker's own account of this choice:

I call them bubbe meises, Yiddish for “grandmother’s tales,” in tribute to the late language columnist William Safire, who called himself a language maven, Yiddish for “expert.”

Acocella's anger emerges again in excoriating the AHD's editors for publishing Pinker:

Most important is that the editors tried to pull descriptivists over to their side. In the most recent edition, the fifth, they have not one but two introductory essays explaining their book’s philosophy. […] For the editors of the A.H.D. to publish Pinker’s essay alongside Rickford’s is outright self-contradiction. For them to publish it at all is cowardice, in service of avoiding a charge of élitism.

But Pinker is the chair of the dictionary's Usage Panel, and so his essay, explaining "Usage in the American Heritage Dictionary", was included ex officio, not recruited as part of some sinister plot to placate descriptivists. (And the idea that Steve Pinker might have been recruited to protect John Rickford from "a charge of elitism" is truly hilarious.)

Acocella's ire at Evil Descriptivists emerges even more strongly in her discussion of the book she's reviewing, Henry Hitchings' The Language Wars: A History of Proper English. Since Hitchings debunks false claims about allegedly proper usage, he must be one of those "anything goes" guys — but wait, he admits that there are also real linguistic regularities! In her final two paragraphs, she combines this hallucinated contradiction with a common complaint about the hypocrisy of writers who use standard English in defending vernacular usage:

… the A.H.D.’s run for cover is not as striking as the bending over of certain descriptivists, notably Hitchings. Having written chapter after chapter attacking the rules, he decides, at the end, that maybe he doesn’t mind them after all: “There are rules, which are really mental mechanisms that carry out operations to combine words into meaningful arrangements.” We should learn them. He has. He thinks that the “who”/“whom” distinction may be on its way out. Funny, how we never see any confusion over these pronouns in his book, which is written in largely impeccable English.

No surprise here. Hitchings went to Oxford and wrote a doctoral dissertation on Samuel Johnson. He has completed three books on language. He knows how to talk the talk, but, as for walking the walk, he’d rather take the Rolls. You can walk, though.

If a student so badly misunderstood simple ideas expressed in clearly-written texts, the usual response would be to bemoan the decline of critical reading skills in kids today. But Ms. Acocella is no student — and her misreading of works on the norms of usage has, I suspect, been part of her magazine's culture for more than fifty years.

Consider this passage from E.B. White's letter to J.G. Case, his editor at Macmillan for The Elements of Style, dated 17 December 1958 (emphasis added):

I was saddened by your letter — the flagging spirit, the moistened finger in the wind, the examination of entrails, and the fear of little men. I don't know whether Macmillan is running scared or not, but I do know that this book is the work of a dead precisionist and a half-dead disciple of his, and that it has got to stay that way. I have been sympathetic all along with your qualms about "The Elements of Style," but I know that I cannot, and will-shall not, attempt to adjust the unadjustable Mr. Strunk to the modern liberal of the English Department, the anything-goes fellow. Your letter expresses contempt for this fellow, but on the other hand you seem to want his vote. […] In your letter you are asking me to soften up just a bit, in the hope of picking up some support from the Happiness Boys, or, as you call them, the descriptivists. (I can write you an essay on like-as, and maybe that is the answer to all this; but softness is not.) […]

All this leads inevitably to like-as, different than, and the others. I will let them lay for the moment, sufficient unto this day being the etc. My single purpose is to be faithful to Strunk as of 1958, reliable, holding the line, and maybe even selling some copies to English Departments that collect oddities and curios. To me no cause is lost, no level the right level, no smooth ride as valuable as a rough ride, no like interchangeable with as, and no ball game anything but chaotic if it lacks a mound, a box, bases, and foul lines. That's what Strunk was about, that's what I am about, and that (I hope) is what the book is about. Any attempt to tamper with this prickly design will get nobody nowhere fast.

P.S. When I said, above, that Macmillan would have to take me in my bare skin, I really meant my bare as.

Key elements of Acocella's 2012 screed are present in White's 1958 letter: in particular, the view that publishers are conspiring behind the scenes to placate hypocritical liberal descriptivists, "little men" who believe that "anything goes".

And a brief examination of E.B. White's "bare as" may help explain his anger — and Acocella's. There's an excellent discussion of this issue in the entry for like, as, as if in Merriam-Webster's Dictionary of English Usage. I invite you to read the whole thing, which I will summarize briefly here.

In the 1950s, shortly before E.B. White's letter to his editor, an advertising slogan ("Winston tastes good, like a cigarette should") created a storm of controversy in the popular press about the use of conjunctive like. In the New Yorker, the editors Viewed With Alarm this "obnoxious and ubiquitous couplet", with its "pesky 'like'". Strunk (and Strunk & White) condemned this "illiterate" usage, which "lately … has been taken up by the literate, … who use it as though they were slumming".

There was a problem with this picture, as some of those Happiness Boys noted: it got the facts wrong. To quote MWDEU:

… information published in the Middle English Dictionary shows that like by itself was used as a conjunction as long ago as like as was — from the late 14th century. […] Chaucer used it in about 1385 to introduce a full clause in The Complaint of Mars.

Conjunctive like was also used by Shakespeare and others around 1600, but it became rare during the 17th and 18th centuries. However,

In the 19th and 20th centuries conjunctive like becomes much more common. Jespersen 1909-49 (vol 5) tells us that "examples abound" and lists them from Keats, Emily Bronte, Thackeray, George Eliot, Dickens, Kipling, Bennett, Gissing, Wells, Shaw, Maugham, and others. So we must conclude that Strunk & White's relegation of conjunctive like to misuse by the illiterate is uninformed.

This is the kind of "attempt to adjust the unadjustable Mr. Strunk" that E.B. White was so upset about. And when deeply-felt beliefs and allegiances come into stark conflict with the facts, an emotional response is normal.

Sometimes, people who are faced with such conflicts can adjust their beliefs and allegiances to accord with reality. But others respond with denial, willful misunderstanding, and conspiracy theories. You can see that sort of response in climate-change denialists — and it seems that the staff of the New Yorker have been English usage denialists for more than half a century.

62 Comments

"I will let them lay for the moment." Hmmm. In-ter-est-ing. Ver-r-r-r-y in-ter-est-ing. Would not have expected that from White, under the circumstances. Still doesn't swing me into the Pullum camp of total disdain for the man; but that's because I'm an editor, not a linguist.

[(myl) I took White's "let them lay" to be ironic, like his "bare as".]

@Dick Margulis: Total disdain is not required; White had great skill as a writer, and any editor can see that. What is required is (at a realistic minimum) a lack of respect for White's non-existent skills as a grammarian.

Plus some knowledge of the damage White's zombie "rules" have wrought in American letters, of which the Acocella article, the Hitchings book, and Mark's criticism are merely very recent instances.

Circe said,

But the most curious flaw in the descriptivists’ reasoning is their failure to notice that it is now they who are doing the prescribing. By the eighties, the goal of objectivity had been replaced, at least in the universities, by the postmodern view that there is no such thing as objectivity: every statement is subjective, partial, full of biases and secret messages. And so the descriptivists, with what they regarded as their trump card—that they were being accurate—came to look naïve, and the prescriptivists, with their admission that they held a specific point of view, became the realists, the wised-up.

When did this highlighted event happen? I only went to university in the 2000s, but no one told me we had done away with the notion of "objective truth" now.

Also, what is "subjective, partial, full of bias[es] and secret messages" about the statement: "The speed of light is a constant in every inertial frame of reference"?

[(myl) I suspect that most "descriptivist" linguists, with their naive belief in (at least semi-) ascertainable facts, logical connections, and statistical evidence, will be quite happy to have been out of step with that fashion.]

Deniz Rudin said,

The most frustrating thing about popular accounts of the so-called usage wars, like Ms. Acocella's, or indeed like the late David Foster Wallace's, is that they make the mistake of considering descriptivism and prescriptivism as two opposing but more or less equal viewpoints—as two different approaches to the same problem. But the fact is that there is no such thing as a descriptivism vs. prescriptivism debate within a genuine scholarly community. The very idea is absurd, because descriptivism and prescriptivism are not the same type of thing. Descriptivism is an investigatory approach to the formal study of language, and it is uncontroversial in linguistics departments because it is the only sane approach—nobody opposes descriptivism in biology, or argues for a prescriptivist physics. Prescriptivism, on the other hand, is a branch of etiquette columnry—prescriptivists advise us of what the more embarrassing solecisms are, so that we can in avoiding them be judged by the cultured to be one of their own. To suppose that there is general conflict between the two -isms as competing philosophies is absurd.

There is a reason why dictionaries tend to be the fulcrum on which these discussions turn, and that is because dictionaries are more publicly visible than other products of language scholarship, and they have a great deal of symbolic value to prescriptivists. Linguistic intellectuals and classist pedants have different conceptions of a dictionary's use. To the grammarian-elitist a dictionary should be a tome of rigorously ascetic correctness, the harsh schoolmaster who will tell you that half the words and turns of phrase coming out of your mouth need to be purged in order for your language to be pure and correct; to the linguist, a dictionary should be a portrait of the language as it stands at present, always of course incomplete, and scrabbling to keep up with each new edition. The "battle," as the pundits refuse not to call it, between these two points of view has long ago been lost, because the prescriptivist ideal of a dictionary makes no sense: it requires the assumption that there is such a thing as the mystically-correct standard variety of the language, and that any deviation from that standard is a devolution, a contamination—a viewpoint that is impossible to sustain in the face of actual dictionarymaking work, which confirms again and again that language change is constant, implacable, and if not always beautiful and enriching then at least rarely corrosive and never debilitating. Our dictionaries increasingly embrace new and niche words, novel usages, and grammatical shifts, to the increasing carping of the literary community's pool of wannabe public intellectuals who have named themselves language's staunch defenders in absentia of any formal understanding of its operation. But how else should our dictionaries work? After all, language comes from us, and so why should we visit the dictionary to prove each other's natural usages wrong? When the way we speak isn't in the dictionary, we revise the dictionary!

The way I've described it, the whole descriptivism vs. prescriptivism mudslinging contest seems like an inconsequential quibble with no real effect on anything—the self-righteously irrelevant old guard and the cloistered academy taking potshots at each other from miles away. However, there is a reason that linguists get apoplectic about this stuff, and that reason, mentioned all too little in screeds on both sides, is pedagogy. When I read Ms. Acocella's review, what struck me is that her total ignorance of what language is and how it works, though exceptional in its audacity and shrill tenor, is extremely commonplace in its content and in its unselfconscious baldness. I have no knowledge of Ms. Acocella's schooling, but her understanding of language and linguistics is typical of a product of American schools—I say this as a product of one myself. In primary school in this country, the formal language curriculum consists almost entirely of prescriptivist etiquette taught as though it is science, resulting in the diploma'd horde going forth into the world believing that our language needs to be taught to us, and that most of us aren't doing a great job learning it. See, for example, Ms. Acocella's indignant remark in the closing paragraphs of her article: Acocella quotes a passage from Hitchings' book which she doesn't understand a word of except for "there are rules," which she latches on to in her follow-up: "We should learn them."

This notion that the rules of language are things which need to be explicitly taught to us, and which we must struggle to learn in order to speak correctly, is a major piece of public ignorance, and a legacy of language education in our schools. The reason why linguists care about this stuff is that it is possible, in fact it would be extremely easy, to do meaningful language education in primary school—if our elementary language education involved, instead of lessons in what not to say, some basic linguistics, we could have future generations of graduates find it just as embarrassing to not know IPA as it is to not know elementary algebra today.

I have been in Ms. Acocella's position. When I entered the University as an undergraduate, I had no idea what linguistics was, but I thought of myself as a lover of language in spite of my almost-total ignorance of its workings. I took my first linguistics class because I was suffering under a misconception—inspired by the grammarian writings of David Foster Wallace, I took the class because I thought explicit knowledge of linguistic structure would make me a better writer. That linguistics became a major passion for me, and disabused me of many of my ignorant notions about language, is one of the happier accidents of my life—and if language was taught competently in our K-12 schools, it needn't have been an accident.

Pflaumbaum said,

So I've been having this argument with a friend, and I've hit a brick wall. He's very erudite and, like the freshman Deniz, a 'lover of language' who sets great store by the observance of traditional prescriptive rules. It started when I mentioned that most linguists don't classify the will construction as a future tense. He regards this as a silly and arbitrary denial of the ordinary usage of 'tense'. I've directed him to various LL posts and other sources, and he's dutifully read them all, but he disagrees from first principles – he doesn't see how linguistics can be a descriptive science.

I'm an ex-classicist who studied historical linguistics. We had a few classes on the history of linguistics – 'From Saussure to Chomsky', sort of thing. But I lack the theoretical knowledge of the discipline to make the case well. Has anyone on here read Hitchings' book? If so, does it make that case? Or can anyone direct me to a book or article that explains how the discipline has developed over the last century or so, and in what areas there is consensus and disagreement today?

Garrett Wollman said,

I'd like to second Deniz's comments. A great deal of the metaphorical impedance mismatch between language scholars and (certain) language users is that language scholars see language as a subject for scientific study, whereas the users see it as a reflection of the moral character of their fellow users. Science, of course, doesn't tell us how we ought to behave (or speak), it just tries to establish the facts of the observable world. But to those users and usage commentators, this is nothing but lily-livered moral relativism — for them, right and wrong are more salient categories than true and false. I think this is one of the reasons you see so much nonsense written about language in the journalistic setting: at the core, many journalists and their editors don't understand claims about language as matters of fact and fiction, but rather as moral questions or mere matters of opinion, and thus not subject to the sort of fact-checking that would apply, for example, to a direct quotation from a public figure. They are thus more or less impervious to evidence refuting their claims.

In the neighborhood of the intersection of analytical philosophy, statistics, and theoretical computer science, there is a way of thinking about the world that I think society would benefit enormously from, if it were taught starting at a very young age. (And I think this can be done, in a broad-brush qualitative sort of way, as a part of the science, math, and language arts curricula.) Students need to learn how to distinguish between factual and moral claims, what sort of evidence might be required to believe claims of each kind, and at a more fundamental level, just what the nature of evidence is. I don't know how you make this happen, however, when so many of the adults responsible for the educational system are at least as confused as the editors of The New Yorker.

Circe said,

Thanks for the pointers. Yes, I did hear of the Sokal affair (I work in an area very close to his own research interests), but I always had the idea that he was just making fun of a vocal minority in a very narrow sub-field of literary criticism. I never thought that the ideas of "post-modernism", at least as I understood them in my brief encounter with the Sokal affair, would pass muster in any other field, since disruptive contacts with reality will happen far too often.

Circe said,

In the neighborhood of the intersection of analytical philosophy, statistics, and theoretical computer science, there is a way of thinking about the world that I think society would benefit enormously from, if it were taught starting at a very young age.

It's not clear to me from the post, but are you referring to the paradigm of Bayesian learning?

John said,

Circe: You say that as if they considered such disruptive contacts a bad thing! That may be the very air the extremist po-mo breathe, the blood that courses through their veins. Why use one word when twenty will do? Paragraph-length sentences only demonstrate the ineffable, superior mind. You find something said contradictory or contrary to fact? That's only because you're imposing your own, biased view of the fact. Untether all anchors… meaning is what you make of it, not what anyone tells you it means.

Thankfully, this infectious disease is slowly dying out in the liberal arts establishment. Not fast enough, but at least the disease is in remission.

Garrett Wollman said,

@Circe: in a word, no, I am not. Nothing so specific, at any rate. Much more fundamental: what is the nature of evidence, and of proof, and how do these things relate to truth and falsehood, right and wrong. Irreducible error of measurement (even in counting!) and various kinds of bias; reliability of eyewitness evidence and cognitive priming. What sorts of things are possible to compute, and what things are impossible even in principle? Is it legitimate to make inferences about the future from past observations (the problem of induction) and if so, under what conditions? The "hot hands" fallacy and other sorts of cognitive biases that make us see randomness in processes that are actually uniform, and structure in processes that are actually random. I think fourth-graders can learn these things, if presented properly, and I think that they need to be introduced that early, before they start learning intellectual rigidity. The way these sorts of subjects are usually taught requires a certain level of rigor and mathematical sophistication that is not actually necessary for the sort of qualitative understanding that I'm looking for here; many of them can be easily demonstrated in the classroom.

Xmun said,

I do wish people would stop treating "descriptivism" as the alternative to "prescriptivism". Forget prescriptivism. The true contrast is with the historical approach to the study of language. Descriptivism is the study of a language as it is spoken and written today. Historical linguistics is the study of the development of a language (or languages) down the years. And it's a richly rewarding, though currently rather unfashionable, field of study.

[(myl) In my experience, most linguists are interested in language change and language history as well as in synchronic description, and rather than being unfashionable, the study of language change is the locus of some of the most interesting and highly-regarded current work.

I do agree that the pre-/de- dichotomy is a false one, though not for the reason you cite. John Rickford's work is an excellent case in point — he supports the idea of teaching American schoolchildren to master the standard formal language, but as a way of doing this, he advocates also teaching a respectful analysis of vernacular varieties. This is description squared, so to speak, combined with a frank discussion of the racial, regional and class associations of the various varieties, as well as a discussion of the role of formality and similar distinctions of register.]

if language was taught competently in our K-12 schools, it needn't have been an accident.

This is what I keep saying. America was at the forefront of linguistics for the better part of a century, a scientific approach to language was available well before World War II, and yet to this day schoolkids are taught nothing but the old wives' tales their great-grandparents were taught (mixed in with some sprinkles of "phonics" or whatever the latest fad is). Imagine how upset people would be if their children were being taught the four-humors theory in school, or Ptolemaic astronomy! And yet the parallel situation in the field of language is regarded with the utmost complacency, except by the few who have been lucky enough to study the field in college.

D.O. said,

Maybe the lack of an English Academy is not such a boon after all. If you take Russian, for example, there is rarely any doubt as to what is the correct way of saying something. All necessary rules and prescribed usages are known and written down somewhere. So if an argument arises, it can be settled reasonably easily by consulting that "somewhere". It makes any linguistic discussion devoid of much of the emotion. Needless to say, it does not change the fact that actual usage may be quite far from the prescriptions, but who cares.

"This notion that the rules of language are things which need to be explicitly taught to us, and which we must struggle to learn in order to speak correctly, is a major piece of public ignorance, and a legacy of language education in our schools."

I just wrote much the same thing in my comment on the previous post. I think that combating this misunderstanding of what language is and how we learn language is a prerequisite. As long as people think of language and language usage and instruction in these terms, it's not possible to get beyond this peeving prescriptivism. Their incorrect assumption basically demands their conclusion (though it's likely that there's a mutually reinforcing relationship between the assumption and the conclusion, both of which are intuitive and thus probably involve some emotional investment).

"Science, of course, doesn't tell us how we ought to behave (or speak), it just tries to establish the facts of the observable world. But to those users and usage commentators, this is nothing but lily-livered moral relativism — for them, right and wrong are more salient categories than true and false."

But this is a general and deep problem for all social science. It's interesting that Pinker is involved in this discussion because outside linguistics, we see the same acrimonious arguments where there's one side which is inclined to take a position that necessarily everything involved with human behavior, cognition, and social organization is moral because there are no underlying structures upon which these things depend. Every description of what humans do is, in that view, implicitly a discussion of what people should or shouldn't do. Yet, the extreme on the other side makes what also amounts to a moral argument by endorsing what is "naturally necessary" as what is "best" or "right" — at least implicitly, but often explicitly, this argument is made. When people think about what humans are and what humans do, they are almost always thinking on some level about morality and "should be" and not just rigorously about "what is".

I don't want to be naive and argue that, given how complex the interaction between organic, inherent, implicit "rules" and arbitrary convention really is, we could all do this science and have these discussions without implicitly taking some moral positions, even if inadvertently. However, I do agree with you that things could be greatly improved. And that would involve two requirements: that we understand and accept that how people behave is necessarily the complex product of both inherent nature (and not necessarily biological nature, but also the inherent emergent organization in complex systems that involves "rules") and arbitrary convention; and that when doing science on these things and discussing them in the public sphere, it's all too easy to present what are effectively (or intentionally!) normative and essentially moral arguments. We can be making a moral argument when we defend an arbitrary and authoritatively-determined practice as "necessary" and "best", or we can be making a moral argument when we defend a practice on the basis that it's inherently "necessary" and "natural". We can be making moral arguments when we attack those two positions. In other words, we need to be very careful about this stuff.

But that takes a lot of effort and I don't see that happening any time soon for the average class of non-specialists who, say, write about such things in popular magazines. And it's certainly not going to happen with regard to language usage until if and when a majority of such people understand that language is most certainly not some entirely arbitrary system that was designed and continues to exist because usage is explicitly taught.

Xmun said,

myl: "the study of language change is the locus of some of the most interesting and highly-regarded current work"
Would you be kind enough to list a few references? Preferably no more than five. My remark was based on the infrequency of mention of historical linguistics in LL.

The works I have on my shelves are no doubt rather elderly by now: Introduction to historical linguistics, 3rd edn (Terry Crowley); A History of language (Steven Roger Fischer); and the historical sections of English in Australia and New Zealand (Kate Burridge and Jean Mulder).

Rylon said,

If a student so badly misunderstood simple ideas expressed in clearly-written texts, the usual response would be to bemoan the decline of critical reading skills in kids today.

I actually would like to test the idea that the essays are clearly written. I haven't read them (I don't own the latest AHD), so I don't have an opinion one way or the other. But I think it would be interesting to get 100 college-educated adults who speak English fluently to read both essays and see if their interpretation matches the authors' intentions, more closely follows Acocella's, or something else.

[(myl) I would hope that they would read more carefully than you do: links to both essays were provided in the original post…]

Cy said,

If he can't concede that linguistics is descriptive, then forget it. I mean, tense is marked in English on the verb – will eat and eat are both the same 'tense,' so to speak. Grammatically. Pragmatically of course they're different, with different usages etc. But if we had a distinct future tense, we'd have to have something to slot into the third position of 'eat-ate-…' If he can't see that, if he thinks maybe that 'will eat' is actually 'willeat' or something, well then that's a productive and interesting beginning to a good conversation. What are words? Verbs? How does orthography affect usage? Are 'shall,' 'going to,' and 'gonna' also different tenses? Where are my glasses? etc.

He thinks that the “who”/“whom” distinction may be on its way out. Funny, how we never see any confusion over these pronouns in his book, which is written in largely impeccable English.

This seems to imply that to the author, the fundamental linguistic question is: how should we decide which rules we require everyone to follow? Apparently in her world there are three kinds of answers:

a) We should require everyone to follow the correct rules, and the way to find out which rules are correct is to ask someone who already knows. (The "prescriptivist" viewpoint).

b) We should require everyone to follow a fluid set of rules that we change regularly to make them match some kind of statistical average of real-world use of language. (The "descriptivist" viewpoint, as far as a prescriptivist understands it).

c) We shouldn't require anyone to follow any rules at all. (The "anything goes" viewpoint.)

Now, since Hitchings speaks of rules, he probably isn't in group (c), and he sure as hell isn't in group (a), so by elimination he must be in group (b). But, Ha!, he's not himself following the rules he's constructing so he's either a hypocrite or dishonest. Between the lines I sense a conclusion that he's probably a wishy-washy relativist hippie who's trying to masquerade as an honest citizen by speaking of rules without really believing in them.

The underlying fallacy here, which seems to inform the entire diatribe, is the assumption that the inherent purpose of rules is so that one can require people to follow them. That is of course nonsense to anyone with a scientific background — in science a "rule" is just a pithy summary of an observed regularity of the world.

In contrast to Xmun and others, I think there is a meaningful distinction between "prescriptivism" and "descriptivism". However, it's not based on a difference in which conclusions one prefers — it's a difference in which question one asks in the first place.

Rylon said,

I didn't say they were unavailable to me. (Since, in addition to following the links, I could just read them at a library or possibly a book store.) My point was only to say that I didn't (and couldn't, yet) have an opinion on the matter.

I'm not entirely sure why you felt the need to be insulting. What if a lot of people really do misunderstand the essays? Wouldn't that be worth knowing?

Helena Constantine said,

When reading Prof. Rudin's excellent response, my first thought, as a Classicist, was that prescriptivism is easily described as wanting to turn English into a dead language. I can tell my composition students what is right and what is wrong with very great authority, based on their work's agreement with the usage of Cicero. Strunk et al. want the same authority in English. But then it occurred to me that what prescriptivists are doing is really more like Tolkien inventing Elvish on the basis of Finnish. They want their own version of English, one based on the spoken language but transformed according to what they consider ideal, by their own lights. Their version of English is entirely personal and in no way objective, as they claim. Professor Liberman's comparison with global-warming denialists is quite apt.

Garrett Wollman said,

@Cy: the consensus in the world of linguistics these days seems to be that "tense" refers purely to morphology and nothing more. But both usage authorities and language instructors use the term much more loosely, to mean any sort of temporal marking in a verb phrase, whether in the morphology of the main verb or through use of auxiliaries. Similarly for "mood", "aspect", and so on. (This might be by false analogy to classical languages, which all have "loose tense = strict tense" structures, but the existence of this usage is a verifiable fact in any case — just look at any 50-year-old textbook or peevograph.)

Rubrick said,

The competition from elsewhere in this blog is tough, but this post may have set some sort of record for highest level of discourse in the comment thread of an internet post about language. (Of course, myl does "cheat" by moderating comments….)

Iain said,

Just an insignificant side-note, but I believe the "fear of little men" in E.B. White's letter is not referring to descriptivists as "little men", but rather referencing William Allingham's "The Fairies", equating his editors' trying to keep up with the times to following irrational superstitions about reading entrails and believing in fairies.

Imagine how upset people would be if their children were being taught the four-humors theory in school, or Ptolemaic astronomy!

They would be very upset indeed, but not for the reasons you think. People want their children to be taught *exactly the same things* that they were taught. If people had been taught four-humour theory when *they* were at school you can bet your bottom dollar they would be demanding that their children be taught the same.

marie-lucie said,

Many American parents would love to hear that their children were being taught Ptolemaic astronomy (without using the pedantic term "Ptolemaic"), since it is quite consistent with the cosmogony presented in the Old Testament.

J.W. Brewer said,

marie-lucie, do you have some polling data on what curriculum reforms are supported by what percentage of American parents that would help quantify that "[m]any"? From the internet (current as of 1999):

Probing a more universal measure of knowledge, Gallup also asked the following basic science question, which has been used to indicate the level of public knowledge in two European countries in recent years: "As far as you know, does the earth revolve around the sun or does the sun revolve around the earth?" In the new poll, about four out of five Americans (79%) correctly respond that the earth revolves around the sun, while 18% say it is the other way around. These results are comparable to those found in Germany when a similar question was asked there in 1996; in response to that poll, 74% of Germans gave the correct answer, while 16% thought the sun revolved around the earth, and 10% said they didn't know. When the question was asked in Great Britain that same year, 67% answered correctly, 19% answered incorrectly, and 14% didn't know.

Given notably higher levels of secularization in the UK and Germany, devotion to a certain literalistic approach to the Old Testament would not seem to be the driving factor behind skepticism about heliocentrism. I don't know why the US would have a lower percentage of "don't knows."

More generally, I would suggest that people consider the hypothesis that the median New Yorker reader and/or writer believes in prescriptivist poppycock for more or less the same reasons he/she (prototypically a quasi-literary type who has at best vague memories of high school science classes) believes in some pop-science version of anthropogenic global warming and some pop-science version of Darwinism. These are all ways of signalling social class/tribal affiliation and distinguishing oneself from those benighted primitives on the other side of the Hudson. A devotion to truth for its own sake really does not enter into it. Some of these signalling beliefs may approximately coincide with the truth and others not, but which will be which will depend on highly contingent cultural/historical factors.

marie-lucie said,

bks, like most people, you are confusing prescription (something needed from time to time to regulate behaviour) and prescriptivism, the belief that everything (especially in language) needs to be prescribed (or proscribed); otherwise there would be chaos. Some things are naturally ordered (say, the motions of the planets, the march of the seasons), or order themselves spontaneously (e.g. the iron filings next to a magnet, the structural elements of a language), but others need a social convention (e.g. which side of the road to drive on). Whether you drive on the right or the left is the law in the sense of a social convention, imposed by a recognized authority; it cannot be left to the individual, since having no convention at all would put everyone at serious risk. Whether a language puts the verb in first or second place in a sentence, or allows particles before or after a noun, and such, are also social conventions, although so old that we don't think of them in that way. These conventions were not handed down from on high but evolved naturally from the interactions between speakers of each language, sometimes influenced by other languages.

The roots of prescriptivism started when Europeans began to study ancient languages and realized that those languages had a structure quite different from their own. Since those languages were considered more prestigious than the modern ones, some people – especially in England, I think – formulated rules, or rather prohibitions, which would make their own languages less different from the ancient ones. As discussed above and in other posts over time, for English such rules started as guidelines for "improving" the language but for some people have evolved into almost the equivalent of the Ten Commandments. For a couple of centuries they have been regularly flouted, especially in casual speech, because they run counter to the normal functioning of the English language.

This is not because they are more difficult than more traditional English structures: not splitting an infinitive is a lot easier than forming a question, for instance (ask any ESL learner). Is "to go boldly" more difficult to say than "You wouldn't have seen my wallet, would you?" Yet the first one sounds unnatural and is very rarely uttered, while the second, a very complex structure, is nevertheless quite unremarkable for English speakers. No prescriptivist bothers to give rules for the second type of sentence: growing up among English speakers is enough to acquire it, while most foreign learners of English find it very hard going. So "don't split an infinitive" and a few others have become the "grammar" that needs to be forced on speakers, often with poor success and resulting feelings of humiliation, while the same speakers handle much more complex structures with ease and precision but are not made aware of their achievement.

marie-lucie said,

JWB: marie-lucie, do you have some polling data on what curriculum reforms are supported by what percentage of American parents that would help quantify that "[m]any"? From the internet (current as of 1999):

No, I don't have polling data or percentages on this topic, but I do read the American press. Since 1999 (thirteen years ago already) there have been "rightward" social changes in the US, with many more people home-schooling their children (for various reasons, often strict "Bible-based" religious ones), calls for teaching "creationism" in public schools, denigration of science in general, etc, along with the rise of extreme "conservative" (read "reactionary") politics. I used "many", not "most". 18% is far from a majority, but it is a sizable minority. I doubt that the percentage would be smaller nowadays.

I'd like to make a distinction between "teaching Ptolemaic astronomy" and "teaching geocentric astronomy as fact". Because, as it happens, I studied Ptolemaic astronomy in college. Aside from its historical importance and its relevance to some later mathematical developments, it was certainly not some simpleminded and obviously wrong theory. Until Galileo's telescope, it was empirically flawless. And the geocentrists were much more sophisticated and aware than people commonly think — the mathematically more elegant heliocentric theory was known even to the classical Greeks, and certainly to Ptolemy himself. In fact, he mentions in the Almagest that a heliocentric model simplifies the geometry elegantly…but that it would require accepting some ideas that seemed clearly absurd: that the Earth is in motion, and that, given the lack of observed parallax, the stars are unimaginably, impossibly far away.

It would be stupid to teach geocentrism as scientific fact, of course. But it's not stupid at all to approach it as part of a more comprehensive understanding of basic astronomy, math, and the history of science. People are taught in school many facts about astronomy today of which they have no real comprehension — it's not as if a lot of this stuff is actual knowledge in any true sense.

When science is taught as a body of facts that every educated/intelligent person knows today is true and which repudiates all past, and ignorant, contemptible falsehoods, it badly misrepresents to students what science actually is, how it works, and what it means to know something. It encourages people to be credulous when they should still be skeptical and it encourages people to be skeptical, even cynical, when they are confronted by examples of how science is ambiguous, difficult and always a work-in-progress. People with an authoritarian bent, especially, will latch onto whatever absolute truths are presented to them as science and then later feel betrayed when consensus scientific opinion changes. This drives a good portion of the climate denialism, not just economic self-interest.

It's also involved in issues like the one we're discussing here, because many people were taught prescriptivist ideas as empirical facts about language as an object of study, by misguided high-school teachers or parents or other people who spoke authoritatively about language qua language. It's no accident when and where prescriptivism really took hold — it's the product of a naive and typical late-nineteenth/early-twentieth-century scientific reductionism as applied to language, in the context of a fetishization of Latin. A lot of prescriptivism is presented as empirically derived analytical facts about the nature of English. And my point is that a reflexive appeal to contemporary scientific authority is not helpful, because that's precisely the mindset that the prescriptivists are working from. It's just that their version of "contemporary scientific authority" is the authority which was presented to them first, one which actually isn't an authority on linguistics, but certainly seems to be an authority on English.

I do absolutely agree with the lament that linguistics isn't taught, in any meaningful way, to language students. Some pretty important scientific knowledge about language has accumulated that is utterly absent from basic education and common knowledge, and this absence leaves a space in which a lot of prescriptivist nonsense thrives. But it shouldn't be taught with the kind of hubris that White and so many others show. That's just laying a foundation for later problems of the same kind, but with different specifics.

Yet another John said,

@marie-lucie: "The roots of prescriptivism started when Europeans began to study ancient languages and realized that those languages had a structure quite different from their own…"

Surely the roots of prescriptivism (if it even has roots, if its beginnings are not as old as human culture itself) go back much further than that. The *Appendix Probi*, a Latin text from the 3rd century AD, is a compendium of supposed errors in popular speech (a.k.a. Vulgar Latin, or Proto-Romance).

I recall reading that the written language(s) of the ancient Egyptian civilizations (written in hieroglyphs) were very conservative across several millennia of use. Maybe an Egyptologist will clarify/correct me, but surely some of the Ancient Egyptian scribes must have had the same kinds of pseudo-moral prescriptivist beliefs about the "incorrectness" of newly-evolved forms in their own language ("back in the time of Cheops, people knew how to speak and write well, but ever since Rameses…"). And I'd make a nickel bet that in any complex society which was large enough that different social classes and different regions had different ways of speaking, the same kinds of prescriptivist beliefs would have flourished (among the Mayans, the Hittites, etc…). This isn't just a 19th-century neo-Classical thing.

Yet another John said,

It seems that the degree to which people feel righteous anger over a subject is highly correlated with how certain they feel about the correctness of their own beliefs. Unfortunately I think that it is less well correlated with the degree to which it affects human welfare and happiness. Start to talk about the Monty Hall Problem at a party, and it's normal that some guests might start to raise their voices in frustration trying to argue that "obviously you can't win more than 1/2 the time." But bring up the subject of the resurgence of Dengue fever in the past half century, which kills thousands of people each year, and it is doubtful that any such shouting match will ensue; if one of the guests showed as much passion about the effective distribution of mosquito nets as Ms. Acocella does about the "language war," then most people would find that a bit strange.

I'd conjecture that this has a lot to do with the level of fury that we see from some of the language prescriptivists. Once you learn some of these zombie grammar "rules," like not to use passives, not to split infinitives, and so on, it is not so hard to work yourself into a lather over the sheer *number* of examples of other people writing things that are so *obviously* incorrect, people too lazy to merely change "to boldly go" to "to go boldly." But getting just as passionate about Dengue fever requires actually defending one's proposal for action against a gamut of alternatives where the ideal course of action seems less clear: how much to promote the use of DEET vs. mosquito nets, when perhaps there are health and environmental concerns with DEET? How to get the affected communities to use mosquito nets more, how best to distribute them to those who need them the most, how to most effectively advocate for their use, and so on?

In the face of uncertainty caused by considering two or more alternatives that have been presented in a way that makes each of them sound reasonable, people's passions tend to be subdued and they shrug their shoulders — "it's complicated." If they're in the right mood, they may even start to think. But plant in their mind the idea that one side is right or correct in a way which is impervious to evidence, and they will have full license to express anger, aggression, annoyance, etc. And so they will.

[And if any actual psychologist reading this wants to comment, I'd be fascinated to hear… it's not so clear to me how to begin to search the psychology literature on "causes of anger."]

Xmun said,

marie-lucie said,

Yet another John: thanks for the link to the Appendix Probi, which is a collection of vocabulary, mostly concerned with pronunciation, not with syntactic structure like the various "rules" or rather prohibitions of English such as not splitting an infinitive, etc. As such the AP is a gold mine for linguists concerned with the evolution of spoken Latin. The proscribed items reflect popular pronunciation (a stage toward the modern Romance languages), the prescribed ones reflect an older pronunciation probably taught along with the Latin classics in the Roman Empire.

It is true that prescriptions must have existed in many cultures, but not necessarily prescriptivism, the attitude that naturally spoken language is wrong unless rules are formulated, not according to how educated people such as "our best writers" actually speak or write, but how they should speak and write according to self-appointed pundits, by explicit or implicit reference to another, more prestigious language such as Latin. Unlike the Egyptians and the others you mention, whose reference would most likely have been their own ancient tradition, the 18th-century English pundits were not deploring the degradation of the language since Shakespeare, for instance, but condemning much of his work as hopelessly ungrammatical – the standard reference being the grammar of Latin.

the other Mark P said,

* Driving on the right, not just a nice idea, it's the law. (U.S. English)

* Driving on the left, not just a nice idea, it's the law. (British English)

There is a role for prescriptivism.

But language prescriptivists tend not to believe that their prescriptions are arbitrary. They will insist that there is a good logical reason for not splitting infinitives or not ending a sentence in a preposition.

Once you accept that most rules are arbitrary, it becomes very hard to defend them as "the truth".

"But language prescriptivists tend not to believe that their prescriptions are arbitrary. They will insist that there is a good logical reason for not splitting infinitives or not ending a sentence in a preposition."

As has been exhaustively cataloged in this blog, prescriptivists are not that picky about their justifications for prescriptivist claims; what they're adamant about is that there necessarily is some justification. They'll pick and choose as convenient, they'll argue in the alternative, and they'll switch justifications on a dime when it suits them.

And I think this has a lot to do with why you've got this exactly backwards. If prescriptivists were so certain that the rules they affirm are rationally correct and inherently "true", they wouldn't feel the need to defend them so vigorously from threat. Rather, it's because they're aware on some level that their justifications are fragile that they defend these rules so rabidly.

Really, what's actually going on in 95% of all language prescriptivist peeving are people defending their perceived accrual of cultural capital. It's exactly the same as someone defending the debasement of music or art from those with bad taste — almost all arguments about taste, despite most everyone understanding on some level that taste is subjective and greatly varies, involve claims about inherent, objective merit (or its lack). Almost everyone who defends their taste against a critic reaches for supposed objective truth. It's no mere coincidence that language peevers often talk about their "love of the language" and that a dance and book critic would write such a review of a technical text on a topic she has absolutely no competence in. Or that these sorts of prescriptivist rants about usage have exactly the same tone and character as rants against, say, Thomas Kinkade's paintings.

Acocella might as well have ended a piece by criticizing an English professor's defense of Stephen King with a swipe at the professor's evident hypocrisy: of course the professor doesn't actually read King, and the defense is hypocritical pandering to the masses and to the elements who want to debase literature.

These defenses of cultural capital are opportunistic, they utilize any and all available arguments. And, importantly, they're never just defenses, they're just as much attacks. And necessarily so, because the value of accumulated cultural capital, like other capital, is dependent upon relative scarcity and requires distinction. It's critical to demonstrate others' lack of cultural capital to bolster the value of one's own.

Jerry Friedman said,

All the linguists who post here do so in standard English. I'd be very interested to read an explanation by a linguist, or by another person free of the desire to peeve about language or to possess scarce cultural capital, of when writing (or speaking?) in standard English is a good thing and why, and how they know what the standard is.

On tense: Most of us learned in school that "have done" and "will do" are examples of the same verb in different tenses. John Lawler says here, "English really has only two actual tenses: present (go/goes), and past (went)." However, the CGEL says English has a present perfect tense, but no future tense. I suspect a lot of this disagreement is just about terminology.

Whether that's true or not, I don't see that teachers are in a position to teach grammar based on linguistics until there's some kind of consensus on this point and many others. We should all live so long. What teachers can do is say "school English" and "homestyle English" or some such, instead of "right" and "wrong", "good grammar" and "bad grammar", "literate" and "illiterate", etc.

On another subject, the "not just a good idea" thing started, as far as I know, with the publicity campaign for the 55 mph (or lower) speed limit in the U.S., which was in force from 1974 to 1987. "55 mph: It's not just a good idea. It's the law." The best parody was "186,000 miles per second" instead of "55 mph".

@Iain: I think White's "little men" refers to both fairies and insignificant or unintelligent people.

J.W. Brewer said,

@Jerry Friedman, the problem at the Acocella level is that we're not only or primarily talking about when, if ever, people should be encouraged to use "Standard Educated American English" or whatever you want to call that variety versus some other dialect with (for contingent historical reasons) lower social prestige which may happen to be their native dialect. Before you can even get to that, you need to deal with the problem of bogus prescriptivism / zombie rules etc. that mischaracterize the actual (descriptivist) grammar of StEdAmEng. To be egocentric just by way of illustration, StEdAmEng is my native dialect, as it was that of my parents and (modulo the odd regionalism/archaism, like one of my grandmothers using "davenport" to mean couch/sofa) grandparents. In speech I have a smallish number of peculiarities of pronunciation that might be nonstandard and probably mostly reflect a Middle Atlantic / Delaware Valley substrate (not to mention the odd Anglicism/hypercorrection that I picked up from a high school teacher and never put down again), but in syntax and lexicon, either oral or written, I have as far as I can tell no such variations, and have as high a degree of native-speaker command as a linguistic anthropologist doing fieldwork on StEdAmEng could possibly want. I never use "might could" or "needs washed", and actually have trouble using "ain't" other than self-consciously/affectedly/ironically. But I split infinitives, ignore bogus that/which distinctions, employ the passive voice, etc., etc., all the time. That is because, imho, the former set of issues would or might deviate from the grammar of StEdAmEng, but the latter do not.

That's right. Linguists quarrel endlessly about terminology; Geoff and I both agree on everything about how the Perfect construction works, but I prefer to call it a construction, while he prefers to call it a tense, and he has his reasons. OK, we're both licensed, and we both understand what the other means. Close enough.

It might be fun to argue with him about it someday, over a few beers, but we both have better things to do, and besides, now that I'm retired, nobody's paying me to argue about inconsequential stuff like that.

"There is no realistic hope of linguistics ever becoming part of the American primary or secondary curriculum – and even if it did, it would be ignored, like everything else. Because students know that school is boring. And they’re right."

For one thing, too many generations of teachers teaching teachers to teach teachers shibboleths have gone by; there's nobody who could teach it.

Deniz Rudin said,

That's a terrible argument. It's true that school is boring, but it's also evident that it works. I was a thoroughly bored student, and yet I came out of high school with some fundamental working knowledge of science and mathematics, and of the inner workings of the United States government, and of various methods of literary interpretation, and of anatomy and obstacles to health, to name a tiny fraction of the vast body of data that was drilled into me whether I liked it or not. And though I did not always enjoy being in the classroom as a child and especially as an adolescent, I'm very grateful as an adult for the body of information that my government mandated that I be taught. There's a tremendous submerged ocean of knowledge that we all take so thoroughly for granted that it seems like it's always been with us, but without our schooling it wouldn't be. No matter how uninterested they may be, reasonably bright students will retain at least some portion of the information they are taught. Ironically, as far as this conversation is concerned, you need look no further than the proliferation of mistaken attitudes about language to see the insidious staying power of what is ignored in grade school.

As for the teachers teaching teachers echo chamber, that idea is also evidently incorrect—just look at the rate at which new scientific developments are incorporated into primary school curricula. To choose only one potent example, the plate tectonic theory of continental drift, and its ramifications in the explanation of earthquakes, mountains, and volcanic activity, has only been widely accepted in the geological community for about fifty years, and yet it is taught to every schoolchild in the country. I can hardly think of a more widespread and basic piece of scientific information, and it has wormed its way through the minds of the world's youth in a number of decades that you can count on one hand. The Chomsky-Halle "Sound Pattern of English" model of rule-based phonology is hardly more recent, and certainly no more difficult to teach to children.

Ruben Polo-Sherk said,

For me, too, Standard American English is my native dialect by definition, which sounds a little silly, but there's no other way to define it without encountering inconsistencies.

"Standard American English" or whatever is basically a hypothetical concept into which my native dialect happens to fall. If you look in detail at two speakers' understanding of particular elements of the language, there will be some differences, but as far as the boundaries of "Standard American English" are concerned, they are the same.

Jerry Friedman said,

@J. W. Brewer: I think the order is the other way around—you can very well start with when and why a standard variety is appropriate, then how you know what it is, then examples. (Though in a real essay, you might whet the reader's appetite by starting with examples.) After all, it's hard to explain why split infinitives are allowed in StEdAmEng and "ain't" isn't unless the reader understands the criteria for standardness.

Anyway, I wasn't proposing an order in that comment, and I wasn't requesting the explanation for Acocella and those like her. I'd be interested in it myself and to pass on to native speakers of non-standard dialects, such as some of my students.

@John Lawler: I have no disagreement with anything you say in that comment—and I'm grateful for your posts in a.u.e., such as the one I quoted, which I've learned a lot from. But it can be hard for us non-linguists to tell what disagreements are substantive and what are terminological.

Rod Johnson said,

I shudder to think about teaching SPE-based phonology to children. Especially since almost no one actively pursues that kind of heavily abstract approach to phonology anymore.

The story of Paul Roberts' attempt to reform English grammar teaching should be instructive. Since it was based on a highly theory-bound version of syntax (roughly Aspects-style transformational grammar), and that theory was rapidly evolving, it quickly found itself vulnerable to charges of being empirically wrong and theoretically baseless, and when it fell out of favor it took a fair amount of linguistics's credibility in education with it.

For a truly linguistics-informed curriculum to succeed, it will have to be built on a core consensus that's fairly theory-neutral and fairly "phenomenological" (i.e., focused on observable features of language and not highly abstract theoretical constructs). You can't teach people about, say, parasitic gaps (to take a fairly extreme example), because they won't even be able to recognize them as a phenomenon without buying into the whole apparatus of binding theory and its descendants.

So the question is, is there such a core? Perhaps in phonology, though even the concept of the phoneme is pretty contested, and SPE-type questions of rule order and markedness would be practically unteachable. But in syntax? Once you get beyond word categories, grammatical relations, and maybe phrase structure, it's hard to say. There are certainly a lot of notions (Fillmorean case, mappings between that and grammatical relations, some simple notion of movement, headedness…) that might be pedagogically useful, but many theoretical linguists think they're flat wrong. And what happens if you create a generation of schoolchildren who are invested in a particular theoretical approach, and theory moves on, as it always does? Will there be a bunch of dispirited linguists posting on Language Log in 2040 about how peevers are always griping about how nobody does extraposition right anymore?

Rod Johnson said,

Shorter version: whereas I'm confident that geology moves "forward," I'm less so that linguistics does, at least at the descriptive level. Do we really know more about English than Jespersen did? What do we know that deserves K-12 curricular status? (That's not snark, I would genuinely like to see a list of ideas.)

@Rod: I agree about Roberts. I actually had to use it in class once. *Shudder*

As for the list of ideas for K-12 curriculum, one of my undergraduate students wrote a term paper on the topic. She was more sanguine than I am, however; I'm afraid it is no longer evident that the US educational system is working.

hector said,

"95% of all language prescriptivist peeving are people defending their perceived accrual of cultural capital"

For centuries, Latin was the language of academia. When it started being replaced by the vernaculars, appeals-to-Latin prescriptivism arose. Basically, what had been a closed shop (you had to know Latin to gain entry) became an open shop, and some of the initiated felt the need to defend the value of their acquired property, which was losing value.

Ruben Polo-Sherk said,

@Jerry Friedman: the problem is that, though the standard dialect is in general a knowable thing, there are many gray areas where the boundary is not clear. For example, to me, never splitting infinitives is definitely non-standard: if someone doesn't split an infinitive when they should, what they say cannot possibly be what they mean. But many people who use what must be StEdAmEng won't split them.

Another example: sometimes the use of 'should' with the subjunctive seems to me to be much more British than American; e.g., 'I suggested to him that he should go.' Someone from the U.S. would tend to say 'I suggested to him that he go' (if they wanted to use the subjunctive; if I were to interpret the 'should go' Americanly, it would be as an indirect statement of 'You should go', with 'go' as an infinitive). But the difference is not such that I would say that 'that he should go' as a subjunctive is something that exists outside StEdAmEng.

Ruben Polo-Sherk said,

@Rod Johnson: I don't think that in linguistics, things are objectively knowable, as they are in geology. Which is ironic, because as something that is created in each person's mind, language is totally knowable (no one knows more about my English than I do, and no one ever will), which is not something that is true in science.

The kind of descriptive linguistics that would have relevance to creating a K-12 curriculum involves, well, describing these things that people know about English, and as such, the result varies according to individuals' perspectives. So in order to avoid the influence of the differences in these perspectives, you can't go into detail, which makes it pretty useless.

The 'answer' as far as this is concerned might be a semantically-based description. On the other hand, it would probably be very hard to teach, since in principle you have one rule for every semantic element (some do overlap, of course, but it's much less organized than a syntax-based approach).

Nelson said,

Whether or not it's practical to implement, I've always felt that the IPA should be taught in high school. While you can make theoretical quibbles, the basic concepts of classifying sounds are still useful, and the system has done a pretty good job of standing the test of time already. It's not too abstract, and should be easier for most students to learn than (say) trigonometry. And really, some elementary knowledge of how the sounds most of us use all the time are put together should be part of the basic stock of knowledge of educated people, like plate tectonics, the composition of the solar system, or the cellular composition of complex organisms.

Nelson said,

@John Lawler: the earlier the better, probably. High school just came to mind since I thought it might be a little easier to work in new ideas there, rather than fight against the weight of tradition in 1st-grade spelling classes. Not that the task would be easy at any level, though I'm not really convinced of its outright impossibility, if a reasonably vigorous and intelligent campaign tried to get it done. Or is there a lesson from history here, of someone trying to incorporate basic phonology/IPA into the educational system and failing (for reasons other than the ineptness of the reformers)?

Anyway, I was mostly thinking about _what_ to teach, rather than when or how to get it taught. Focusing on something relatively concrete like the IPA (which is, after all, a spelling system, aside from anything conceptual about language it conveys) might be more palatable than more abstract and theoretically controversial things like generative syntax or semantic analysis (or the more theoretical areas of phonology, for that matter).
