It’s National Grammar Day, so as usual, I’m taking the opportunity to look back on some of the grammar myths that have been debunked here over the last year. But before I get to that, let’s talk briefly about language change.

Language changes. There’s no question about that — just look at anything Chaucer wrote and it’s clear we’re no longer speaking his language. These changes aren’t limited to the periphery; they reach the very core of the language. Case markings that were once crucial have been lost, leaving us with subject/object distinctions only for pronouns (and even then, not all of them). Negation, tense marking, verbal moods: all of these have changed, and they continue to do so now.

Some people take the stance that language change is in and of itself bad, that it represents a decline in the language. That’s just silly; surely Modern English is no worse than Old English in any general sense.

Others take a very similar, though much more reasonable, stance: that language change is bad because consistency is good. We want people to be able to understand us in the future. (I’m thinking here of the introductory Shakespeare editions I read in high school, where outdated words and phrases were translated in footnotes.)

So yes, consistency is good — but isn’t language change good, too? We weed out words that we no longer need (like trierarch, the commander of a trireme). We introduce new words that are necessary in the modern world (like byte or algorithm). We adapt words to new uses (like driving a car from driving animals). This doesn’t mean that Modern English is inherently better than Old English, but I think it’s hard to argue Modern English isn’t the better choice for the modern world.

Many writers on language assume that the users of a language are brutes who are always trying to screw up the language, but the truth is we’re not. Language users are trying to make the best language they can, according to their needs and usage. When language change happens, there’s a reason behind it, even if it’s only something seemingly silly like enlivening the language with new slang. So the big question is: is the motivation for consistency more or less valid than the motivation for the change?

I think we should err on the side of the change. Long-term consistency is nice, but it’s not of primary importance. Outside of fiction and historical accounts, we generally don’t need to be able to extract the subtle nuances from old writing. Hard though it may be to admit it, there is very little that the future is going to need to learn from us directly; we’re not losing too much if they find it a little harder to understand us.

Language change, though, can move us to a superior language. We see shortcomings in our native languages every time we think “I wish there was a way to say…” A language is probably improved by making it easier to say the things that people have to or want to say. And if a language change takes off, presumably it takes off because people find it to be beneficial. When a language change appears, there’s presumably a reason for it; when it’s widely adopted, there’s presumably a compelling reason for it.

The benefits of consistency are fairly clear, but the exact benefit or motivation for a change is more obscure. That’s why I tend to give language change the benefit of the doubt.

Enough of my philosophizing. Here’s the yearly clearinghouse of 10 busted grammar myths. (The statements below are the reality, not the myth.)

There is nothing wrong with I’m good. Since I was knee-high to a bug’s eye, I’ve had people tell me that one must never say “I’m good” when asked how one is doing. Well, here’s an argument why that’s nothing but hokum.

Amount of is just fine with count nouns. Amount of with a count noun (e.g., amount of people) is at worst a bit informal. The combination is useful for suggesting that the pluralized count noun is best thought of as a mass or aggregation.

Verbal can mean oral. In common usage, people tend to use verbal to describe spoken language, which sticklers insist is more properly described as oral. But outside of certain limited contexts where light ambiguity is intolerable, verbal is just fine.

Whom is moribund and that’s okay. (from Mike Pope) On rare occasions, I run across someone trying very hard to keep whom in the language, usually by berating people who haven’t used it. But the truth is that it’s going to leave the language, and there’s no reason to worry. Mike Pope explains why.

Uh, um, and other disfluencies aren’t all bad. (from Michael Erard, at Slate) One of the most interesting psycholinguistic papers I read early in grad school was one on the idea that disfluencies were informative to the listener, by warning them of a complicated or unexpected continuation. Michael Erard discusses some recent research in this vein that suggests we ought not to purge the ums from our speech.

Descriptivism and prescriptivism aren’t directly opposed. (from Arrant Pedantry) At times, people suggest that educated linguists are hypocritical for holding a descriptivist stance on language while simultaneously knowing that some ways of saying things are better (e.g., clearer, more attractive) than others. Jonathon Owen shines some light on this by representing the two forces as orthogonal continua — much more light than I’ve shone on it with this summary.

Some redundant stuff isn’t really redundant. (from Arnold Zwicky, at Language Log) I’m cheating, because this is actually a post from more than five years ago, but I found it within the last year. (This is an eleventh myth anyway, so I’m bending rules left and right.) Looking at pilotless drones, Arnold Zwicky explains how an appositive reading of adjectives explains away some seeming redundancies. If pilotless drones comes from the non-restrictive relative clause “drones, which are pilotless”, then there’s no redundancy. A bit technical, but well worth it.

Want to see somewhere between 10 and 30 more debunked myths? Check out some or all of the last three years of NGD posts: 2011, 2010, and 2009.

About The Blog

A lot of people make claims about what "good English" is. Much of what they say is flim-flam, and this blog aims to set the record straight. Its goal is to explain the motivations behind the real grammar of English and to debunk ill-founded claims about what is grammatical and what isn't. Somehow, this was enough to garner a favorable mention in the Wall Street Journal.

About Me

I'm Gabe Doyle, currently a postdoctoral scholar in the Language and Cognition Lab at Stanford University. Before that, I got a doctorate in linguistics from UC San Diego and a bachelor's in math from Princeton.

In my research, I look at how humans manage one of their greatest learning achievements: the acquisition of language. I build computational models of how people can learn language with cognitively general processes and as few presuppositions as possible. Currently, I'm working on models for acquiring phonology and other constraint-based aspects of cognition.

I also examine how we can use large electronic resources, such as Twitter, to learn about how we speak to each other. Some of my recent work uses Twitter to map dialect regions in the United States.
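Purely as an illustration of what "using Twitter to map dialect regions" can mean at its very simplest, here is a hypothetical Python sketch that tallies one well-studied lexical variable (soda/pop/coke) by state from toy geotagged posts. The sample data, the variant list, and every name in it are invented for the example; this is not the actual research pipeline described above.

```python
# Hypothetical sketch: tallying regional lexical variants from geotagged posts.
# All data and names below are made up for illustration only.
from collections import Counter, defaultdict

# Toy "tweets": (state, text) pairs standing in for geotagged posts.
SAMPLE_POSTS = [
    ("PA", "grabbing a soda before the game"),
    ("MN", "anyone want a pop from the gas station?"),
    ("GA", "stopping for a coke, anybody need one?"),
    ("MN", "pop and pizza tonight"),
    ("PA", "soda machine is broken again"),
]

# One well-known lexical variable: the generic word for a soft drink.
VARIANTS = {"soda", "pop", "coke"}

def variant_counts(posts):
    """Count how often each variant appears, per state."""
    counts = defaultdict(Counter)
    for state, text in posts:
        for token in text.lower().split():
            word = token.strip(".,!?")
            if word in VARIANTS:
                counts[state][word] += 1
    return counts

if __name__ == "__main__":
    for state, counter in sorted(variant_counts(SAMPLE_POSTS).items()):
        top, _ = counter.most_common(1)[0]
        print(f"{state}: dominant variant = {top!r}, counts = {dict(counter)}")
```

A real study would of course work from millions of posts and far more careful geolocation and normalization; the point here is only that regional preferences for competing words can be read off aggregated counts.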

59 comments

“Long-term consistency is nice, but it’s not of primary importance. Outside of fiction and historical accounts, we generally don’t need to be able to extract the subtle nuances from old writing.”

Thanks for addressing this point. So many people simply assert that it would be better if language simply stopped changing (or at least changed much more slowly). It’s an attractive idea, but it’s not very realistic, and not just because slowing down or stopping change is nigh impossible. As you say, very few of us need to be able to read stuff written hundreds of years ago, and chances are good that very little of what’s written today will be read hundreds of years from now. This sort of stability across centuries may sound appealing, but it’s just not very important.

“Case markings that were once crucial have been lost, leaving us with subject/object distinctions only for pronouns (and even then, not all of them).”

Do English pronouns really distinguish between subject and object? Sure, most of them have two different forms, but do they really mark subject and object? Consider the sentences “John and me went to town” and “We have got a long way to go to ensure that the Australian community knows Kevin and I, trusts Kevin and I and wants us to be the prime minister and deputy prime minister of this country.”

They’re both perfectly grammatical, and yet they completely mix up the so-called subject and object pronouns. Linguists usually call those exceptions and make up excuses for them, but, to quote Mark Twain, there are more exceptions to the rule than instances of it. When to use the so-called subject and object pronouns is nowadays dependent on completely different factors.

Thomas Voß: In this instance, I’d say it’s an exaggeration to say that there are more exceptions to than instances of this rule. Although the presence of “and” can cause pronoun use to get a bit erratic, few if any competent native speakers will use subject pronouns for objects or vice versa when the pronoun is the entire subject/object. (In other words, while many native speakers would say “John and me went to town” or “The Australian community knows Kevin and I”, almost nobody would say “Me went to town” or “The Australian community knows I”.) Furthermore, I would say that speakers who perform this sort of pronoun reassignment are in the minority.

For my own part, I will sometimes use object pronouns as subjects — although I’m more likely to say “Me and John went to town” than “John and me went to town” — but I never use subject pronouns as objects. I’m curious to know if anyone does the opposite.

On the subject of pilotless drones, there are two types of drones: ones with complete on-board computerized control and remotely piloted ones. I haven’t done an exhaustive survey of any corpus, but it would make sense to call the former pilotless drones and the latter piloted drones.

As to the subject of the ‘justly’ moribund whom, I have read Mike Pope’s article and disagree with him about it. One of his justifications for the needlessness of whom—and figuring it out—is his claim that “[native speakers] just don’t have this sort of trouble with other case-marked pronouns”. This is quite untrue: too many times we hear, “Would you like to come to dinner with Mike and I?” I, too, should like to think that other case-marked pronouns be un-troublesome, but they are not, sadly. Me and my girlfriend are in agreement with this—the first, a wronged entity, both grammatically and morally! (“Mos-mori” morally, that is.)
And so, let us not give up on who and whom quite yet, even in demotic English.

In general, English speakers are extremely consistent about case marking of pronouns. In simple, declarative clauses mistakes are almost non-existent. The exceptions are case marking in conjoined noun phrases (x and me, me and x, x and I, I and x), where some sense of formality and politeness seems to complicate the issue, and the who/whom issue. It’s not particularly difficult to learn to use whom, but generally it’s dispensable. You can use a zero relative marker in relative clauses, and who usually sounds just fine in question formation.

Misunderstood, perhaps not; insufferable to educated ears, yes. What’s more, the sense that the speaker is referring to two people in the accusative, in my mind, changes the spirit of the question considerably, though granted, not the meaning.

“On the subject of pilotless drones, there are two types of drones: ones with complete on-board computerized control and remotely piloted ones. I haven’t done an exhaustive survey of any corpus but it would make sense to call the former pilotless drones and the latter piloted drones.”

The blanket use of “drone” is simply all-purpose bumpkin journalese. The military makes a clear distinction between drones (dated technology: their flight path was preprogrammed before launch) and remotely-piloted vehicles – which is what these “drones” we hear about actually are.

There are different ways of thinking about these things and different purposes that language users, writers, and editors might have, Cody Franchetti. I’m not against using whom in edited English when it is deemed appropriate for formality or clarity.
Prescriptive rules are essentially about editorial policy. Descriptive rules are empirical. The criterion you mention – “insufferable to educated ears” – is not empirical.
Since you mention ears, you may be referring to spoken English. From what I hear, whom is seldom used in informal spoken English. In that context, I do think we can get along without it since, as you say, the meaning is still usually clear. If the listener looks puzzled, we can rephrase our utterance. (Would you like me to introduce anybody to the others?)
Think of it this way. Common nouns and proper nouns are not case marked at all, yet nobody is confused about whether the dog bit the man or vice versa.
When you write something informal or colloquial, quite likely you’ll use who for accusative interrogative pronouns. When you write a formal piece for a magazine or a literary journal, perhaps you’ll use whom. Judgements about usage issues always depend on context – formal/informal, written/spoken, edited/unedited.

Context is indeed of paramount importance. And yet I favor basing judgments about usage on conditions other than the ones you have specified: formal, morphological—and, to mention the most important of all, aesthetic. The latter relates to your statement about the ‘empirical’ criteria of descriptive grammar; to me, usage must be firstly alert to aesthetic considerations, because, in the end, I seek a humanistic rather than a scientific approach. And so, in addition to logical intuition, which governs grammatical considerations and its formal constructions, I think “aesthetic intuition” is just as important in matters of language. It is for that reason that I should never want to substitute “Whom would you like me to introduce to whom?” with “Would you like me to introduce anybody to others?”—unless, of course, I knew that my interlocutor might be confused by two whoms—because the spirit of that question phrased in that way is completely lost in your rendition.
A magisterial case of this approach and conception of language is Henry Fowler’s. He had a prescriptive stance, and yet he was never dogmatic. Otto Jespersen, on the other hand, was eminently descriptive but ended up being infinitely more dogmatic and intransigent.
That seems counter-intuitive until one realizes that Fowler possessed a sense of language—born out of aesthetic intuition—which makes his century-old books on grammar still relevant today; Jespersen, on the other hand, made fundamental contributions to linguistics, but has been largely superseded, because, alas, he was logically inductive—scientific in fact—with that endlessly pliant, plastic thing we call language.

Fowler remains relevant because he took evidence into account. He observed what good writers did and tried not to impose his own whims. He declared that “don’t end a sentence with a preposition” and “don’t split the infinitive” were not rules but superstitions.
If you look at what good writers are doing with who/whom, you’ll find a lot of variation. Variation in language is as normal, natural, and inevitable as variation in biology.
Again, my main point is that prescriptive standards apply in particular (formal, edited, written) contexts. These are conventions, not rules. There is no privileged position from which to judge, aesthetic or otherwise. Language is much bigger than that. It’s a complicated natural system. Taking a few grammatical issues and asserting what’s right and wrong is like capturing a few animals, putting them in a zoo, and declaring that you have defined what animals are like.

A good attempt at reductio ad absurdum, and guilefully distorting my argument. But I am not advocating to possess or even point to a privileged, judging position: I am merely stating what I perceive to be more wide-ranging parameters to judge language. And yes, I fancy taking some particularly beautiful specimens of linguistic flourishes—a Ciceronian period, par exemple—and observing them in their “own form beautiful”, and, perhaps, drawing from them some knowledge of how certain conventions fomented such eloquence and splendor. (That is valid for a sentence littered with spirited solecisms: I am an equal opportunity onlooker: but I prefer wearing aesthetic lenses in doing so.)

As to the “split infinitive”, your bearings on Fowler’s view of it are not precise: he only spoke of “superstitions” in his entry of MEU* concerning “Prepositions at End”—and very rightly so. But in “Split Infinitive”, he wrote a very long article distinguishing five categories of authors with regard to the split infinitive, and put himself in the last one (“those who know and discern”), saying that it is better to avoid it, if possible, because those who consistently employ it show they don’t have an “ear for the English language”. He thus displays exactly what I referred to earlier as “aesthetic intuition” for language, which I’ve been batting for all along.

Are we still allowed to make value judgments on language based on whether anyone can understand what one says?

Granted, there has never been a “standard” form of any language, except what was taught artificially to impose a “nation state” (or its appearance), but there is still a question of how many people will understand you when you use something a certain way.

kitchenmudge, I don’t think anyone is saying that we shouldn’t use language that our audience will understand. But just because we don’t understand something doesn’t mean we should put a value judgment on it. If someone’s English dialect is so different from mine that I don’t understand it, that doesn’t make their dialect worse than mine.

Corey, your writing style comes across as artificially formal and fairly pretentious. I will not be taking any usage advice from you, lest I sound like an overeducated nincompoop.

I really have an issue with “Standard English” and its litany of arbitrary rules that serve no purpose but to play social gatekeeper to people who don’t have the time or means to memorize a bajillion arbitrary rules. I say if communication isn’t impaired, it should be impolite to criticize.

You wish there WERE a way to say ….
There isn’t even any reason to change that. It doesn’t make it more clear, it makes it less clear. It’s not any faster. It doesn’t convey more information. And it sounds horrible.

Abbie, I do not know whether your criticism refers to me, but, as I see no other Coreys and my name somewhat resembles it, I will take your criticism upon myself and advise you about three things. Be less socially disquieted—means or time have nothing to do with a person’s proper use of grammar, and your social apprehension confuses your thought; secondly, learn to use the verb of the protasis in the subjunctive; thirdly, learn the difference between should and would: by mastering the last two, your spite shall be much sharper.

As I wrote my last entry for Abbie at 4:45 am, I should like to rewrite my entry without the patches my weakened mind caused.

Abbie, I do not know whether your criticism was directed toward me, but, as I see no other Coreys above and my name somewhat resembles it, I will take your criticism upon myself and advise you about three things. Firstly, be less socially disquieted; means or time have nothing to do with a person’s proper use of grammar; good usage is not a social gatekeeper—it may be an intellectual marker, but you don’t seem to mind the latter—and your social apprehension confuses your thought. Secondly, learn to use the verb of the protasis in the subjunctive, or, if you don’t know how to do so, keep the verb of the apodosis in the indicative. Thirdly, learn the difference between should and would: by mastering the last two, your spite shall acquire much vigor.

[…] have not have noticed that March 4 was National Grammar Day. That’s okay. But Gabe Doyle over at Motivated Grammar and others have written some interesting thoughts on the subject, and we’ve been wondering about […]

Yes. English grammar, where matrix (main) and subordinate are the common terms. Somebody should write a blog called (Un?)Motivated Greek and Latin grammar. It sounds like fun.
However, we’ve drifted very far from the original points under discussion, so let’s end this. Prescriptivism has its place. If you are the editor of a literary journal and you want to require whom in dative and accusative contexts, fine. If you are engaged in describing contemporary English usage, things are a lot more complicated, and you’ll have to appeal to the evidence.
I think that’s the general spirit of things on Motivated Grammar.
All in fun, right?

Somewhat: it was you who made a snide comment—not I. Nevertheless, I enjoyed our discussion, though, you, Eugene, have not answered my ferreting your rawboned, rhetorical attempts to reduce my arguments in your post of March 9th against my position on what I think ought to be linguistic yardsticks.

(Strange, though, I read protasis and apodosis all the time in textbooks of contemporary Functional Grammar written in English.)

So if they possess a goodly amount, they should not be “strange” to “protasis” and “apodosis”, which are far from being “subordinate” and “matrix” (or “main”), since, as my friend Eugene fails to note, the Greek-derived terms are specifically applied to the conditional—in turn to the conditional clause of the conditional sentence and the consequent clause of the same.

It may very well be useful to have special, specific terms for the matrix and subordinate clauses in conditional constructions. I’d say you’ll get more mileage out of the general terms. They apply to all complex sentences.
Either way, the discussion is completely off topic. This has nothing to do with any of the grammar myths discussed in the blog post.
Everybody else dropped out of the discussion two days ago, and we should take the hint.

I do not cater to everyone else, especially when it comes to linguistic whims—shall we call it that?—which is what we all have, which is what we like to discuss, to put forth, to champion—to encourage others to take on.
By all means, though, let’s bury the who/whom hatchet and unearth a new one: shall we speak of the lamentable misuse of words such as “mutual”, “aggravate”, “individual”, and, worst of all, the ‘black whole’ of the compound preposition “as to”, which nobody seems to know how to command and generally uses as a flourish—often as a complete pleonasm—with the pitiful intent of sounding formal or well-educated?

Words really are misused sometimes. One issue that people cite is imply vs. infer. I haven’t observed it myself, but the two words mean different things, and anybody with a dictionary could choose the right one. Still, if a speaker or writer makes a mistake in informal contexts, I don’t get bent out of shape about it. In formal/edited situations, it’s a different story. I’d expect an editor to catch the mistake.
Someone could start their (his/her) own blog on the topic of misused words and expressions. I’d love to see the entry on the “black whole” (just kidding – typo, right?) of compound prepositions.

During my seminars, students who speak always use infer and imply correctly: I have never heard them misused for a while now, so I am relieved; however, like you, it used to drive up the mirror before, because I never heard it used correctly before.

Since you mentioned it, and with the hope that someone else reads our exchanges, I should like to set straight the ‘as to’ question, since I hear it far too often improperly used, and it has become a fixation of mine to eradicate spurious ‘as to’s.
‘As to’ has one—and quite useful—quite specific specific use. It is used for emphasis by bringing into prominence something that would normally occur toward the end of the sentence; it does not link verbs to prepositional phrases.*
And so, “As to his political convictions, Michael never spoke of them.” The sentence could be easily recast as, “Michael never spoke of his political convictions.” Thus ‘as to’ is a rhetorical tool that is used to show that the speaker’s—or the conversation’s—concerns are about the political convictions of Michael rather than, say, the latter.
Here are some examples of popular ‘as to’s: “He asked as to what we were doing out there in the cold.” Here “as to” should be stricken entirely: it is employed to give an air of sham dignity. The same goes for “I wonder as to whether he is late to our meeting”, where it should likewise be omitted. Other times, we find an ‘as to’ in place of a simple preposition: “Mary was skeptical as to John’s ability to operate the machine.” Read simply to. “I think everything as to the creative end of the company should be left to Henry.” Replace “as to” with “concerning”. “I’m not so sure as to John’s merits.” The “as to” should read “about”.
This brings me to another point, which also plagues me: ‘about’ seems to have become a dreaded preposition, as if it were lax or even ‘uneducated’: “I’m thinking about Jane” is absolutely correct and needs no decoration.

*The construction “so/as to” is an exception: ‘as to’ appears in the middle of a sentence because of the ‘so’. “He pulled the lever so hard as to render it inoperable.” But the sentence could also be changed to, “He pulled the lever so hard that he rendered it inoperable/broke it.” But the latter has a different flavor from the first sentence: in the first example the speaker is underscoring the brutality of the action.

As to the legitimate, useful, and beautiful tool that ‘as to’ may be, I hope to have shown that.

Disregard the two repeating “specific” and the “drive [me] up the mirror”: they are fruit of my mindless practice of not proof reading—of my impetuosity.
And you were quite right, Eugene, in pointing out the “black whole” typo: as to my lack of proof reading, you can usually spot in my writing a number of them due to it!

The typos might prove to be interesting, either along the lines of Fromkin’s work on mistakes and what they show about language structure, or just as novel constructions. I’ve been thinking about what “black whole” might mean, but I’m having trouble forming an image. On the other hand, “drive me up the mirror” is easy to interpret – irritated, yet introspective. His criticism of my work really drove me up the mirror. As much as I resent the critique and prefer to stick to my initial thoughts, I might have to revise them. I think the expression could catch on.

Good question.
The term, compound preposition, perhaps works best for something like “on top of” or “apart from” or “in case of.” The item in question, “as for,” is on the list. If it takes a noun phrase complement (as for your new hairstyle…) then it makes sense to treat it as a preposition.
The word “as” has several functions, one of which is adverb. It can also be a conjunction, pronoun, etc.: http://dictionary.reference.com/browse/as

I am not speaking of the word “as” indifferently; I’m speaking about “as to” and its fairly specific use.

Your psychological approach to idiomatic phrases has long been proven fecund by linguists and anthropologists, but your interpretation may be just as easily reversed: “drive me up the mirror” may not necessarily be “irritated, yet introspective”; it could be “formidably irritated and extroverted”, for example, since I cannot think of a more slippery and arduous common surface than a mirror, and it would take a lot of exasperation and a positively uninhibited person to climb it.

I like it – not introverted, but extroverted. Still, you’re looking in the mirror. In such a case there’s usually nobody else around. If you’ve ever looked at other people in the mirror, I’d like to hear how they reacted. I hope you got out of there without any trouble.
As for “climbing up the mirror,” (or climbing up the wall) that’s quite a different thing than being driven up the mirror or the wall. When somebody drives me up the wall, I’m exasperated. When somebody drives me up the mirror, I might look at myself from a higher perspective.
Somehow, I’m thinking about Spiderman.

Thomas/Daniel/Eugene: Personally, I view most of the case assignment confusions as arising from conceptual noise in production, which is, I suppose, very similar to the idea Thomas mentioned, that there is a subject/object distinction, and then that there are tons of exceptions. The advantage of the noise framework is that it unifies the exceptions, which tend to appear in situations where it’s difficult to assess the case assignments.

kitchenmudge/goofy: Agreed. I think one can make value judgments related to the comprehensibility of what someone says or writes. But this set of value judgments is somewhat different from the value judgments on what is or is not proper or standard or valid English.

Rilian: I disagree. In my dialect, I can use either “was” or “were” there, and they mean the same thing. I admit that “were” sounds a bit better to me, but it also sounds stuffier, and thus I alternate between the two options depending on the formality that I want the sentence to have. I mean this honestly: how does “was” make the sentence less clear in your dialect?

Cody: I’ve previously exploded the “aggravate” myth. I don’t know what myths are attached to “mutual” and “individual”, though. In fact, I have to confess that I don’t know much of what you’re talking about.

Then it is not my place to make further comments: bad blood is only good if it spurs meaningful conversation, which, you intimate quite clearly, is not the case. And so, I defer to you; it is your blog after all. But let me make one last observation. With your mathematics and computational linguistic competency I beseech you to never lose sight that language is a humanistic pursuit that does not follow the natural sciences’ method or any covering-law model: it is based on hermeneutics—understanding—which seems to me the right frame of mind to approach those amethystine hues that language possesses and from which rational abstractions shall keep a blind eye.
Good day and many compliments on your site.
(If you’re interested in the “mutual” and “individual” myths, I refer you to pp. 62-67 of Fowler’s ‘The King’s English’.)

Sometimes saying “was” instead of “were” in the subjunctive causes, like, temporary confusion. It depends on whether the rest of the sentence is ambiguous. But usually I can figure out what the person meant. But it’s annoying because it’s a distraction from the content, and if they’re still talking, then I miss what they say next.

[…] post from Motivated Grammar, which is a bit wordy at the beginning, so just skip down to the debunked grammar myths, which are fab. (No one said good grammarians didn’t need good editors; for that try Sentence […]

[…] In honor of National Grammar Day, which was earlier this month, the author of the grammar blog Motivated Grammar explains 10 words/phrases that people often wonder about, including me, like these: each other / one another, I’m good, backward(s), toward(s), verbal/oral. Language changes. … Take a look. […]