It’s National Grammar Day 2013, which has really snuck up on me. If you’ve been here in previous years, you know that I like to do three things on March 4th: have a rambling speculative discussion about the nature of grammar and/or linguistics, link to some people’s posts I’ve liked, and link to some of my posts. Unfortunately, I’ve been so busy with dissertation work lately that I’m a bit worn out on discussion and haven’t been adequately keeping up with everyone’s blogs. So I hope you’ll forgive my breach of etiquette in making this year’s NGD post all Motivated Grammar posts.

Well, not entirely. Everyone in our little community gets in on National Grammar Day, so let me mention a few good posts I’ve seen so far. Kory Stamper discusses her mixed feelings on the day, as well as on correcting people’s language in general. Dennis Baron looks at the abandoned, paranoid, wartime predecessor of NGD, “Better American Speech Week”. And from last year, but only improved by the passage of time, Jonathon Owen and goofy had posts asking what counts as evidence for grammatical correctness or incorrectness, and why we’re so often content to repeat grammar myths.

Below you’ll find this year’s collection of debunked myths. As usual, the statements below are the reality, not the myth, and you can click through for the original post and the rest of the story.

Gender-neutral language isn’t new. Some people get up in arms about gender-neutral language (e.g., firefighter for fireman), claiming that everyone was fine with gendered language up until the touchy-feely ’60s or ’70s. But that’s not the case, and this post discusses gender-neutral language well before our time, over 200 years ago.

Off of is perhaps informal, but not wrong. There is nothing linguistically or grammatically incorrect about off of. It’s nonstandard in some dialects and informal in most, so you should probably avoid it if you’re concerned about your writing seeming formal. But when formality isn’t a concern, use it as you see fit.

Since for because is fine. In fact, almost no usage guides complain about this, though it’s a persistent myth among self-appointed language guardians. A surprising number of style guides (such as that of the APA) are against it, but historically and contemporaneously, English has been and remains fine with it.

Formal language isn’t the ideal; informal language isn’t defective. Informal language has its own set of rules, separate from formal language. It’s the “normal” form of the language, the one we’re all familiar with and use most. At different times, formal or informal language is more appropriate, so we shouldn’t think of formal language as the best form.

Someone can know more than me. Than is fine as a conjunction or a preposition, which means that than me/him/her/us is acceptable, as it has been for hundreds of years. The belief that it isn’t is just the result of trying to import Latin rules to a distinctly non-Latinate language.

Comma splices aren’t inherently wrong. Comma splices, where two (usually short) sentences are joined by nothing more than a comma, became less prominent as English’s punctuation rules codified. But historically speaking, they’ve been fine, and to the present day they’re most accurately viewed as informal, but hardly incorrect. That said, one has to be careful with them so that they don’t just sound like run-ons.

Psst. Hey, down here. You want more debunked myths? We’ve got four more years of ’em for ya: 2012, 2011, 2010, and 2009. That’s 40 more myths for your pleasure, covering singular “they”, “anyway(s)”, “hopefully”, and more.

It’s National Grammar Day, so as usual, I’m taking the opportunity to look back on some of the grammar myths that have been debunked here over the last year. But before I get to that, let’s talk briefly about language change.

Language changes. There’s no question about that — just look at anything Chaucer wrote and it’s clear we’re no longer speaking his language. These changes aren’t limited to the periphery; they reach the very core of the language. Case markings that were once crucial have been lost, leaving us with subject/object distinctions only for pronouns (and even then, not all of them). Negation, tense marking, verbal moods: all of these have changed, and they continue to do so now.

Some people take the stance that language change is in and of itself bad, that it represents a decline in the language. That’s just silly; surely Modern English is no worse than Old English in any general sense.

Others take a very similar, though much more reasonable, stance: that language change is bad because consistency is good. We want people to be able to understand us in the future. (I’m thinking here of the introductory Shakespeare editions I read in high school, where outdated words and phrases were translated in footnotes.)

So yes, consistency is good — but isn’t language change good, too? We weed out words that we no longer need (like trierarch, the commander of a trireme). We introduce new words that are necessary in the modern world (like byte or algorithm). We adapt words to new uses (like driving a car from driving animals). This doesn’t mean that Modern English is inherently better than Old English, but I think it’s hard to argue Modern English isn’t the better choice for the modern world.

Many writers on language assume that the users of a language are brutes who are always trying to screw up the language, but the truth is we’re not. Language users are trying to make the best language they can, according to their needs and usage. When language change happens, there’s a reason behind it, even if it’s only something seemingly silly like enlivening the language with new slang. So the big question is: is the motivation for consistency more or less valid than the motivation for the change?

I think we should err on the side of the change. Long-term consistency is nice, but it’s not of primary importance. Outside of fiction and historical accounts, we generally don’t need to be able to extract the subtle nuances from old writing. Hard though it may be to admit it, there is very little that the future is going to need to learn from us directly; we’re not losing too much if they find it a little harder to understand us.

Language change, though, can move us to a superior language. We see shortcomings in our native languages every time we think “I wish there was a way to say…” A language is probably improved by making it easier to say the things that people have to or want to say. When a language change appears, there’s presumably a reason for it; when it’s widely adopted, there’s presumably a compelling reason for it — people find it beneficial.

The benefits of consistency are fairly clear, but the exact benefit or motivation for a change is more obscure. That’s why I tend to give language change the benefit of the doubt.

Enough of my philosophizing. Here’s the yearly clearinghouse of 10 busted grammar myths. (The statements below are the reality, not the myth.)

There is nothing wrong with I’m good. Since I was knee-high to a bug’s eye, I’ve had people tell me that one must never say “I’m good” when asked how one is doing. Well, here’s an argument why that’s nothing but hokum.

Amount of is just fine with count nouns. Amount of with a count noun (e.g., amount of people) is at worst a bit informal. The combination is useful for suggesting that the pluralized count noun is best thought of as a mass or aggregation.

Verbal can mean oral. In common usage, people tend to use verbal to describe spoken language, which sticklers insist is more properly described as oral. But outside of certain limited contexts where light ambiguity is intolerable, verbal is just fine.

Whom is moribund and that’s okay. (from Mike Pope) On rare occasions, I run across someone trying very hard to keep whom in the language, usually by berating people who haven’t used it. But the truth is that it’s going to leave the language, and there’s no reason to worry. Mike Pope explains why.

Uh, um, and other disfluencies aren’t all bad. (from Michael Erard, at Slate) One of the most interesting psycholinguistic papers I read early in grad school was one on the idea that disfluencies were informative to the listener, by warning them of a complicated or unexpected continuation. Michael Erard discusses some recent research in this vein that suggests we ought not to purge the ums from our speech.

Descriptivism and prescriptivism aren’t directly opposed. (from Arrant Pedantry) At times, people suggest that educated linguists are hypocritical for holding a descriptivist stance on language while simultaneously knowing that some ways of saying things are better (e.g., clearer, more attractive) than others. Jonathon Owen shines some light on this by representing the two forces as orthogonal continua — much more light than I’ve shone on it with this summary.

Some redundant stuff isn’t really redundant. (from Arnold Zwicky, at Language Log) I’m cheating, because this is actually a post from more than five years ago, but I found it within the last year. (This is an eleventh myth anyway, so I’m bending rules left and right.) Looking at pilotless drones, Arnold Zwicky explains how an appositive reading of adjectives explains away some seeming redundancies. If pilotless drones comes from the non-restrictive relative clause “drones, which are pilotless”, then there’s no redundancy. A bit technical, but well worth it.

Want to see somewhere between 10 and 30 more debunked myths? Check out some or all of the last three years of NGD posts: 2011, 2010, and 2009.

It’s March 4th again, which means that it’s National Grammar Day again, which means that it’s time to dig through the archives again and pull out some of the grammar myths that have been debunked here on Motivated Grammar this year. And that is the only fun part about National Grammar Day for me.

If you’re new here, you might be surprised at that. “But Gabe!” you cry, “Aren’t you all about grammar? Wouldn’t you love a day celebrating it?” And my response to that question is a curt no. You see, I’m all about grammar and language and the like. Hell, I’m in grad school studying it. But when most people say they’re interested in grammar, they mean they’re interested in learning a set of rules. And the rules they’re trying to learn bear about as much relationship to English as runway models’ clothes bear to the clothes in your wardrobe. These grammar rules — or to be more accurate, myths — are viewed as signs of high culture and linguistic erudition, but in truth they are baseless, and at best harmless.

At their worst, these myths serve as a means for those who shout the loudest to shut up those who meekly try to use the language. I’ve known many people who’ve sought to improve their grammatical knowledge, only to be dismayed by the sheer number of un- and counter-intuitive rules that met them. In fact, in my younger years I was one of them. For you see, I grew up in a working-class family in a working-class town, and I thought that one of the keys to class mobility was an impeccable command of the English language. (As Peter Gabriel put it in “Big Time”, I was stretching my mouth to let those big words come right out.) And that command, I thought, would come through the study of grammatical primers.

But like my failed attempt to master the rules of etiquette, my attempt to master the so-called rules of grammar also met with defeat, as I found myself unable to keep so many seemingly arbitrary rules in my head. And so I gave up and figured I could learn all I needed to know about the English language by observing skilled writers and speakers. I spent substantial effort in high school mimicking the speech styles of friends whose speech I admired, and the writing styles of good authors.

Through it all, though, I kept entertaining the notion that I’d eventually know all the rules. And then, over the course of a couple years and a couple courses in linguistics, I came to realize that my very goal was a load of hokum. Yes, there are rules to English, like verb conjugation, or that adjectives usually precede nouns. But every native speaker already knows these rules. The ones discussed in the books, the ones I was trying to learn, they’re just nits to pick. And the nits aren’t even ones that correspond to any real form of English anyway.

If you want to know the rules of English, look in an English-as-a-second-language textbook, not Strunk and White. If you want to know how to use English effectively, read and listen to those whose language you enjoy and admire. Good English is constrained by rules, not defined by them.

But now I’m rambling, so let me stop that and move on to presenting the truth behind ten of these minor myths that people dress up as rules. I’ve included a brief summary of why the myth is untrue, but for the full story, follow the links:

There’s nothing wrong with anyways. Anyway is the more common form, but that’s a historical accident. Related forms always and sometimes are more common than their s-less companions, so clearly anyways isn’t inherently ungrammatical.

Nothing’s wrong with center around. Despite the claims that this usage is logically inconsistent, and that centers on is necessary, center around has been a valid part of English for around 200 years now. No reason to stop now.

There’s not just one right way to say something. Do you worry whether the past tense of dive is dived or dove? Or do you worry about shined and shone? Well, a lot of the time there isn’t a single right or best way of saying it. As it turns out, a lot of factors can affect the decision. And often it’s best to go with your gut feeling.

Ending a sentence with a preposition is always acceptable. The myth that it isn’t is the result of a half-baked argument John Dryden concocted in the 17th century to explain why he was a better playwright than Ben Jonson. He was wrong about being better than Jonson, and he was wrong about the prepositions, too. Unfortunately, three-and-a-half centuries of people have fallen for his myth.

“Ebonics” isn’t lazy English. Ebonics, or African-American Vernacular English as linguists generally call it, isn’t a deficient form of English. It’s a dialect, or possibly even a creole, of English with its own distinctive and systematic syntactic, phonological, and morphological features.

Gender-neutral language isn’t bad language. Using words like spokesperson doesn’t harm the language, and doesn’t start us down some slippery slope where the word human will have to be replaced by huperson or something. Similarly, using they to refer to a single person of unknown gender is a usage that’s been going on for centuries.

Ms. is a standard and useful abbreviation. Sure, Ms. is newer than Mrs. and Miss, but it’s a standard title. It’s a good solution to the asymmetry whereby female titles depend on marital status while the male title does not.

Jealous can be used to mean envious. Some people try to claim that jealousy and envy are totally distinct, but they’re not, and they’ve been used in overlapping senses since Chaucer’s time.

And a few myths from other blogs:

Non-literal literally is perfectly standard. This one’s a three-fer. Stan Carey, me, and Dominik Lukes all wrote posts, each inspired by the other, about non-literal uses of literally. All of us share the conclusion that non-literal literally has been used for years, by writers good and bad, and is here to stay. But the three of us disagree on whether or not it’s a stylistically good usage. I found this an interesting exercise in seeing how different descriptivists dispense usage advice.

A lot of what gets called “passive” isn’t really. Language commentators often denigrate an impersonal usage by calling it a “passive”, and demanding that it be converted to an active form. But lots of impersonal forms are active already, and there isn’t anything wrong with the passive anyway(s). Geoff Pullum explains the English passive over at Language Log.

Redundancy doesn’t make something ungrammatical or unacceptable. Stan Carey points out that English is threaded through with redundancy, so it’s clear that redundancy isn’t inherently a bad thing. In fact, given that we’re communicating with people who might not catch the full message (or be paying full attention), redundancy is often a logical thing to add to your language.

Lastly, if you want another 20 myths debunked (or another 20 minutes’ break from work), check out our Grammar Day mythbusting from 2010 and 2009.

[Update 03/04/2012: Another National Grammar Day means ten more myths, looking at matters such as each other, anyways, and I’m good.]

Every time National Grammar Day comes around, I’m struck with a spot of dread. Any of my friends or acquaintances might, at any moment, spring upon me and shout “Hey! It’s totally your day! So don’t you hate when people use the passive voice, since you’re all into grammar?” And then I will be forced, as the crabby old coot I am, to meet their well-meaning inquiry with the level of vitriol normally reserved for a hairdresser who’s decided to treat your head as a testing ground for a new theory of hair design. “No,” I’ll shout, “that’s not it at all! I love the passive, I love variation! Grammar isn’t about telling people what they can’t say; it’s about finding out what people do say, and why they say it!” And through that outburst, my Facebook friend count will be reduced by one.

My problem with National Grammar Day (and most popular grammarians in general) is that it suggests that the best part of studying language is the heady rush of telling people that they shouldn’t say something. But if you really study language, you know that there’s so much more to it than that. Each time March 4th comes and goes, we’re missing an opportunity to show people how wonderful the field of linguistics is. So if you’ll permit me to steal a moment, let me show you the two papers that really made me fall in love with the field.

The first is from Murray, Frazer, and Simon: “Need + Past Participle in American English”, which is the first in a series of three papers on the Midwestern/Appalachian construction needs done (e.g., this article needs re-written, my cat needs washed). This paper made me realize how deep the rabbit-hole of colloquial and dialectal speech goes. (Sadly, you need a subscription to JSTOR to read it.)

The second paper is the one that launched me into the exciting world of alternation studies, Bresnan & Nikitina’s “On the Gradience of the Dative Alternation”. (This paper has since been superseded by revised versions, but I think this draft is still the best version for an alternations newbie.) If you ever have the chance, take a look at these papers. Maybe they won’t do anything for you, but then again, maybe they will, and maybe you’ll understand why I think so many celebrants of National Grammar Day are missing the point.

On to the meat of the post. As you might remember from last year, my favorite way to celebrate National Grammar Day is by debunking popular grammar myths. Here’re 10 facts about the English language that run counter to the rubbish that pedants prescribe. The first eight are from the last year of posts here at Motivated Grammar. The last two are from other sites. Explanations and justifications for the statements below are found by following the links, so if you disagree, please don’t grouse to me that I must be wrong until after you’ve read the reasons why you are.

Singular they is standard English. What’s wrong with the sentence Everyone celebrates today in their own way? Historical usage, contemporary usage, the usage of revered writers, acceptance by language authorities, analogous constructions, and issues of ambiguity all agree: absolutely nothing.

Slow is an adverb. It has been used as such for years, for centuries even. Shakespeare, Milton, and Thackeray all used adverbial slow, so it’s even fine with the literary set and style manuals. You may resume drinking Dr Pepper if you so choose.

People are using hopefully correctly. Hopefully has two distinct usages, one a regular adverb meaning “in a hopeful manner”, and the other a sentence-modifying adverb meaning approximately “I hope” or “With any luck”. The latter usage has been unreasonably derided, because it is a sentential adverb and it is a new meaning for an old word. But neither of those complaints is valid, especially since…

The meanings of words can and do change over time. Hopefully isn’t the only word with a new-meaning stigma; prescriptivists often vilify words that have sprouted new meanings. But this is a very standard part of the English language. In fact, not only hopefully, but also of course, snack, naturally, enthusiasm, and quarantine have all changed their meanings over time.

You can eat healthy food. This meaning was fine for 300 years, and then Alfred Ayres came along and declared it wrong. Of course, it was he who was wrong, but his edict has stuck around at the edges of prescriptivism ever since.

I’m good is good. Every once in a while, someone gives me guff about my careful avoidance of the phrase I’m well when I am asked how I am. There’s nothing wrong with I’m well, but it isn’t what I mean to say. There is also nothing wrong with I’m good, and it is what I mean to say.

Between and among differ not in number, but in vagueness. The rule that between can only be used with two items, and among with more than two, is specious. The real tendency of English favors between when the connections are conceptualized as being specifically between individuals, and among when the connections are more vague and collective.

An invite is informal, but hardly wrong. It’s a minor point, of course, but the noun has been around for 500 years. I mention this post mostly because there was a great discussion in the comments about the psychology of prescription.

Choosing between which and that is more interesting than you’d think. It’s nearing five years old now, but Arnold Zwicky posted about his understanding of different contexts in which which and that can be used as relativizers in a relative clause. It’s much more interesting and rewarding than just saying that which is to be limited to non-restrictive clauses. It’s also much more accurate.

Want more debunked myths? 10 more are available on last year’s post! See why 10 items or less, different than, and alright are all right. Want still more, preferably in fewer-than-140-character chunks? Follow Motivated Grammar on Twitter.

[Update 03/04/2011: For National Grammar Day 2011, I’ve listed another 10 grammar myths, addressing topics such as Ebonics, gender-neutral language, and center around.]

[Update 03/04/2012: And again for 2012. Ten more myths, looking at matters such as each other, anyways, and I’m good.]

About The Blog

A lot of people make claims about what "good English" is. Much of what they say is flim-flam, and this blog aims to set the record straight. Its goal is to explain the motivations behind the real grammar of English and to debunk ill-founded claims about what is grammatical and what isn't. Somehow, this was enough to garner a favorable mention in the Wall Street Journal.

About Me

I'm Gabe Doyle, currently a postdoctoral scholar in the Language and Cognition Lab at Stanford University. Before that, I got a doctorate in linguistics from UC San Diego and a bachelor's in math from Princeton.

In my research, I look at how humans manage one of their greatest learning achievements: the acquisition of language. I build computational models of how people can learn language with cognitively general processes and as few presuppositions as possible. Currently, I'm working on models for acquiring phonology and other constraint-based aspects of cognition.

I also examine how we can use large electronic resources, such as Twitter, to learn about how we speak to each other. Some of my recent work uses Twitter to map dialect regions in the United States.