It’s a dark night; you’re in an unfamiliar city, slightly lost, but pretty sure you’ll know where you are if you just get to the next corner. The streets are quiet. A stranger steps out of the gloom in front of you, and announces that certain words don’t mean what you think they mean. They’re words that you use but have never really felt comfortable with, words that you use mostly because you’ve heard them in set phrases, words like plethora.

Plethora, you wonder, could it be I’m using it wrong? That niggling uncertainty kicks in, the same niggling uncertainty that’s pushed you to educate yourself all these years. It creeps further, darkening your mind. Have I been using words wrong? Your breath quickens — how many others have heard me say them before this stranger came up and told me I was wrong? Have I used one of them lately? Have I been judged? Your pulse races. Did I just say one? — is, is that why this stranger materialized to announce it was wrong?

The stranger says more words are being used wrong, by others, by you. These words are more common, common enough to be known but not common enough to be well-known: myriad, enormity. Oh God, you think, I’ve used those words in business writing! The uncertainty changes into certainty, certainty that you are wrong, and worse, that people know it. Important people know it. That’s why you haven’t been promoted, it’s why your friends were laughing that one time and didn’t say why. The stranger has you now. The stranger knows the dark spots on your soul. The stranger is almost touching you now, so close, so close. Your eyes meet. The stranger’s eyes widen; this is it, the final revelation. Do you dare listen? You can’t listen, you must listen:

“And you’re using allow wrong, too!”

At which point the spell is broken — because c’mon, you’re not using allow wrong. You’d definitely have noticed that. You push the stranger out of the way, and realize your hotel’s just on the next block.

In the unfamiliar city of the Internet, I encountered such a stranger: Niamh Kinsella, writer of the listicle “14 words you’ve been using incorrectly this whole time”. Kinsella argues that your usage doesn’t fit with the true definition of these words, by which she usually means an early, obsolete, or technical meaning of the word.

Her first objection is to plethora, which she defines as “negative word meaning a glut of fluid”. And so it was in the 1500s, when it entered the language as a medical term. This medical meaning persists in the present day, but additional figurative meanings branched off of it long ago — so long ago, in fact, that one of the meanings branched off, flourished for 200 years, and still had enough time to fade into obsolescence by now. The extant figurative meaning, the one that most everyone means when they use plethora, is antedated to 1835 by the Oxford English Dictionary, at which point it was usually a bad thing (“suffering under a plethora of capital”, the OED quotes). But by 1882 we see the modern neutral usage: “a perfect plethora of white and twine-colored thick muslin”.

The second objection is to myriad, and here Kinsella deviates by ignoring the early usage. She hectors: “It’s an adjective meaning countless and infinite. As it’s an adjective, it’s actually incorrect to say myriad of.” But in fact myriad entered English as a noun, either as a transliteration of the Greek term for “ten thousand”, or as an extension of that very large number to mean “an unspecified very large number” (both forms are antedated by the OED to the same 1555 work). The adjectival form doesn’t actually appear until two centuries later, the 1700s. Both nominal and adjectival forms have been in use from their inception to the present day; claiming that one or the other is the only acceptable form is just silly.*

There’s no point in continuing this after the third objection, which is to using allow in cases that do not involve the explicit granting of permission. To give you an idea of what folly this is, think of replacements for allows in a supposedly objectionable sentence like “A functional smoke alarm allows me to sleep peacefully.” The first ones that come to my mind are lets, permits, gives me the ability, and enables. That’s the sign of a solid semantic shift; four of my top five phrasings of the sentence are all verbs of permission with the permission shifted to enablement. Kinsella herself has no beef with it when she isn’t aiming to object, judging by her lack of objection to an article headlined “Are we allowed optimism now?”.

This enablement usage isn’t new, either; the OED cites “His condition would not allow of his talking longer” from 1732. (Permit without permission is antedated even further back, to 1553.) This oughtn’t even to be up for debate; even if it were completely illogical — which, as an example of consistent semantic drift, it’s not — the fact that it is so standard in English means that it is, well, standard. It is part of English, and no amount of insisting that it oughtn’t to makes a difference. It’s similar to the occasional objection I see to Aren’t I?: even if I agreed it didn’t make sense, virtually every (non-Scottish/Irish) English speaker uses it in place of amn’t I?, so it’s right. End of discussion.

Why do we fall for this over and over again? Why do we let people tell us what language is and isn’t based on assertions that never have any references (Kinsella cites no dictionaries) and rarely hold up to cursory investigation? I don’t know, but my guess is that it appeals to that universal mixture of insecurity and vanity that churns inside each of us.

We are convinced that we must be doing everything wrong, or — and perhaps worse — that we’re doing most things right but there’s some unexpected subset of things that we have no idea we’re doing wrong. So if someone tells us we’re wrong, especially if they candy-coat it by saying that it’s not our fault, that everyone’s wrong on this, well, we just assume that our insecurities were right — i.e., that we were wrong. But then, aware of this new secret knowledge, these 14 weird tricks of language use, our vanity kicks in. Now we get to be the ones to tell others they’re wrong. Knowing these shibboleths gives you the secret knowledge of the English Illuminati. Between our predisposition to believe we’re wrong, our desire to show others up by revealing they’re wrong, and our newfound membership in this elite brotherhood, what incentive do we have to find out that these rules are hogwash? All that comes out of skepticism is, well, this: me, sitting at my laptop, writing and rewriting while the sun creeps across a glorious sky on a beautiful day that I could have been spending on the patio of my favorite coffee shop, approaching my fellow patrons, dazzling them with my new conversation starter: “I bet you use plethora wrong. Allow me to explain.”

—

*: In fact, Kinsella undermines her own definition of “countless and infinite” in her supposedly correct example by using “countless and infinite” to describe the finite set of stars in the universe, so maybe she’s just in love with the sound of her own hectoring.

If you believe the grammar doomsayers, the English subjunctive is dying out. But if this is the end of the grammatical world, I feel fine — and I say that even though I often mark the subjunctive myself.

The most talked about use of the subjunctive is in counterfactuals:

(1) Even if I were available, I’d still skip his party.

For many people, marking the subjunctive here is not required; either they never mark it, using the past indicative form was instead, or they (like me) sometimes mark it with were, and sometimes leave it unmarked with was. For this latter group, the choice often depends on the formality of the setting. I’m calling this “not marking” the subjunctive, rather than “not using” it, because it seems less like people making a choice between two moods for the verb and more like a choice between two orthographic/phonemic forms for it.

It’s similar to the alternation for many people (incl. me) of marking or not marking who(m) in the accusative case, discussed by Arnold Zwicky here and here, and Stan Carey here. That said, I believe that (at least some) people who never use were in (1) do not have a grammatical rule saying that counterfactuals trigger the past subjunctive, and I’m not worried about that either.

For being such a foolish war, World War I did generate some artistic propaganda.

This blitheness about the subjunctive does not go unmourned. I recently found myself being Twitter-followed by someone whose account just corrects people who fail to use the subjunctive in sentences like (1).* And Philip Corbett, associate managing editor for standards at the New York Times, annually rants about people failing to mark the subjunctive. Consider one of Corbett’s calls to man the ramparts, which he begins by quoting, in its entirety, a 90-year-old letter complaining that the subjunctive must be saved from impending destruction.** Corbett continues:

“[…] despite my repeated efforts to rally support for [the subjunctive] the crisis has only grown. For those few still unaware of the stakes, here is a reminder from The Times’s stylebook”

What are the stakes? What would we lose without the subjunctive? Corbett cites sentences such as these:

The mayor wishes the commissioner were retiring this year.
If the commissioner were rich, she could retire.
If the bill were going to pass, Secretary Kuzu would know by now.

If these were the stakes, I’d ditch the subjunctive. Corbett points out that in each of these we’re referring to a counterfactual condition, which should trigger the subjunctive. But note that using the indicative/unmarked was doesn’t make that any less clear. There is nothing to be gained from using the subjunctive in these cases but a sense of superiority and formality. (Not that I’m against either of those.)

But here’s the weird thing: all this defense of the subjunctive, all these worries — they’re all only about the past subjunctive. And the past subjunctive is weird, because it’s only marked on be, and it’s just a matter of using were for singular as well as plural. For everyone worrying that this is some crucial distinction, please note these sentences where it is insouciantly the same as the indicative form:

(2a) The mayor wishes the commissioners retired last year.
(2b) If the commissioner wanted to, she could retire.
(2c) If the bills were going to pass, Sec. Kuzu would know by now.

If anything, the loss of past subjunctive were strikes me as regularization of English, the loss of the last remaining vestige of what was once a regular and widespread marking system. Losing the past subjunctive makes English more sensible. I don’t see that as a bad thing.

And anyway, the subjunctive probably isn’t going to disappear, not even the past subjunctive. The past subjunctive is, to my knowledge, necessarily marked in Subject-Auxiliary Inversion constructions:

(3) Were/*Was I a betting man, I’d say the subjunctive survives.

A quick look at Google Books N-grams makes it look like were subjunctive marking has been relatively constant over the last 40 years in written American English, so maybe this is all just a tempest in a teacup.

Plus all of this worry about the subjunctive ignores that the present subjunctive is going strong.*** I’ve written about sentences where the present subjunctive changes the meaning (though I wrote with a dimmer view of the subjunctive’s long-term prospects), and Mike Pope supplied an excellent example:

(4a) I insist that he be there.
(4b) I insist that he is there.

In cases where marking the subjunctive is important, it’s sticking around. In cases where it isn’t important, and the subjunctive follows a strange paradigm, identical to the indicative for all but one verb, it may be disappearing. This is no crisis.

Summary: People who write “if I was” instead of “if I were” aren’t necessarily pallbearers of the English subjunctive. It may be regularization of the last remaining irregular part of the past subjunctive, with the present subjunctive remaining unscathed. And if the past subjunctive disappears, there will be, as far as I can tell, no loss to English. Go ahead and use it if you want (I often do), but to worry that other people aren’t is wrinkling your brow for nothing.

—
*: I do respect the tweeter’s restraint in seemingly only correcting people who’re already talking about grammar.

**: That this destruction has been impending for 90 years has somehow not convinced the ranters that their panic may be misplaced. Also, Corbett keeps titling his posts “Subjunctivitis”, which I think sounds great, but not in the same way he probably does. -itis usually means an unwelcome inflammation of the root word, and I can’t help but see all this as an unhelpful inflammation of passions over the subjunctive.

***: In fact, and I think this is pretty cool, (Master!) Jonathon Owen directed me to a classmate’s corpus work suggesting that for at least some verbs, marked subjunctive usage is increasing.

We’re all Rolling Stones fans here, right? I mean, we’re all here on a grammar blog, so I don’t think I’m jumping to too wild a conclusion to assume that we’re almost all oldsters, whether in actual age or personality. So let’s talk about the classic “Get Off of My Cloud”.

As it turns out, the Stones weren’t terribly fond of this song; they felt it was a rushed follow-up to the runaway success of “Satisfaction”. But some grammar peevers dislike it for an unrelated reason:

“‘Off of’ is no way to talk. It IS really, really bad English.”

Hatred of off of is widespread. It pops up commonly in peevelists. Some professional grammar commentators share this complaint: the quote above is from Patricia O’Conner of Grammarphobia*, and Grammar Girl tersely dismisses it with “You jump off the pier, not off OF the pier”.

So what’s supposedly wrong with off of? The main problem seems to be that the of is unnecessary, but another common one is that since it’s on and not on of, it must be off and not off of. I also see complaints that it’s dialectal or informal or American, that one can’t put two prepositions next to each other, or that it ought to be from. And worse, given all of these problems, the phrase is supposedly spreading.

Let’s take these in reverse order. First, I’m unconvinced that it’s spreading, unless you’re talking about a very recent (last 20 years) spread. Here’re the Google Books counts, and you’ll note that modern off of usage is still below its peak in 1910. The Corpus of Historical American English has a slightly different picture, with more-or-less stable usage from 1900 to the 1980s, and then a jolt up in the 90s and 2000s. Maybe it’s spreading, maybe not. But let’s talk about why it’s not bad either way.

I’ll start with the easiest objections. No, it shouldn’t just be from. Consider:

(1a) The numbers station is broadcasting from a shed off of Route 395.
(1b) *The numbers station is broadcasting from a shed from Route 395.

And yes, you can put two prepositions next to each other, as in this unobjectionable example:

(2) I pulled a coat out of the closet.

Going on to a somewhat more complex objection, antonymic phrases do not have to share structures or prepositions. The fact that you get on and not on of a train doesn’t mean that you have to get off and not off of it. Consider:

(3a) I put the sandwiches into the picnic basket, but someone has pulled them out of it.
(3b) One velociraptor was in front of Muldoon, the other next to him.

And now on to the involved discussions. One question is whether off is always sufficient, and off of thus always unnecessarily wordy. And the answer, I think, depends on that of a second question: whether off of is dialectal.

In my idiolect, off of is perfectly standard. I was probably in my twenties before I heard someone object to it. That’s not to say I can’t use off without of. To the contrary, I prefer (4) without of, though both forms are acceptable to me:

(4) The leaves fell off the tree.

That said, of is not always superfluous to me. A few examples where I find removing of to make the sentence noticeably worse:

(5a) It’s a way of profiting off of something you expect to drop in value.
(5b) My new invention will knock the socks off of the scientific community.
(5c) I broke your statue by knocking the top off of it.

You may not agree, even if you come from an off of idiolect, that these forms are better, but that’s not important. The key point is only that sometimes, to some people, off of is distinctly more mellifluous than off. Dismissing off of out of hand as superfluous is valid only in dialects that already don’t allow it.

Let me elaborate this “necessity depends on dialect” point by proposing an insane argument. I’ve mentioned before that, being from Pittsburgh, I am perfectly content to say The car needs washed instead of The car needs to be washed. Within my dialect, to be is often superfluous, and there are some sentences that I find greatly improved by omitting it. Thus, I could see arguing that to be is, at least sometimes, unnecessary. But if I argued this to someone speaking a “standard” dialect of English, I would sound crazy. Saying that of in off of is across-the-board unnecessary sounds equally crazy to me.**

So is off of dialectal and/or informal? The answer would seem to be yes to both. The Oxford English Dictionary calls it “only colloq. (nonstandard) and regional” in current use. The Merriam-Webster Dictionary of English Usage says it’s “primarily a form used in speech”. The Columbia Guide to Standard American English says it’s avoided at “Planned and Oratorical levels and in Semiformal and Formal writing.”

Those sources are generally pretty trustworthy with their opinions, and given the number of people who find off of unacceptable, I’m inclined to believe that it really is dialectal. When that’s coupled with its primarily spoken usage patterns, it’s no surprise that it would feel informal, especially to people from other dialects. And using the Corpus of Contemporary American English as a measuring stick of informality, off of occurs in speech twice as often as in written fiction, about four times as often as in newspapers/magazines, and almost ten times as often as in academic writing. The more formal the style, the less likely you’ll see off of.

All that said, its informality doesn’t mean it’s an illiteracy. Off of used to be standard in English; the MWDEU starts off with a Shakespearean usage [1592] and continues with Pepys [1668] and Bunyan [1678]. In the last century, they show it used by Hemingway, Faulkner, and Harry Truman, among others. So if it is making a comeback, it’s no harbinger of linguistic doom, just a return to form.

Summary: There is nothing linguistically or grammatically wrong with off of. It’s nonstandard in some dialects and informal in most, so you should probably avoid it if you’re concerned about your writing seeming formal. But when formality isn’t a concern, use it as you see fit.

—

*: This is a surprising stance, because it comes from Patricia O’Conner of Grammarphobia, who’s normally a lot less judgmental about such things. In fact, three years later, she softened her stance, although she remains against off of. I included her original opinion because her reconsideration shows that even hard-line opinions can (and should) be altered in the face of evidence, so long as the commentator is reasonable.

**: In fact, I and others within my dialect seem to have strong intuitions about times when the to be can and can’t be felicitously dropped, in the same way as I see off of. It’s not a matter of necessity but of felicity.

If English words were Norse gods, perhaps than would be the best candidate to play the role of Loki, the trickster god.* It causes confusion not only due to its similarity with then, but also by raising the question of what case the noun phrase it governs should have. Of course, there’s no confusion if you are a brilliant grammaticaster, as in this example from a list of peeves:

12. I/Me: We had several different takes on this, with one correspondent nailing it thus: “The correct choice can be seen when you finish the truncated sentence: He’s bigger than I am. ‘He’s bigger than me am’ actually sounds ridiculous and obviates the mistake.”**

Now, there’re two questions one should be asking of this explanation. First, can you just fill in the blank? By which I mean, does a sentence with ellipsis (the omission of words that are normally syntactically necessary but understood by context) necessarily have the same structure as a non-elided sentence? There are many different types of ellipsis, so this is a more complex question than I want to get into right now, but the short answer is no, and here’s a question-and-answer example:

(1) Q: Who ate the last cookie?
A: Me! / *I!

(The asterisks indicate ungrammatical forms.) Even though the non-elided answer would be “I ate the last cookie”, not *“Me ate the last cookie”, the elided answer takes me, not I. The second question to ask here is whether there even is an elision. We know that “He eats faster than I do” is a valid sentence, but does than necessarily trigger a clause after it? Could it be that “He eats faster than me” is not an incomplete version of the above structure but rather a different and complete structure?

This boils down to the question of whether than is strictly a conjunction (in which case the conjoined things should be equivalent, i.e., both clauses) or can function as a preposition as well (in which case the following element is just a noun phrase, with accusative case from the preposition). It’s pretty easy to see that than behaves prepositionally in some circumstances, pointed out in the Cambridge Grammar of the English Language, via Language Log:

(2a) He’s inviting more people than just us.
(2b) I saw no one other than Bob.

But this prepositional usage has become frowned upon, save for instances like (2a) & (2b) where it’s unavoidable. Why? Well, it’s an interesting tale involving the love of Latin and the discoverer of dephlogisticated air. It all starts (according to the MWDEU) with Robert Lowth, who in 1762 claimed, under the influence of Latin, that than is a conjunction and noun phrases following it carry an understood verb. The case marking (e.g., I vs. me) would then be assigned based on the noun phrase’s role in the implied clause. So Lowth’s grammar allows (3a) and (3b) but blocks (3c):

(3a) He laughed much louder than I (did).
(3b) He hit you harder than (he hit) me.

(3c) *He laughed much louder than me.

Lowth’s explanation was not without its wrinkles; he accepted than whom as standard (possibly due to Milton’s use of it in Paradise Lost), and worked out a post hoc explanation for why this prepositional usage was acceptable and others like (3c) weren’t.

Lowth’s explanation was also not without its dissenters. Joseph Priestley, a discoverer of oxygen, argued against it in his Rudiments of English Grammar (1772 edition). Priestley noted that the combination of a comparative adjective and than behaved prepositionally, and that good writers used it as such. Regarding the Lowthian argument against it, he wrote:

“It appears to me, that the chief objection our grammarians have to both these forms, is that they are not agreeable to the idiom of the Latin tongue, which is certainly an argument of little weight, as that language is fundamentally different from ours”

Another pro-prepositionist was William Ward, whose 1765 Essay on Grammar argues that than in phrases like to stand higher than or to stand lower than is akin to prepositions like above or below, and thus that the prepositional usage of than should be allowed. That’s not much of an argument, because semantically equivalent words and phrases don’t have to have the same syntax — consider I gave him it vs. *I donated him it. But then again, Lowth is basing his argument on a completely separate language, so this is a slight improvement.

The real key, of course, is usage, and the MWDEU notes that Priestley and Ward are backed by standard usage at the time. Visser’s Historical English Syntax strengthens the case with examples of prepositional than from a variety of estimable sources, including the Geneva Bible (1560), Shakespeare (1601), Samuel Johnson (1751, 1759), and Byron (1804). And of course prepositional than continues in modern usage — why else would we be having this argument?

Despite the irrelevance of his argument, Lowth’s opinion has stuck through to the present day, reinvigorated by new voices repeating the same old line, unwilling to concede something’s right just because it’s never been wrong. It’s the MWDEU, not the commenter mentioned at the top, that nails it.

So in the end, we’re left with this. Than I and than me are both correct, in most cases. Than I is often regarded as more formal, but interestingly it’s the only one that can be clearly inappropriate.*** Using the nominative case blocks the noun phrase from being the object of the verb. I can’t write The wind chilled him more than I to mean that the wind chilled me less than it chilled him.

Sometimes this can be used to disambiguate; I love you more than him is ambiguous while I love you more than he isn’t. This only matters for clauses with objects, and in general, these tend not to be ambiguous given context, so the benefit of disambiguation must be weighed against the potentially over-formal tone of the nominative case. (I, for instance, think the second sentence above sounds less convincing, even though it’s clearer, because who talks about love so stiltedly?)

I’m branching a little off topic here, but I’d like to conclude with a general point from Priestley, a few paragraphs after the quote above:

“In several cases, as in those above-mentioned, the principles of our language are vague, and unsettled. The custom of speaking draws one way, and an attention to arbitrary and artificial rules another. Which will prevail at last, it is impossible to say. It is not the authority of any one person, or of a few, be they ever so eminent, that can establish one form of speech in preference to another. Nothing but the general practice of good writers, and good speakers can do it.”

Summary: Than can work as a conjunction or a preposition, meaning that than I/he/she/they and than me/him/her/them are both correct in most situations. The latter version is attested from the 16th century to the present day, by good writers in formal and informal settings. The belief that it is unacceptable appears to be a holdover from Latin-based grammars of English.

—

*: Interestingly, and counter to the mythology I learned from the Jim Carrey movie The Mask, Wikipedia suggests that Loki may not be so easily summarized as the god of mischief.

**: I’m pretty sure the use of obviate is a mistake here. I can just barely get a reading where it isn’t, if thinking of the full version of the sentence causes you to avoid the supposed mistake. But I suspect instead that our correspondent believes obviate means “make obvious”, in which case, ha ha, Muphry’s Law.

***: Is there a case where than me is undeniably incorrect? I can’t think of one.

About The Blog

A lot of people make claims about what "good English" is. Much of what they say is flim-flam, and this blog aims to set the record straight. Its goal is to explain the motivations behind the real grammar of English and to debunk ill-founded claims about what is grammatical and what isn't. Somehow, this was enough to garner a favorable mention in the Wall Street Journal.

About Me

I'm Gabe Doyle, currently a postdoctoral scholar in the Language and Cognition Lab at Stanford University. Before that, I got a doctorate in linguistics from UC San Diego and a bachelor's in math from Princeton.

In my research, I look at how humans manage one of their greatest learning achievements: the acquisition of language. I build computational models of how people can learn language with cognitively-general processes and as few presuppositions as possible. Currently, I'm working on models for acquiring phonology and other constraint-based aspects of cognition.

I also examine how we can use large electronic resources, such as Twitter, to learn about how we speak to each other. Some of my recent work uses Twitter to map dialect regions in the United States.