It’s a dark night; you’re in an unfamiliar city, slightly lost, but pretty sure you’ll know where you are if you just get to the next corner. The streets are quiet. A stranger steps out of the gloom in front of you, and announces that certain words don’t mean what you think they mean. They’re words that you use but have never really felt comfortable with, words that you use mostly because you’ve heard them in set phrases, words like plethora.

Plethora, you wonder, could it be I’m using it wrong? That niggling uncertainty kicks in, the same niggling uncertainty that’s pushed you to educate yourself all these years. It creeps further, darkening your mind. Have I been using words wrong? Your breath quickens — how many others have heard me say them before this stranger came up and told me I was wrong? Have I used one of them lately? Have I been judged? Your pulse races. Did I just say one? — is, is that why this stranger materialized to announce it was wrong?

The stranger says more words are being used wrong, by others, by you. These words are more common, common enough to be known but not common enough to be well-known: myriad, enormity. Oh God, you think, I’ve used those words in business writing! The uncertainty changes into certainty, certainty that you are wrong, and worse, that people know it. Important people know it. That’s why you haven’t been promoted, it’s why your friends were laughing that one time and didn’t say why. The stranger has you now. The stranger knows the dark spots on your soul. The stranger is almost touching you now, so close, so close. Your eyes meet. The stranger’s eyes widen; this is it, the final revelation. Do you dare listen? You can’t listen, you must listen:

“And you’re using allow wrong, too!”

At which point the spell is broken — because c’mon, you’re not using allow wrong. You’d definitely have noticed that. You push the stranger out of the way, and realize your hotel’s just on the next block.

In the unfamiliar city of the Internet, I encountered such a stranger: Niamh Kinsella, writer of the listicle “14 words you’ve been using incorrectly this whole time”. Kinsella argues that your usage doesn’t fit with the true definition of these words, by which she usually means an early, obsolete, or technical meaning of the word.

Her first objection is to plethora, which she defines as “negative word meaning a glut of fluid”. And so it was in the 1500s, when it entered the language as a medical term. This medical meaning persists in the present day, but additional figurative meanings branched off of it long ago — so long ago, in fact, that one of the meanings branched off, flourished for 200 years, and still had enough time to fade into obsolescence by now. The extant figurative meaning, the one that most everyone means when they use plethora, is antedated to 1835 by the Oxford English Dictionary, at which point it was usually a bad thing (“suffering under a plethora of capital”, the OED quotes). But by 1882 we see the modern neutral usage: “a perfect plethora of white and twine-colored thick muslin”.

The second objection is to myriad, and here Kinsella deviates by ignoring the early usage. She hectors: “It’s an adjective meaning countless and infinite. As it’s an adjective, it’s actually incorrect to say myriad of.” But in fact myriad entered English as a noun, either as a transliteration of the Greek term for “ten thousand”, or as an extension of that very large number to mean “an unspecified very large number” (both forms are antedated by the OED to the same 1555 work). The adjectival form doesn’t actually appear until two centuries later, the 1700s. Both nominal and adjectival forms have been in use from their inception to the present day; claiming that one or the other is the only acceptable form is just silly.*

There’s no point in continuing this after the third objection, which is to using allow in cases that do not involve the explicit granting of permission. To give you an idea of what folly this is, think of replacements for allows in a supposedly objectionable sentence like “A functional smoke alarm allows me to sleep peacefully.” The first ones that come to my mind are lets, permits, gives me the ability, and enables. That’s the sign of a solid semantic shift; four of my top five phrasings of the sentence are all verbs of permission with the permission shifted to enablement. Kinsella herself has no beef with it when she isn’t aiming to object, judging by her lack of objection to an article headlined “Are we allowed optimism now?”.

This enablement usage isn’t new, either; the OED cites “His condition would not allow of his talking longer” from 1732. (Permit without permission is antedated even further back, to 1553.) This oughtn’t even to be up for debate; even if it were completely illogical — which, as an example of consistent semantic drift, it’s not — the fact that it is so standard in English means that it is, well, standard. It is part of English, and no amount of insisting that it oughtn’t to be makes a difference. It’s similar to the occasional objection I see to Aren’t I?: even if I agreed it didn’t make sense, virtually every (non-Scottish/Irish) English speaker uses it in place of amn’t I?, so it’s right. End of discussion.

Why do we fall for this over and over again? Why do we let people tell us what language is and isn’t based on assertions that never have any references (Kinsella cites no dictionaries) and rarely hold up to cursory investigation? I don’t know, but my guess is that it appeals to that universal mixture of insecurity and vanity that churns inside each of us.

We are convinced that we must be doing everything wrong, or — and perhaps worse — that we’re doing most things right but there’s some unexpected subset of things that we have no idea we’re doing wrong. So if someone tells us we’re wrong, especially if they candy-coat it by saying that it’s not our fault, that everyone’s wrong on this, well, we just assume that our insecurities were right — i.e., that we were wrong. But then, aware of this new secret knowledge, these 14 weird tricks of language use, our vanity kicks in. Now we get to be the ones to tell others they’re wrong. Knowing these shibboleths gives you the secret knowledge of the English Illuminati. Between our predisposition to believe we’re wrong, our desire to show others up by revealing they’re wrong, and our newfound membership in this elite brotherhood, what incentive do we have to find out that these rules are hogwash? All that comes out of skepticism is, well, this: me, sitting at my laptop, writing and rewriting while the sun creeps across a glorious sky on a beautiful day that I could have been spending on the patio of my favorite coffee shop, approaching my fellow patrons, dazzling them with my new conversation starter: “I bet you use plethora wrong. Allow me to explain.”

—

*: In fact, Kinsella undermines her own definition of “countless and infinite” in her supposedly correct example by using “countless and infinite” to describe the finite set of stars in the universe, so maybe she’s just in love with the sound of her own hectoring.

Oh, I love Thanksgiving! It’s such a silly holiday, where we kid ourselves that we’re counting our blessings when we’re really counting how many plates of food we can devour. It’s a day that proclaims that it’s awesome to get along with one another and to trust each other and to share, when the real message of the First Thanksgiving is that doing those things will only result in your land, livelihood, and lives being taken away from you. Wait, that’s not a good holiday at all! But I get to pardon my gluttony for one day, so it’s all right, I guess.

"And, God, please deliver unto me an Xbox 360 with a Kinect for less than $200 tomorrow at 4 a.m."

Anyway, let me tell you a little about my upcoming Thanksgiving dinner. Suppose I confessed to you that I anticipate that the overall quality of the dinner will be high, despite the fact that I am in charge of preparing a not-insubstantial portion of the meal. Would I be correct in my confession? Specifically, I’m wondering if I’m justified in my use of the word anticipate, which I’ve used rather like expect.

To hear prescriptivists tell it, I wouldn’t be. They say that anticipate can be used only when the subject has prepared for the expected event. For instance, Fowler’s Modern English Usage (2nd edition, 1965) insists that:

“The use of anticipate as a synonym for expect, though very common, is a slipshod extension. The element of forestall present in anticipate ought to have been preserved and is still respected by careful writers.”

Ambrose Bierce similarly asserts in Write It Right (1909, but I’m using Jan Freeman’s 2009 edition):

“To anticipate is to act on an expectation in a way to promote or forestall the event expected.”

The Merriam-Webster Dictionary of English Usage finds this claim first made in 1881 by Alfred Ayres, who was something of a giant in the world of 19th-century grammaticasters. Ayres justified his claim through Latin etymology, which is never a valid argument. And now a bunch of people parrot it.

Well, are they right and I wrong? Am I right and they wrong? Well, in the cheerful spirit of the forthcoming holiday, let’s say we’re both right. Some of the definitions of anticipate do have a substantial preparatory component. Among these are (I’m paraphrasing the OED’s definitions here) to spend income in advance, to deal with before another actor has a chance to, to forestall, to observe in advance of the due date, to cause to happen earlier, and to take into consideration before the appropriate time.

Now, most of these are fairly uncommon usages for anticipate in contemporary English. I’d wager that few people would now say “I’ve anticipated my wages after taking out the payday loan,” using the “spend income in advance” definition. Same with “I anticipated the fall of the Jenga tower by bumping into it,” using the “cause to happen earlier” meaning.* Of the preparatory definitions listed above, the two I see most commonly are “to forestall” and “to take into advance consideration”:

(1) I played with one new player who asked what dice to roll every time he was instructed to roll for initiative. After the fourteenth time, I anticipated his question and handed him a 20-sided die when combat started.

(2) He anticipated the question “What was the last movie you saw?” but not “What was the most recent favorite movie you saw?” The interviewee was stumped.

This last definition of anticipate is already pretty close to that of expect. Because you expect something to happen, you consider it in advance. The question here — one noted by Jan Freeman in her discussion of Bierce’s opinion — is what counts as preparation or consideration. If you expect something to happen, you’re almost certainly going to prepare for it, even if only mentally. And in some cases preparing for an event involves a specific type of inaction. For instance, I expect that my friend who told me he was going to get me Scott Pilgrim vs. The World on DVD will follow through, so I have anticipated this action by not watching the movie. The line is blurry, and that’s a good thing.

Why? It gives us flexibility, at least in my idiolect. Using anticipate when there isn’t obvious preparation triggers an implicature that the anticipater has undergone some sort of mental preparation for the event, even if nothing more than psyching oneself up for it. The OED lists “to look forward to” as a definition of anticipate. Expect, on the other hand, is far more neutral in its view of future events. That’s why fans anticipate their favorite bands releasing a new song, in addition to expecting it. There’s also a related sense in the OED of anticipate meaning expect as certain.

And I think that’s the trick with anticipate. People don’t generally use anticipate as a mere synonym of expect. I see it as “expect-plus”, where the addition can be positive feeling, preparation, certainty, or a range of other things. The “slipshod extension” Fowler mentions is not anticipate as mere expect, but anticipate with a wider range of preparations.

I’ve one last sentence, from Jack Lynch, to offer if you remain worried about anticipate and expect fraternizing:

“William Blake certainly didn’t expect Modernist poetry, but in some ways he anticipated it by doing similar things a century earlier.”

The meaning of this sentence is obvious even to someone like me, for whom the primary meaning of anticipate is “expect-plus”.

—

*: Are there words for either of these meanings (“spend in advance” or “cause to happen earlier”) in contemporary English? If you know of any, let me know, because I want to use them.

Last month, we grammar bloggers were all abuzz about the Queen’s English Society and their quixotic quest for the establishment of an academy to regulate the English language. The Society have already been clobbered by Stan Carey, Mark Liberman, John E. McIntyre, and David Mitchell.

There is little I could add to this quartet of brilliant battery, so instead of a general discussion of the Society’s shortcomings, I want to look at one of the things they’re complaining about as an example of bad English. The QES’s complaints are petty, insane, or both. Case in point: they’d like to see Ms. abolished. Why?

1. It’s an abbreviation, but it has no long form.

2. It’s “unpronounceable” since it lacks a vowel.

3. It was created by “certain” women who “suddenly became sensitive about revealing their marital status.”

Regarding point 1, this is a matter of being beholden to word labels. It reminds me of an objection I once received to preposition stranding; “preposition” suggests “in a position before”, and therefore a preposition at the end of a sentence, where it doesn’t precede anything, must be incorrect.

So it goes with abbreviations; if you want to be literal, an abbreviation is an abbreviated form of something. But Ms. doesn’t need to be a literal abbreviation to exist. It does exist, as anyone can plainly see. If it’s not an abbreviation, that doesn’t stop it existing any more than a mannequin not being human stops it existing.

Ms. isn’t an abbreviation, but rather a blend. It’s a combination of the two words Miss and Mrs., and it happens to inherit the closing period of the abbreviation Mrs., making it superficially resemble an abbreviation. That’s all.

And if we’re doing an abbreviation witch-hunt, what is Mrs. short for? Missus, one might say, but that isn’t really a word of its own as much as a spelling of the pronunciation of Mrs. Etymologically, Mrs. is an abbreviation of mistress, but the meaning of that word has changed sufficiently that you’d be stirring up a good deal of trouble if you called someone’s wife a “mistress”. I would argue that in modern English Mrs. itself is no longer an abbreviation, but a fully independent lexical item, much like Ms.

Regarding point 2, well, we all manage to pronounce Ms. pretty well, despite the lack of a vowel supposedly rendering it unpronounceable. How do we do it? Technically speaking, the standard pronunciation of Ms. doesn’t have a vowel. We were told in school that all words need to have vowels, since each syllable has to have a vowel, but that’s not quite right. Some consonants can function as the nucleus of a syllable, just like a vowel. This is more apparent in some non-English languages, such as Berber or the Slavic languages. For instance, in Czech or Slovak, you can apparently tell someone to stick their finger through their throat by saying Strč prst skrz krk, a sentence where every word has a syllabic r in lieu of a vowel.

English does this, too, albeit more rarely. We often reduce and down to a syllabic [n] or [ŋ] between words (as in the restaurants Eat ‘n Park or In-N-Out), and word-final [l] and [r] are sometimes syllabic as well (as in bottle [boɾl] or pepper [pepr]). As you might have guessed, [z] is another syllabic consonant, which explains how we are able to pronounce [mz] as a stand-alone word.

Again, I don’t mean to demonize Mrs., but if we’re getting rid of vowel-less words, wouldn’t we have to get rid of it, too? Mrs. lacks a vowel orthographically, and has to trade its r for two [ɪ]s and an extra [z] just to get pronounced (as [mɪzɪz])! Now that’s unpronounceable!

Regarding point 3, this is a contentious point, and I don’t want you to think that I’m caricaturing the QES, so let me quote the entirety of their paragraph on it:

“This linguistic misfit [Ms.] came about because certain — note: certain, not all — women suddenly became sensitive about revealing their marital status. Or perhaps they were annoyed that they could not identify a man as married or single by his title. We won’t begrudge these women their complexes but surely there is a better solution to their problem than an unpronounceable buzz!”

Women, amiright? Well, no. Actually, the original push for Ms. was to avoid mistaking a married woman for an unmarried woman or vice versa. Ben Zimmer found the first known proposal for Ms. in a 1901 newspaper column (probably written by a man), which says:

“Every one has been put in an embarrassing position by ignorance of the status of some woman. To call a maiden Mrs is only a shade worse than to insult a matron with the inferior title Miss.”

This is certainly a conundrum that I face often. Ms. is not (only) popular because women rightfully feel no need to disclose their marital status*, but because it offers a way for both males and females to address a woman whose marital status is unknown.

Of course, the QES has a counter-proposal to make Ms. unnecessary. They propose introducing an unmarried male title to complete the symmetry with Miss and Mrs. and then to make the choice of titles rely on age. Despite the QES’s claim that this is “so simple and sensible”, I think any reasonable person will see that this is a far inferior solution, and so I won’t bother with further comment on that numbskullery.

Summary: Ms. isn’t some recent feminist invention, it’s pronounceable, and it’s a useful addition to English. There is no reasonable reason to oppose it.

—

*: Not to mention that marital status isn’t all or nothing. What is the right title for someone divorced, widowed, separated, etc.? Ms. is a convenient way to solve that problem of etiquette.

I know it’s become common over these last few posts for me to discuss etymological fallacies, but that’s only because they’re so easy to disprove. They’re like a little vacation for me, a pathetic little vacation I take without moving from in front of my computer.

The current etymologically-motivated complaint I’ve grown tired of is the claim that you can’t say center around. This one’s fun because it involves geometry. (I must confess that I never actually took a geometry class. Instead, I took topology, which is sort of like geometry where the entire world is made of infinitely flexible rubber. This is why I can think of geometry as fun.)

Suppose you have a circle O. The circle gets its name from its center, O, so you might want to say that the circle O is centered at point O. But at the same time, the circle is located all around point O, so I can’t see anything unreasonable in saying that the circle is centered around point O either.

Other people, though, can. And they do. I especially like the exhortation of that last link:

“How can you center around anything? You cannot. You can center on or focus on something, but not around it. Think about it!”

So I started to think about it. And try as I might, I couldn’t see how you could center on something. Take, for instance, that king of three-dimensional objects, the sphere. Suppose the sphere X is on point O. Then the sphere X must be above point O, by the definition of on. To have point O as its center, sphere X must extend equally in all directions from point O. (More generally, point O must be the average of all points in object X.) But it can only be the case that X is on O and X has O as its center if X is an impossibly droopy sphere; otherwise the center of X will have to be above O. Even if we move beyond spheres, it’s really only inverted bowl-type mathematical objects that could rest on their own centers. So really, isn’t center on illogical, too? Oughtn’t it to be center at?*
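The claim in the paragraph above is easy to check numerically. Here’s a minimal sketch (the sampling approach and names are my own, not anything from the discussion) that averages random points on a unit sphere resting on the origin, to confirm that the sphere’s center sits a full radius above the point it’s on:

```python
# A quick numerical check: if a unit sphere rests *on* the origin, its
# center (the average of its surface points) sits a full radius above
# the contact point, not at it.
import math
import random

random.seed(0)

def random_point_on_sphere(center, radius):
    """Sample a uniformly random point on a sphere's surface."""
    # Normalize a Gaussian vector (uniform direction), then scale and shift.
    v = [random.gauss(0, 1) for _ in range(3)]
    norm = math.sqrt(sum(x * x for x in v))
    return [c + radius * x / norm for c, x in zip(center, v)]

# A unit sphere resting on the origin has its geometric center at (0, 0, 1).
points = [random_point_on_sphere((0, 0, 1), 1.0) for _ in range(100_000)]
centroid = [sum(coords) / len(points) for coords in zip(*points)]

print(centroid)  # ≈ [0, 0, 1]: a full radius above the point the sphere rests on
```

The code only restates the geometry, but it makes the point concrete: a sphere that is on a point cannot also be centered at that point unless it droops impossibly.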

Please tell me you’re thinking that this discussion is really stupid. It is, isn’t it? After all, if geometric logic really determined the proper choice of preposition in an idiomatic construction, we’d all be saying that this debate centers at a contentious point. And of course, we don’t. The Google n-gram corpus has 4858 examples of “debate centers on” and 1763 of “debate centers around”, but does not have a single attestation of “debate centers at” in a trillion words. Language isn’t geometry, and there is no reason to try to make it so. As the Merriam-Webster Dictionary of English Usage (MWDEU) puts it, “[…] questionable or sound, logic is simply not the point. Center around is a standard idiom […]”

So let’s stop dismissing center around out-of-hand for “logical” reasons and look at it dispassionately. How standard of an idiom is it? Well, it’s a fairly old construction; the OED first attests it in 1868 in Edward Freeman’s solidly scholarly The History of the Norman Conquest:

(1) “It is around the King..that the main storm of battle is made to centre.”

Google Books has some even older attestations:

(2a) “Clouds of deep crimson centered around him, and one would think, by the glory of his parting, he was loath to deprive the earth of her light […]” [1824]
(2b) “[…] I occasionally acted as chaperon to Miss Jameson, but as my hopes centered more trustfully around one object, my taste for general society diminished […]” [1840]
(2c) “His thoughts returned to Miss Percival; his hopes again centered around her.” [1840]

Is center on any older? Not much. The OED’s first attestation of center on comes from 1789, but this usage is based on the obsolete definition “to converge on”. If we don’t accept that example because of the (subtle) difference in meaning, the next attestation is in 1867. That makes it roughly contemporaneous with the first attestation of center around. Google Books has some older attestations, although they might fit better with the “converge on” meaning:

(3a) “Our hope centered on God in Christ, and our hearts ready to leave the world.” [1775]
(3b) “Had it centered on a monarch, it would have given the means of a vigorous and healthy government; but it never centered on a monarch.” [1834]

MWDEU notes that up through the 19th century, in was the primary idiomatic preposition used with center, alongside a smattering of on, upon, and around. More recent usage has shifted these proportions, with on and around taking precedence in American English and round frequent in British English. (in has really fallen by the wayside.) And between the emergence of center around and grammarians’ first complaints about it in the 1920s, no one seems to have thought it illogical. I guess they just weren’t as good at geometry back then.

Summary: There’s nothing illogical about center around, at least nothing inconsistent with the logic of language. (And center on isn’t a paragon of logic itself.) Regardless of the question of logic, center around has been around for 150 years, and there’s no reason to ditch it now.

—

*: And while we’re at it, why are prescriptivists willing to accept center as a verb in the first place? Don’t they know the verb center comes from the noun center? I thought they hated all the bastard verbs that come from nouns, like access.

About The Blog

A lot of people make claims about what "good English" is. Much of what they say is flim-flam, and this blog aims to set the record straight. Its goal is to explain the motivations behind the real grammar of English and to debunk ill-founded claims about what is grammatical and what isn't. Somehow, this was enough to garner a favorable mention in the Wall Street Journal.

About Me

I'm Gabe Doyle, currently a postdoctoral scholar in the Language and Cognition Lab at Stanford University. Before that, I got a doctorate in linguistics from UC San Diego and a bachelor's in math from Princeton.

In my research, I look at how humans manage one of their greatest learning achievements: the acquisition of language. I build computational models of how people can learn language with cognitively-general processes and as few presuppositions as possible. Currently, I'm working on models for acquiring phonology and other constraint-based aspects of cognition.

I also examine how we can use large electronic resources, such as Twitter, to learn about how we speak to each other. Some of my recent work uses Twitter to map dialect regions in the United States.