Q: Isn’t a “categorical” denial a limited denial restricted to a specific category? Why do people “categorically” deny something when they should be denying it “uncategorically”?

A: A “categorical” denial is an unconditional one, not merely a denial applying to a single category.

The Oxford English Dictionary says “categorical” entered English in 1598 as a term in logic. A “categorical” proposition was – and still is – one “asserting absolutely” and “not involving a condition or hypothesis,” according to the OED.

The adjectives “categorical” and the now obscure “categoric” are from the Latin categoricus, derived in turn from the Greek kategorikos, meaning accusatory or affirmative.

In the 17th century, “categorical” acquired the meaning of “direct, explicit, express, unconditional,” as in a “categorical” statement or denial.

And this is the principal sense today of the adjective “categorical” (as well as the adverb “categorically”).

The word “category” entered English in 1588, also as a term in logic; its original meaning was a predication or an assertion, a sense borrowed from Aristotle.

The Chambers Dictionary of Etymology says the word is derived from the Greek kategoria, whose roots originally meant to assert or speak in an assembly.

The usual meaning now (“a class, or division, in any general scheme of classification”) came into use in 1660, the OED says.

Q: During a college lecture on German declensions, my professor said “nonce” in the phrase “for the nonce” is the only remaining word in English that retains a piece of an early English declensional ending. Is this the case?

A: The phrase “for the nonce” (meaning “for the occasion” or “temporarily”) has been seen in various forms and spellings over the years.

The version you ask about is the result of a mistake in medieval times as Old English was evolving into Middle English and declensions were falling into disuse.

(In Old English, as in modern German, a word may change its form – that is, be declined – to show its function in a sentence.)

In Old English, the language spoken in Anglo-Saxon days, the phrase was seen as to tham anum, meaning “for that one thing.”

By early Middle English, when the article “the” was still being declined, the expression was for then anes, meaning “for the one.” (The word then was the form “the” took with a singular neuter indirect object.)

Sometime in the 12th century, according to the Chambers Dictionary of Etymology, English speakers (apparently because of their declining grasp of declensions) mistakenly thought the expression for then anes was for the nanes.

The earliest published reference in the Oxford English Dictionary for the erroneous version dates from around 1200: All forr the naness.

The first OED citation for the full “nonce” version of the phrase is from Shakespeare’s Henry IV, Part 1 (1598): “I have cases of Buckram for the nonce, to immaske our noted outward garments.”

Does “nonce” preserve a snippet of an early declensional ending? Perhaps, but I see the word not as a relic of Old English declensions, but as a reminder of English’s evolution by trial and error.

The word “nickname,” for example, is the result of a similar error. It’s derived from an extremely old word, “ekename” (an “eke” is an addition or a piece added on). The first published reference to “ekename,” according to the OED, appeared in 1303.

Over the years, the pronunciation of “an ekename” was misunderstood as “a nekename,” which in turn led to the modern word “nickname,” first recorded in the 17th century. I touched on this subject last year in a blog posting about nicknames.

The same thing, but in reverse, happened with “an apron” (originally “a napron”). And our word “orange” went through a similar transformation before entering English. It started in Old French as une narange (borrowed from the Arabic naranj), but it became une arange, une orenge, and eventually une orange. It entered English from Old French in 1380 – as “an orenge.”

Is “nonce” unique in preserving part of a declensional ending from Old English? Not by a long shot. The words “who” and “whom,” “he” and “him,” “she” and “her,” and others reflect similar Old English endings.

I might mention here that “nonce word” (a word coined or used for a particular occasion) is a term that James Murray, the founding editor of the OED, coined and used in 1884 for the first edition of the dictionary.

Q: I love Origins of the Specious and intend to shill it whenever I can, which brings me to the reason I’m writing: Is the word “shill” derived from the British shilling?

A: I’m glad you like the book, but I may have to disappoint you about the origin of “shill” in the sense of to pose as a satisfied customer to encourage buyers.

The word first showed up in the United States in the early 20th century, as a verb in 1914 and as a noun in 1916, according to the Oxford English Dictionary.

The OED defines the verb as to “act as a shill” and the noun as a “decoy or accomplice, esp. one posing as an enthusiastic or successful customer to encourage other buyers, gamblers, etc.”

The dictionary describes “shill” as “slang (chiefly N. Amer.)” and says it may be an abbreviation of “shillaber” (1913), which the dictionary simply defines as a shill. As for the etymology of “shillaber,” the OED says, “Origin unknown.”

The Chambers Dictionary of Etymology also makes a possible “shillaber” connection and adds that the usage was probably of “circus or carnival” origin.

The “shilling,” a former British monetary unit, is derived from an Old Frisian or Old Saxon coin called the skilling, according to the Chambers reference.

The dictionary’s etymologists speculate that the word may ultimately be derived from one of three ancient roots: skell (to resound), skel (to divide, as of gold or silver), and skeld (shield).

None of my language references connect “shill” and “shilling,” but I suppose it’s possible a coin that rings true and a shill that sings false may ultimately descend from an ancient root that resounds. I wouldn’t put money on it, though.

Pat will be speaking at 2 p.m. Sunday, June 28, at Minor Memorial Library, 23 South St., in Roxbury, about myths and misconceptions of the English language. Admission will be free.

There will be refreshments, and Pat will sign copies of her latest book, Origins of the Specious, with her co-author, Stewart Kellerman.

Q: When I taught 8th-grade grammar back in the ‘70s, I used to tell my students that “may” meant permission, while “might” meant possibility. Is that no longer the case? I often hear the words used interchangeably now.

A: There are two issues here: “may” is an auxiliary verb meaning to be allowed or permitted, but it is also an auxiliary indicating likelihood or possibility (like “might”).

Let me explain this possibility business by quoting a section from my grammar and usage book Woe Is I:

“May is a source of our word maybe, and that’s a good clue to how it’s used. We attach it to another verb (happen, for example) to indicate the possibility of something’s happening. If we say something may happen, we mean it’s possible or even probable.

“Might is a slightly weaker form of may. Something that might happen is a longer shot than something that may happen. I may get a raise is more promising than I might get a raise.

“Although your dictionary will tell you that might can be the past tense of may, either one can be used in the present tense (She may break a leg; She might break a leg) or in the past (She may have broken a leg; She might have broken a leg). The form you choose depends on the degree of possibility and can radically change your meaning. A bulletproof vest may have saved him implies that he was saved. A bulletproof vest might have saved him implies that he wasn’t.

“There’s an exception to this rule of possibility. … If a sentence has other verbs in the past tense, use only might: She thought [past] she might have broken a leg … Eloise was [past] afraid they might lose everything … Frank said [past] he might leave early.”

Q: As a sophomore in high school (50 years ago), I asked my English teacher if “the reason why” is a redundancy. He punted on the answer. I was, and still am, unsatisfied. To my mind, the word “reason” MEANS why … no?

A: I don’t consider “the reason why” a redundancy in a sentence like this: “The reason why the brakes failed is unknown.”

It’s true that “why” could be eliminated, but that doesn’t make it incorrect. This is an idiomatic usage that’s been around since the Renaissance, according to The American Heritage Dictionary of the English Language (4th ed.).

And no, the word “reason” (a noun) doesn’t mean “why” (a conjunction) here. In this expression, “why” means “for which” or “on account of which,” according to American Heritage.

As Bryan A. Garner points out in Garner’s Modern American Usage, the phrase “the reason why” is no more redundant than “the time when” or “the place where.”

However, the expression “the reason is because” is an outright redundancy, since the word “because” means “for the reason that.” I once wrote a blog entry about this klutzy usage.

Redundant or not, both expressions (“the reason why” and “the reason is because”) are extremely common and likely to remain in the language – with or without the approval of our English teachers.

I can’t end this item without a snippet from Tennyson’s 1854 poem “The Charge of the Light Brigade”:

Theirs not to reason why,
Theirs but to do and die,
Into the valley of Death
Rode the six hundred.

Q: I’ve read that the word “dasn’t” is common in a small community in Nova Scotia founded by German immigrants in the 1800s. And my grandmother, who was born to German-immigrant farmers in Wisconsin in the 1860s, also used it. All of this makes me wonder if “dasn’t” originated among German immigrants.

A: The word “dasn’t” – also spelled “dassn’t” or “dassent” – is a regionalism, found mostly in the Northeast. It’s a contraction for “dare not,” “dares not,” or “dared not.”

I see only one citation for it in the Oxford English Dictionary, from Eugene O’Neill’s 1931 play Mourning Becomes Electra: “You dasn’t stay there till moonrise at ten o’clock.”

Although the OED doesn’t have an entry for “dasn’t” or any information on the word’s history, I assume it evolved because “dasn’t” was easier to say than “daren’t” or “daresn’t” or “daredn’t.”

I wrote a blog item last fall about a related word, “dast,” which some authorities speculate may have come about as a back-formation of “dasn’t.” (A back formation is a word formed by dropping a real or imagined part from another word.)

Another related term, “durst,” an old past tense and past participle of “dare,” goes back (spelled various ways) to Old English, which, like modern English, is a Germanic language.

The Old English verb durren is a cognate (an etymological cousin) of the Old High German gitturan (to dare), which bears a slight resemblance to the modern German verb dürfen (to be allowed or permitted, to dare, to be likely).

But he notes that the early citations aren’t limited to German speakers or to the Northeast. He mentions early examples from Missouri, Indiana, Tennessee, Alabama, and Georgia, as well as New England.

Sorry I can’t be more helpful. I’ll probably lie awake tonight thinking of “dasn’t,” like the character in Huckleberry Finn who’s “that scared I dasn’t hardly go to bed, or get up, or lay down, or SET down.”

Q: I am writing from Bangalore with an issue that may seem nit-picking, but I want to get it right. Which of these sentences is correct? (1) “We are the best,” roared Barcelona. (2) “We are the best”, roared Barcelona. The point is the placement of the comma.

A: The American system of punctuation always calls for placing a comma or period inside the closing quotation marks: “We are the best,” roared Barcelona. Or: Barcelona roared, “We are the best.”

But the British system often calls for placing the comma or period outside the quotation marks: “We are the best”, roared Barcelona. Or: Barcelona roared, “We are the best”. This is true, for example, if what’s being quoted is only part of the original; the fans might actually have cried, “We are the best in the hemisphere!”

As for colons and semicolons, Americans always place them outside the closing quotation marks: “We are the best”; the crowd was deafening.

The British sometimes place a colon or semicolon outside closing quotation marks and sometimes inside them. But the colon or semicolon goes inside only if it’s part of the original quotation.

Question marks and exclamation points are treated the same way in both the American and the British systems. The question mark or exclamation point is placed inside the closing quotation marks only if it’s part of the quotation.

Which system should you follow? The country in which you find yourself generally has an affinity for one system or the other, American English or British English.

I’d go with whatever system is more common where you live. I imagine that you (writing from India) should follow the British system.

If you’d like a bit of brushing up, I wrote a blog entry a couple of years ago about how to use quotation marks with other punctuation, at least in American usage. And I wrote one last year on how to punctuate a question within a question.

In case you’re interested in more about American versus British English, I wrote a blog item last year about a similar question, the differences in the way prepositions are handled from country to country. And I devote a whole chapter of my new book, Origins of the Specious, to myths about American and British English.

Q: I’ve noticed a speech habit the last few years that bugs me: beginning a sentence, particularly a response, with “so.” I hear it all the time on NPR; if anything, it’s a habit of more educated people. Am I fussing about nothing?

A: You aren’t the first person to write me about this tendency of people on NPR – both interviewers and interviewees – to use “so” indiscriminately at the beginning of sentences.

Why do they do it?

I’m guessing here, but interviewers may begin their questions with “so” because it’s an easy way to get into a topic without taking the trouble to find a more graceful entry.

And interviewees may use “so” because it gives them a moment to gather their thoughts – that is, to stall for time.

Although many people find this “so” business annoying, it’s not ungrammatical. In fact, the Oxford English Dictionary says the use of “so” as “an introductory particle” goes back to Shakespeare’s day.

Interviewers as well as interviewees tend to run out of new ideas after a while, and when one of them starts briskly with “so,” then others jump on the usage.

Thus the thing snowballs as it becomes more popular, and eventually starts to resemble a verbal tic permeating the airwaves.

Scientists and academics may be more prone to this habit, since “so” is a handy way of leading from one related idea to another.

The overuse of “so” in interviews will probably go away when it starts to sound too worn-out. And so it goes.

In case you’re interested, I wrote a blog entry a while back about “so” at the beginning of a clause. The posting has links to some related uses of “so.”

Q: As an SAT writing instructor, I am intrigued by your Grammar Myths page, which debunks the rule that “none” is always singular. Since the College Board follows this rule, we have thousands of students learning to write sentences like “None of the chickens is hatched.” What do you think about that?

A: What do I think? I think it’s unfortunate that the College Board may be penalizing students who are in fact using the language correctly by writing, “None of the chickens are hatched.”

It is not true that “none” always means “not one.”

It is true that “none” is an etymological descendant of the Old English pronoun nan, which indeed is a combination of ne (“not”) plus an (“one”). But “any” is also descended from the Old English an, and historically “none” has always been closer in meaning to “not any.”

As you know, “any” can be either singular or plural; it can refer to “any of it” (as in “any of the mail”) or to “any of them” (as in “any of the letters”). Hence these sentences are both correct: “None of the mail was delivered” … “None of the letters were delivered.”

As my husband, Stewart Kellerman, and I write in our book Origins of the Specious: Myths and Misconceptions of the English Language:

“It seems that ‘none’ has been both singular and plural since Anglo-Saxon days. Alfred the Great used it as a plural back in the ninth century, when he translated a work by the Roman philosopher Boethius. Although the OED lists numerous examples of both singular and plural ‘nones’ since Alfred’s day, it says plurals have been more common, especially in modern times.”

Let me also quote Merriam-Webster’s Dictionary of English Usage: “Clearly, none has been both singular and plural since Old English and still is. The notion that it is singular only is a myth of unknown origin that appears to have arisen late in the 19th century.”

Here’s the advice I give in my grammar book Woe Is I: If “none” means “none of it,” treat it as singular (“None of the merlot is open”); if “none” means “none of them,” treat it as plural (“None of the carafes are full”). And if you mean “not one,” then say or write “not one.”

I hope the College Board is not also perpetuating the myths that it’s incorrect to “split” an infinitive or to place a preposition at the end of a sentence or to begin a sentence with a conjunction. These, too, are well-known grammatical misconceptions that are alien to the syntax of a Germanic language like English.

If any visitors to the blog would like to read more about these and other myths of English, check out Origins of the Specious.

Q: Your recent entry on “due process” made me think of “due diligence,” as in “He did due diligence,” which is an odd one to parse. If “diligence” means “conscientiousness,” then it’s not really something one does, is it? Just a thought for a rainy day when you don’t have a blog idea … although the supply seems inexhaustible.

A: Well, it’s a rainy day, but the supply is indeed inexhaustible. In fact, I apologize to all those who’ve sent in questions but haven’t gotten answers yet.

Now, let’s give diligence its due.

In the phrase “due process,” first recorded in 1447 (more fully, “due process of law”), the adjective “due” means proper or in accordance with established rules, according to the Oxford English Dictionary.

As for “due diligence,” first recorded in 1598, “due” means appropriate, sufficient, or proper. This sense of the adjective has been in use since around 1400.

The word “diligence” here carries more of an active than a passive sense. It means care and attention, industry, endeavor, and effort to accomplish what is undertaken.

So “due diligence” means something like “the necessary care” or “the effort required.” And one can “exercise” or “perform” or “do” it.

Now, I’d better get back to exercising due diligence on the rest of the questions in my mailbox.

A: H-m-m. I never thought of that. I must have been catnapping.

The noun “cat,” of course, is a very old word, dating from around the year 800, according to the Oxford English Dictionary, which defines it thusly: “A well-known carnivorous quadruped (Felis domesticus) which has long been domesticated, being kept to destroy mice, and as a house pet.”

The verb first appeared in English in the mid-18th century with the nautical meaning of to raise anchor to the cat-heads, or beams, projecting from the bows of a ship.

In the mid-19th century, the verb took on the meaning of to flog with a cat-o’-nine-tails, according to the OED. Here’s an 1865 citation from the Spectator: “Thirty of them were lashed to a gun, and catted with fifty lashes each.” Yikes!

By the way, the expression “no room to swing a cat” has nothing to do with the cat-o’-nine-tails. If you’d like to read more, I wrote a blog item about this cat-swinging business a few years ago.

Interestingly, the noun “dog” first showed up in English a couple of centuries after the appearance of “cat,” according to the OED. Before then, a dog was referred to as a hund, the Old English word for “hound.”

The verb “dog,” however, has been used since the early 16th century in the sense of to follow closely and stubbornly – that is, doggedly. And that brings us to a linguistic term with a following.

A “cataphora” (pronounced kuh-TAFF-ur-uh) is a pronoun or other stand-in for a following word or phrase – for example, the use of “her” to refer to “Sally” in this sentence: “With her, Sally had a bichon and two poodles.”

Finally, a “cataphor” is an obsolete term for deep sleep. It comes from the Latin for coma and the Greek for an attack of lethargy. Speaking of which, I think it’s time for me to take a catnap.

Q: Here’s one that’s been bugging me. When referring to something written, is it fair game to use the verbs “say,” “tell,” “talk,” and “speak”? I’m thinking of a sentence like this: “She told me in an email she’d be late.” And, by extension, can a watch or a radio say something? For example, “My watch says we’re five minutes late.”

A: All of the verbs you mention – “say,” “tell,” “talk,” and “speak” – can be used to refer to written as well as oral communications. Here’s what the Oxford English Dictionary has to say (see what I mean?) about each of them.

(1) “Say”: One meaning is “to utter or pronounce (a specified word or words, or an articulate sound). Also, in wider sense, used of an author or a book, with quoted words as object.” The word is used “of a speaker, writer; also of a literary composition, a proverb, etc.”

So well-established is this sense of “say,” according to the OED, that “its use with reference to written expression does not ordinarily, like the similar use of speak, involve any consciousness of metaphor.”

As for whether a watch or a radio can say something, this is a legitimate usage too. The OED notes that “say” can be used “with an inanimate item as subject: to communicate or represent; esp. of a clock, calendar, etc., to show (a certain time or date); of a notice, to state (a certain message).”

(2) “Tell”: One of the definitions given is “to make known by speech or writing; to communicate (information, facts, ideas, news, etc.); to state, announce, report, intimate.” No problem there either.

(3) “Talk”: The primary meaning is of course to communicate by using speech. But the OED says it is also used “by extension: To convey information in some other way, as by writing, with the fingers, eyes, etc.”

(4) “Speak”: Principally, this means “to utter or pronounce words or articulate sounds; to use or exercise the faculty of speech; to express one’s thoughts by words.” But another meaning is “to state or declare in writing, etc.”

And here’s another use: “Of a writer, literary composition, etc.: To make a statement or declaration in words; to state or say.” And “speak of” means “to mention, or discourse upon, in speech or writing.”

Although it’s fine to say a book “speaks” or “talks” of something, I think it’s venturing a little too far into metaphor to use those verbs with a newspaper. A newspaper article or columnist may “speak” or “talk,” but I’m not ready to accept that a newspaper itself can.

How about other inanimate physical objects? Can they speak or talk? Only if they produce sounds, like radios, TVs, smartphones, computers and so on.

Q: Here’s a quick one for you: In the book I’m writing, I have a character say, “I gave him the thumbs up.” Why do we use the plural “thumbs” in this expression when we use only one thumb to make the gesture?

A: Why the plural? Because when the expression originated, it referred to the many people (a coliseum full of them, in fact) who were voting with their thumbs. But back then, “thumbs up” was bad news.

Under its entry for “thumb,” the Oxford English Dictionary notes that “thumbs down” and “thumbs up” were originally “expressions referring to the use of the thumb by the spectators in the ancient amphitheatre, to indicate approbation or the opposite.”

In the time of the Romans, “thumbs down” signaled spare him, while “thumbs up” was a death warrant.

In modern usage, the significance of the signals has been reversed, according to the OED, so “thumbs down” now means “disapproval or rejection,” while “thumbs up” is “a sign of approval, acceptance, encouragement, etc.”

Rudyard Kipling, for example, used the modern sense in Puck of Pook’s Hill (1906), “We’re finished men – thumbs down against both of us.”

So did Arthur Guy Empey in a 1917 glossary of terms used in the trenches: “Thumbs up, Tommy’s expression which means ‘everything is fine with me.’”

Q: Can you please explain the silly expression “thank you kindly.” It seems sort of self-congratulatory!

A: In an early edition (1921) of his book The American Language, H. L. Mencken suggests that “thank you kindly” was brought to America by Irish immigrants who were “almost incapable of saying plain yes or no” and “must always add some extra and gratuitous asseveration.”

“The Irish extravagance of speech struck a responsive chord in the American heart,” Mencken adds. “The American borrowed, not only occasional words, but whole phrases, and some of them have become thoroughly naturalized.”

He notes that P. W. Joyce, author of English as We Speak It in Ireland (1910), “shows the Irish origin of scores of locutions that are now often mistaken for native Americanisms, for example, great shakes, dead (as an intensive), thank you kindly, to split one’s sides (i. e., laughing), and the tune the old cow died of, not to mention many familiar similes and proverbs.”

Interestingly, the expression “thank you kindly” doesn’t appear in my 1937 edition of Mencken’s book. Perhaps he changed his mind about its origins.

Unfortunately, I can’t find any other references that might explain the expression, which is undoubtedly odd, rather like greeting someone with “Hello expectantly” or saying farewell with “Goodbye reluctantly.”

Though I can’t shed much light, I can pass on a poem, called “Graciousness,” that appeared in The English Journal in 1967:

I’d like to spank
Those oafs behindly
Who don’t just “thank…”
But “thank you kindly.”

Q: I’m interested in whether there’s a name for composite terms like “hanky-panky,” “willy-nilly,” “hurly-burly,” “boogie-woogie,” “hoi polloi,” etc. Can you shed any light on this puzzling category of words?

A: These terms are sometimes called “rhyming compounds.” I wrote a blog entry a few months ago that touched on them. But “hoi polloi” isn’t one of them. It’s the English transliteration of a Greek phrase meaning “the many.” In English, it refers to the masses, often in a negative way.

Many usage experts condemn adding the definite article “the” to “hoi polloi” (as in “The hoi polloi are up in arms”) because “hoi” means “the” in Greek. But the Oxford English Dictionary says the phrase is “normally preceded by the definite article” in English.

In fact, the first published reference in the OED for the English version of “hoi polloi” includes the extra article.

James Fenimore Cooper, in Gleanings in Europe by an American (1837), writes that “a few great men … form the front of every honorary institution … after which the oi polloi are enrolled as they can find interest.” (I’ve filled out the OED citation with excerpts from the book.)

As for rhyming compounds, The Columbia Guide to Standard American English describes them as “catchy and surprisingly durable self-imitating words such as nitty-gritty, hanky-panky, hurdy-gurdy, namby-pamby, and itty-bitty.”

If you’d like to read more about these toothsome twosomes, the book English Words, by Francis Katamba (Routledge, 1994), has an interesting analysis of the linguistic structures of various kinds of rhyming compounds. See page 54.

Q: What is the origin of the phrase “double-jointed”? And how did its usage get to be so widespread when the word “flexible” suffices?

A: The phrase “double-jointed,” meaning “having joints that permit a much greater degree of movement of parts of the body than is normal,” dates back to the early 19th century, according to the Oxford English Dictionary.

The first published reference in the OED (at least in the anatomical sense) dates from 1831, when the phrase appeared in John Roby’s Traditions of Lancashire, a study of English folklore.

In one of the tales cited, a character says of another: “The knave is shrewd and playful, but of an incredible strength, being, as ye may observe, double-jointed.”

The expression apparently found early favor in the medical community. A variation on the theme appeared in 1912 in the British medical journal The Lancet, in an article describing a boy with “double-jointedness.”

“The joints were very loose,” the article said, “and the child took particular pleasure in forming almost circles by locking the index and middle finger of each hand.”

Nowadays, however, physicians prefer the terms “hypermobility” or “hyperlaxity” to “double-jointedness.”

In fact, the adjective “double-jointed” and the noun “double-jointedness” are misnomers; people with the condition don’t have any more joints than people without it.

Why has “double-jointed” proved so popular with the public? Perhaps because it’s more evocative than “flexible” – or “hypermobile” or “hyperlax.” Misnomer or not, it’s still very much with us.

Q: I’ve found an earlier citation for “Tom, Dick, & Harry” than the one you cite in your Feb. 18, 2007, posting about the expression. The 17th-century English theologian John Owen used the words in 1657. I discovered this on page 52 of God’s Statesman, a 1971 biography of Owen written by Peter Toon.

A: Congratulations! This predates the earliest citation for the combination “Tom, Dick, and Harry” in the Oxford English Dictionary: “1734 Vocal Miscellany (ed. 2) I. 332: Farewell, Tom, Dick, and Harry, Farewell, Moll, Nell, and Sue.” (The quotation appears to be from a song lyric.)

Your cite doesn’t beat an even earlier variation, though, in Shakespeare’s Henry IV, Part 1 (1598): “I am sworn brother to a leash of Drawers, and can call them by their names, as Tom, Dicke, and Francis.”

If I were you, I’d let the OED know about your find. The email address for contributing new evidence to the dictionary is oed3@oup.com. I’ve added a note about the Owen quotation to my original blog item.

Thanks for the information. I suspect that I’ll be doing a lot more updating. Language sleuths are discovering earlier and earlier citations for words and phrases as Google and others digitize millions of published works.

Q: Lately I’ve noticed the use by broadcast journalists of “worrying” as an adjective: “It’s a very worrying situation in Afghanistan.” My understanding is that the proper word to use here would be “worrisome.” Am I right?

A: I’m sorry to have to tell you that “worrying” is a much older adjective than “worrisome,” and has every claim to legitimacy.

The adjective “worrying” entered English, as far as we know, in the early 1600s. It was first recorded, according to the Oxford English Dictionary, in Philemon Holland’s translation of Camden’s Britain (1610): “A greater rabble of worrying freebutters.”

The OED says that back then the word meant “given to harrying or raiding.” This warlike usage grew out of the earliest meaning of the verb “worry,” first recorded in the 700s: “to kill (a person or animal) by compressing the throat; to strangle.”

After undergoing many changes along the way, the verb “worry” was first used to refer to mental distress in 1822. And the modern meaning of “worrying” as an adjective (“harassing; distressing to the mind or spirits”) was first recorded soon afterward, in 1826.

Dickens must have liked the word, because he used it in 1837 in The Pickwick Papers (“There are few things more worrying than sitting up for somebody, especially if that somebody be at a party”) and in 1853 in Bleak House (“Whatever the sound is, it is a worrying sound”).

“Worrisome” is a relative latecomer. The OED defines it as meaning “apt to cause worry or distress; given to worrying.”

The first published usage was in The Wigwam and the Cabin (1845), a story collection by the American author William Gilmore Simms: “I … followed the old man into the house, with my feelings getting more and more strange and worrisome at every moment.”

A: Some double names are hyphenated and some aren’t. In the case of longstanding double names, family tradition determines whether a hyphen is used.

For example, the British composer Ralph Vaughan Williams has an unhyphenated double last name. “Vaughan” is not a middle name; his family name is “Vaughan Williams.”

The same is true of Andrew Lloyd Webber, whose father was the composer William Lloyd Webber. Their family name is “Lloyd Webber.”

The English author Vita Sackville-West also inherited her double last name (but a hyphenated one) from her family.

The British royal family name is officially hyphenated too. While “Windsor” is the formal royal name and the one used in public, the Queen has decided that her direct descendants will carry a hyphenated double last name: “Mountbatten-Windsor.”

The decision to use (or not use) a hyphen is often not inherited but a matter of personal choice. One or both members of a couple getting married may choose to use the two last names.

They can decide to hyphenate them, like Farrah Fawcett-Majors or Chris Evert-Lloyd (who have since dropped their ex-husbands’ names), or not use a hyphen, like Hillary Rodham Clinton.

Sometimes a couple who keep their separate names may decide to give their children a double (either hyphenated or non-hyphenated) name.

As you can see, the only rule for hyphenating these new double names is that there’s no rule.

Q: Am I wrong to be irritated at the overuse of the term “hero”? I think of a hero as someone who does something heroic – say, running into a burning building to rescue a child. Instead, I’ve seen newspapers call Super Bowl champions “heroes.” If we cheapen the term, what do we use for true heroism?

A: I think you’re right. In fact, here’s what I say on the subject in my grammar and usage book Woe Is I:

“hero. There was a time when this word was reserved for people who were … well … heroic. People who performed great acts of bravery or valor, often facing danger, even death. But lately, hero has started losing its luster. We hear it applied indiscriminately to professional athletes, lottery winners, and kids who clean up at spelling bees. There’s no other word quite like hero, so let’s not bestow it too freely. It would be a pity to lose it. Achilles was a hero.”

So here we’re on the same side, though I suspect it’s the losing side.

I might add, however, that the word “hero” has long been used to describe heroic acts that aren’t quite as dramatic as running into a burning building to rescue a child. In fact, I plan to revise the above entry in the next edition of Woe Is I to give more consideration to the word’s history.

In Homer’s day, the Greek word heros referred to a man “of superhuman strength, courage, or ability, favoured by the gods,” according to the Oxford English Dictionary.

The word had that sense when it entered English in the 14th century, but by the 16th century it came to mean an illustrious warrior, one who does brave or noble martial deeds.

In the mid-17th century, however, the term was already being used more loosely to describe not only a brave warrior but a man who exhibits firmness, fortitude, or greatness of soul “in any course of action, or in connexion with any pursuit, work, or enterprise,” according to the OED.

A 1661 citation, for example, refers to Galileo and other astronomers as “illustrious Heroes.”

More recently, of course, the usage has become even looser. A 1955 citation refers to “an Italian hero sandwich,” which the OED describes as “U.S. slang, a very large sandwich.” I guess eating one would be considered heroic.

Q: As a youngster, it was drilled into me that the word “the” is pronounced THUH in front of a consonant (i.e., “the car”), but THEE in front of a vowel (“the other car”). Yet lately I hear news anchors use THUH before vowels. Is this now acceptable? Or did these people fail their English courses?

A: The pronunciation of the definite article “the” is determined by the sound of the following word (not merely by the letter the word starts with).

Most of us pronounce “the” with a long “e” before a vowel sound (as in “THEE apple” … “THEE hour” … “THEE umbrella”), and when stressed for emphasis (as in “This is THEE movie to see”).

Remember, the issue here is whether the following word begins with a vowel or consonant sound, not whether it begins with an actual vowel or consonant.

By the way, this isn’t some arbitrary rule thought up by the language police to make life hard for us. Rather, it has become a rule because it’s the natural way to pronounce “the.”

With most people, this is automatic. It’s much easier to say THEE before a vowel sound than to pronounce two UH sounds in a row (as in “THUH other”).

In other words, THEE and THUH evolved as common practice, and dictionaries list them as differing pronunciations of “the” before vowel and consonant sounds.

These are the standard pronunciations given in The American Heritage Dictionary of the English Language (4th ed.) and Merriam-Webster’s Collegiate Dictionary (11th ed.).

However, M-W does note that THUH is also heard sometimes before vowel sounds. So that pronunciation, while unusual (and, I think, awkward), isn’t considered incorrect – at least by Merriam-Webster’s.

You didn’t ask, but the indefinite article “a” also has two pronunciations. It’s generally pronounced UH (like the “a” in “about”). But it’s pronounced with a long “a” sound (as in “day”) when it’s stressed for emphasis: “Did you say you had caught AY fish or several fish?”

Q: Since the ‘80s, I’ve heard folks use the noun “quality” as an adjective meaning of good quality. I first noticed this when a baseball commentator spoke of a “quality pitch.” Soon hospitals were offering “quality health care.” Is that excellent or horrible quality? The quality of my mercy is strained.

A: Many usage experts agree with you and frown on “quality” as an adjective meaning “excellent” or “of high quality.” My feeling is that these mavens are going to have to get used to it.

In fact, the usage isn’t as new as you think. Merriam-Webster’s Collegiate Dictionary (11th ed.) says this sense of “quality” dates from 1936, half a century before you noticed it.

And hundreds of years before that, the word was used in compounds to describe something “of high social standing, of good breeding, noble,” according to the Oxford English Dictionary.

Q: I work with a lot of boys and find it interesting to hear so many of them say things like “I will verse you in a game of Pokémon.” I find it annoying to hear “verse” used to mean compete, but I have come to realize that I am witnessing the evolution of the word “versus.”

A: It’s interesting that you bring up the use of “verse” as a verb. I’ve gotten many emails from parents over the years asking where this came from.

One North Jersey father, for instance, has written that his kids use constructions like “We are versing the Yankees today.” And no, they weren’t reading poetry to the Yankees!

The usage is an apparent adaptation of “versus,” as you suggest, and to “verse” here means to play or challenge or go up against.

As it turns out, this isn’t such a new phenomenon. In fact, the kids who first used “verse” for compete are now grown up. The linguist and lexicographer Benjamin Zimmer has traced the usage back to the 1980s.

Here’s a citation from the Feb. 20, 1984, issue of the New York Times: “To verse: High school slang meaning to compete against another school’s team, as in ‘We’re going to be versing the Brown Bombers next week.’ From the preposition ‘versus.’ ”

You can see how this might have happened. Imagine a sportscaster saying, “Tonight at 8, Boston versus Cincinnati.” To many ears, the preposition “versus” sounds like a verb, “verses,” as in “Boston verses (that is, plays) Cincinnati.”

Now imagine a child passing on the news: “Hey, Dad! Tonight Boston verses Cincinnati.” Thus a new verb is born.

There’s already a recognized verb “verse” that means to study or acquaint oneself with some subject, as in “I’m well versed in such-and-such,” or “He’s versing himself in geometry.”

The verb “versify” means to write verse. And The Dickson Baseball Dictionary (3d ed.), by Paul Dickson, notes a historical use of the noun “verse” as a synonym for “inning.”

The use of the verb “verse” to mean compete hasn’t yet made it into The American Heritage Dictionary of the English Language (4th ed.) or Merriam-Webster’s Collegiate Dictionary (11th ed.), but I wouldn’t bet against it.

Q: I’m confused by this quote from Stieg Larsson’s thriller The Girl with the Dragon Tattoo (2008): “Blomkvist had often wondered whether it were possible to be more possessed by desire for any other woman.” The use of the plural “were” strikes me as awkward. Wouldn’t it be more natural to use the singular “was”?

A: In this sentence, Larsson (or, rather, his translator) was using the subjunctive mood (“whether it were possible”) instead of the indicative mood (“whether it was possible”). In the subjunctive, “was” becomes “were,” and the switch has nothing to do with plurals vs. singulars.

Now for the real question: Was it appropriate for the translator (Larsson wrote in Swedish) to use the subjunctive in the sentence you quoted?

English speakers use the subjunctive mood (instead of the normal indicative mood) on three occasions:

(1) When expressing a wish: “I wish I were taller.” [Not: “I wish I was taller.”]

(2) When expressing an “if” statement about a condition that’s contrary to fact: “If I were king …” [Not: “If I was king …”] Larsson is extending this to a “whether” statement, since “whether” sometimes has the meaning of “if.”

(3) When something is asked, demanded, ordered, suggested, and so on: “I suggest he get a job.” [Not: “I suggest he gets a job.”]

I wrote a brief explanation of the subjunctive for the blog a few years ago.

So, was Larsson’s translator justified in using the subjunctive? I don’t think so.

It’s conceivable that Blomkvist might one day be more possessed by desire for another woman. So there’s no need for the subjunctive here, and the translator should have written “whether it was possible.”

The subjunctive is appropriate only when a statement is clearly contrary to fact. For example: “Blomkvist had often wondered if [whether] he were dead.”

Q: BAY-zul or BAZZ-ul? KYOO-min or KUM-in? The first pronunciation in each pair is the one I hear most often today; the second is the pronunciation I grew up with. I’m wondering if cooking shows are responsible for this. Julia Child most certainly pronounced them the way I was taught. This will not change the world or stop global warming. It’s just something I want to get off my chest.

A: The pronunciation of herbs (and the word “herb” itself!) comes up a lot in my email. Many herbs have several acceptable pronunciations, as you’ll find when you look them up.

But even dictionaries can change their stripes. “Cumin” is an interesting example.

My 1956 Webster’s New International Dictionary of the English Language (2d ed.) says “KUM-in” is the only correct pronunciation. But things appear to have changed in contemporary usage.

Both The American Heritage Dictionary of the English Language (4th ed.) and Merriam-Webster’s Collegiate Dictionary (11th ed.) list three pronunciations without comment (meaning all are acceptable): KUM-in; KOO-min; KYOO-min.

As for “basil,” American Heritage and Merriam-Webster’s list two acceptable pronunciations: BAZZ-ul (with a short “a” like the one in “jazz”) and BAY-zul (with a long “a” like the one in “bay”).

If you’re curious about “herb,” I’ve written a blog entry on why Britons pronounce the “h” and Americans don’t. The short answer is that the “h” in “herb” wasn’t pronounced on either side of the Atlantic when the Colonies were being settled.
You wondered about cooking shows. I suspect that TV chefs have little influence on how we pronounce herbs – and perhaps less on how we use them!

Q: You suggested on the blog last month that a kitchen isn’t called a “cooking room,” along the lines of “dining room,” “bedroom,” etc., because “kitchen” is much older than the other terms. I have another theory. Could this be because early European houses were made of flammable material like moss and straw? Perhaps the danger of fire was so great that kitchens weren’t rooms but spaces away from houses.

A: This is an interesting theory, but it’s not true. Kitchens in early European homes were generally within living quarters and under the same roofs.

As far back as the year 1000, according to the Oxford English Dictionary, the word “kitchen” meant (and I quote) “That room or part of a house in which food is cooked; [or] a place fitted with the apparatus for cooking.”

In some small homes, the kitchen was the hearth or fireplace area, and the cooking was done there. The hearth or fireplace was fitted with hooks, hobs, and spits upon which food was cooked.

In smaller peasant homes, people depended on the single hearth or fireplace not only for cooking but for warmth.

If your theory were true, people couldn’t have had fireplaces or hearths in their homes at all – whether for cooking or for warmth.

A History of Private Life: Vol. II, Revelations of the Medieval World (1988), edited by Philippe Ariès and Georges Duby, notes that kitchens in houses built in outlying areas of Florence in the 1200s and 1300s often had no fireplace, but rather a poorly ventilated central hearth.

“Kitchens were located on the top floor and equipped only with a central hearth,” the book says.

Fireplaces with external chimneys gradually became more popular in Florence in the 1300s and 1400s and replaced the central hearths, according to the book.

Of eight interiors discussed, “six kitchens, six bedrooms, and two living rooms contained equipment associated with a fireplace: andirons, tongs, grates, or shovels.” Note that the book is describing “interiors” of houses, not spaces away from the houses.

Elsewhere in Europe as well, fireplaces began replacing smoky central hearths during the late Middle Ages. The book goes into some detail about medieval architecture in the French town of Montaillou, where the “central room of the house” was usually dedicated to cooking and eating.

Both urban and rural houses in medieval Europe were constructed of a variety of materials. Again I’ll quote the book: “Stone predominated in some, wood, dried clay, or brick in others. Some had roofs of slate or flat stone, others tile; thatch and other natural coverings were still to be found.” Entire medieval villages were made of stone if it was readily available (witness the ancient village of Aveyron).

Q: Why is a jailer one who jails, while a prisoner is one who’s imprisoned?

A: “Prisoner” once meant the opposite of what it means today. Back in the early 1300s, according to the Oxford English Dictionary, it meant “one who keeps a prison.”

But beginning in the late 1300s, it was also used to mean the guy on the wrong side of the bars. This is the meaning that has survived to our own time.

The suffix “er,” when added to a noun (like “jail” or “prison”), forms a new noun that can mean “one who is in charge of” or merely “one connected with” the original noun.

This is how we got derivative nouns like “jailer,” “prisoner,” “officer,” “mariner,” “carpenter,” “villager,” and many, many others.

The “er” can also be added to a verb to create a word for a person who performs the action represented by the verb (as in “write”/”writer” and “run”/”runner”). The new word is known as an “agent noun.”

Although “prison” and “jail” are often used interchangeably, in the US the term “prison” generally refers to a place for confining people convicted of serious crimes while “jail” refers to a place for confining people awaiting trial or convicted of minor crimes.

Q: I have a question about the things that logs rest on in a fireplace. I’ve always called them “andirons,” but I’ve often heard them referred to as “firedogs.” And Mark Twain calls them “dog-irons” in Huck Finn. What is the history of these words?

A: Firedogs and andirons are the same thing – metal supports used in pairs in a fireplace to hold burning logs.

“Andiron” (adopted from the Old French andier) has been in the English language since 1300, and was originally spelled aundyrne in English.

The presence of “iron” in later versions of the word was the result of a misunderstanding. In the early days, people confused the ending of aundyrne with two Middle English spellings of “iron” – yre and yren.

As the Oxford English Dictionary says, the ending of the word was identified in people’s minds with the old words yre and yren, and eventually with “iron.” Where the Old French word itself came from, we don’t know.

The word “dog,” meanwhile, has been used since the mid-1400s for a variety of mechanical devices or tools for grabbing or holding: clamps, levers, nails, screws, pincers, grappling irons, and so on.

The OED has citations from 1458 for “doggs of Iryn,” and from 1552 for “Dogge of yron,” to describe such implements.

It was perhaps inevitable that an andiron would eventually be referred to as a “dog,” and this first came about (as far as we know) in the late 16th century.

As one of its definitions of “dog,” the OED has this: “One of a pair of iron or brass utensils placed one on each side of a fireplace to support burning wood; = andiron; (more fully called fire-dogs).”

The first quotation in the OED refers to “One paire of dogges in the Chymly” (1596). These were variously called “firedogs,” “dog irons,” “iron dogs” or just “dogs.”

Similarly, a “dog-grate” (also called a “dog-stove”) was “a detached fire-grate standing in a fireplace upon supports called dogs,” according to the OED.

By the way, the name of the liberal blog FireDogLake doesn’t refer to firedogs, according to an article in Washingtonian magazine about the site’s founder, Jane Hamsher. The name comes from Hamsher’s favorite pastime: sitting by the fire with her dog and watching Lakers games.

Q: I’m curious about the phrase “human race.” As far as I can see, we don’t refer to any other species as a race. Why is that? And are the terms “human race” and “human species” interchangeable?

A: Today we don’t refer to other species as races, but at one time we commonly did. Now the use of “race” to mean species survives chiefly in the phrase “human race.” So yes, I’d say the phrases “human race” and “human species” are interchangeable.

The noun “race” came into English in the mid-1500s from French, which got it from the Italian word razza (meaning species or kind).

The source of razza has never been determined, but it may be derived from the Latin words ratio (reason or reckoning) or generatio (generation), or from the Old French haraz (which referred to horses and mares kept for breeding, and which may in turn be connected to the Arabic faras, or horse).

Whatever its origins, this sense of “race” is unrelated to the identical English word for a rushing forward (as in a footrace), which comes from early Scandinavian sources.

Over the centuries, “race” has been interpreted extremely narrowly (the descendants of a single house; a single line of descent; one’s children or family); very broadly (the animal, vegetable, or mineral kingdom; a single species); and everything in between (nations, tribes, ethnic groups).

In the phrase “human race,” the word essentially means “species.” Soon after “race” entered the language, one of its meanings (sometimes poetic and sometimes literal) was mankind, and it often was preceded by the adjective “human.”

Sir Philip Sidney wrote of “the humane race” (c. 1590) and Shakespeare of “the whole race of mankinde” (c. 1616). Sometimes people spoke of the sexes as different races – as in the “race of woman kind” (Spenser, 1590), and “the unscrupulous race of men” (Henry James, 1897).

The word was formerly used in the same way to refer to species of plants and animals, according to the Oxford English Dictionary. In Macbeth, for example, Shakespeare called Duncan’s horses “Beauteous, and swift, the Minions of their Race” (c. 1616).

Under its definition of “race” as “any of the major groupings of mankind, having in common distinct physical features or having a similar ethnic background,” the OED adds this note:

“In recent years, the associations of race with the ideologies and theories that grew out of the work of 19th-cent. anthropologists and physiologists has led to the word often being avoided with reference to specific ethnic groups. Although it is still used in general contexts, it is now often replaced by terms such as people(s), community, etc.”