Q: Am I at all justified in my disdain for the word “oftentimes”? I understand the speaker is differentiating from “sometimes,” but it sounds redundant to me.

A: Our language is full of surprises. Redundant or not, “oftentimes” is standard English and has been part of the language since the 14th century.

“Oftentimes,” an adverb meaning frequently or repeatedly, can be found in standard dictionaries like The American Heritage Dictionary of the English Language (4th ed.) and Merriam-Webster’s Collegiate Dictionary (11th ed.).

It also has an entry in the Oxford English Dictionary, which defines it as meaning “many times; on many occasions; in many cases; frequently, often.”

The OED’s earliest citation for the word is from the late 1300s. In Middle English, the OED says, the term was “written indifferently as one word or as two.” But since the 16th century it’s usually been written as one word.

Today “oftentimes” is used chiefly in North America, according to the OED. Elsewhere, it’s considered archaic or literary.

A shorter version, “oft-times,” was recorded slightly earlier than “oftentimes” but isn’t heard as much in modern times. “Oft-times” is now labeled chiefly archaic or poetic. But it, too, can be found in standard dictionaries and is a quite legitimate usage.

In case you’re wondering, “oft” is extremely old, dating from early Old English. The Chambers Dictionary of Etymology says it goes back to before the year 725.

Today it’s still used regionally in the north of England but otherwise the usage is considered archaic or poetic. (It does occasionally turn up, in phrases like “oft-quoted remark” and “oft-told tale.”)

In everyday usage, “oft” was pretty much replaced after the 16th century by the extended form “often.” Chambers says the development of “often” may have been influenced by its opposite number in Old English, seldan (seldom).

And while we’re on the subject, you may be interested in a blog entry we wrote a few years ago on the pronunciation of “often.”

Q: My brother works for a company that captures old recordings on various media and converts them to digital format. When no working reader can be found for a particular medium, the company says the medium has “gone cold.”

A: What a chilling figure of speech! But we like it. Someone who says an outdated medium has “gone cold” is drawing an analogy with death.

Since the 14th century, the Oxford English Dictionary says, the adjective “cold” has been used in speaking “of the human body when deprived of its animal heat; esp. of a dead body, of death, the grave.”

The OED’s first citation for “cold” used in this way is from Cursor Mundi, an anonymous 14th-century poem written in Middle English.

In a section about the Trojan War, the poem has this line: “There mony modir son was colde” [There many a mother’s son was cold].

Five centuries later, in Sir Walter Scott’s The Lay of the Last Minstrel (1805), we find this couplet: “Then Deloraine, in terror, took / From the cold hand the mighty book.”

“Cold” has also been used over the centuries to describe people or things that are unfeeling (as in “a cold fish”), cruel (“cold blooded”), or timid (“to have cold feet”).

Similarly, images of coldness have been used to describe things that are weakened or outdated, like the trail or scent that has “gone cold,” or the detective’s “cold case,” or the journalist’s “cold news” or “cold story.”

The extension to outdated technological media is certainly appropriate!

Q: I hope you can answer this: what is the origin of “pig” as a derogatory word for a policeman? My guess is it comes from the ’60s antiwar protests in the US.

A: Although the word “pig” was heard a lot during the American student protests of the ’60s and ’70s, the usage originated in Britain a century and a half before the first chant of “Ho, Ho, Ho Chi Minh.”

The earliest published reference in the Oxford English Dictionary for this use of “pig” is from Francis Grose’s Lexicon Balatronicum (1811), a slang dictionary that defines “pig” this way:

“A police officer. A China street pig; a Bow-street officer. Floor the pig and bolt; knock down the officer and run away.” (We’ve gone to the original to expand the OED citation.)

The expression “China Street pig” was a slang term for a Bow Street officer, a member of London’s first professional police force, which was attached to the Bow Street magistrates’ office.

As you can imagine, the word “pig” in its porcine sense is very old, dating back to Anglo-Saxon days. The OED says that when the term first showed up in Old English, this is what it meant:

“An omnivorous, domesticated even-toed ungulate derived from the wild boar Sus scrofa, with a stout body, sparse bristly hair, and a broad flat snout for rooting in the soil, kept as a source of bacon, ham, pork, etc.”

So how did a word for an even-toed ungulate come to be a mocking term for a police officer?

Well, two and a half centuries before it was first used to bad-mouth a cop, “pig” took on a more general negative sense: someone or something considered unattractive, unpleasant, or greedy.

(Speaking of “cop,” we wrote a blog item several years ago on its etymology.)

The OED’s first citation for the pejorative use of “pig” is from a 1546 book of proverbs about marriage: “What, bid me welcome, pig? I pray thee kiss me! Nay, farewell, sow!” (We’ve used an expanded citation from the Internet Archive.)

By the early 1800s, a Bow Street runner (another derogatory term for a police officer in London) was being called a China Street pig or, simply, a pig.

Q: Why do people say “most well” when “best” would sound better? They do this with phrases like “well educated,” “well read,” etc. Doesn’t it sound better to say someone is “the best educated,” not “the most well educated”?

A: In a phrase like “well educated,” the word “well” is an adverb whose comparative form is “better” and whose superlative is “best.” So the superlative form of that phrase would be “best educated.” But “most well educated” is another option and is not grammatically incorrect.

Why do people sometimes prefer “most well educated” to “best educated”?

We think this happens because certain comparative phrases (“well educated,” “well known,” “well read,” “well meaning,” “well deserved,” and others) have become so firmly entrenched that they resist alteration.

A reporter might write, for instance, that a political candidate was “the most well spoken of the three and the most well received.” The phrases “best spoken” and “best received” might seem unnatural here.

At a cookout we probably wouldn’t say, “The piece in the middle is the best done.” We’d call it “the most well done.” And a biographer might describe a good deed as “the most well-meant gesture he had ever made.” In this case, “best-meant” wouldn’t sound quite right.

The Oxford English Dictionary has quite a few citations that use “most well” in this way.

For example, in Tom Brown’s School Days (1857), Thomas Hughes wrote about boys “whose parsing and construing resisted the most well-meant shoves.”

And a 1935 cookbook described a recipe as “one of the most well known of all Belgian dishes.” Most of the other citations involve “most well known.”

Of course, examples with “best” as the adverb part of the phrase vastly outnumber those with “most well.”

Q: My brother’s name is “Hubert.” My son’s name is “Hugh.” Am I making a mistake when I say their names without pronouncing the initial “H”? What about “hue,” “humid,” “Hume”?

A: The short answer is that the “h” is usually pronounced in these words, so “Hubert” sounds like HYOO-bert, “Hugh” and “hue” like HYOO, “humid” like HYOO-mid, and “Hume” like HYOOM.

But quite a few people don’t pronounce the initial letter, so “Hubert” then sounds like YOO-bert, “Hugh” and “hue” like YOO, “humid” like YOO-mid, and “Hume” like YOOM.

And the people who do pronounce the “h” do it in all sorts of ways, from a very aitchy “h” to a whisper of aitchiness that can barely be heard.

Phonetically, the letter “h” in these words is a voiceless palatal fricative (a consonant produced by narrowing the air passages, arching the tongue toward the hard palate, and not vibrating the vocal cords).

All of the standard dictionaries we checked say the proper names you asked about (“Hubert,” “Hugh,” “Hume”) should be pronounced with the “h” sounded.

But the dictionaries differ about pronouncing “hue” and “humid,” as well as “huge,” “human,” and similar words.

The American Heritage Dictionary of the English Language (4th ed.), for example, lists only one pronunciation for each of these words: with the “h” sounded.

But Merriam-Webster’s Collegiate Dictionary (11th ed.) also includes “h”-less pronunciations of “humid,” “huge,” and “human.” And the other standard dictionaries we checked generally agree with M-W.

In its entries for “humid,” “huge,” and “human,” the Random House Webster’s College Dictionary says the words are “often” heard with “h”-less pronunciations.

Interestingly, English adopted all three of these words from early versions of French where the “h” wasn’t sounded.

In the case of “human,” which comes from Latin via Middle French and Anglo-Norman, “the origin of the vocalism is unclear,” according to the Oxford English Dictionary.

The OED notes that the word begins with an “h” in some Romance languages (for example, humano in Spanish, where it’s not pronounced) and without it in others (umano in Italian).

So are you making a mistake by not pronouncing the “h” in the names of your brother and your son?

Yes, according to standard dictionaries. But a lot of people do it. And as Alexander Pope observed, “To err is human, to forgive divine.”

Q: Every day I hear American commentators use singular collective nouns as plurals. For example, “the family have.” I regularly hear this on the BBC, of course, but now it’s on NPR too. What is technically correct?

A: We’re surprised to hear that this British convention for dealing with collective nouns has spilled over into American broadcasting.

We wrote a blog entry a while back about the differences between American English and the British variety, and touched on the different ways the two Englishes treat collective nouns:

“In Britain,” we wrote, “many collective nouns are plural, such as the names of companies, sports teams, government bodies (‘Mobil invite you to join them’). In the US, the preference is for the singular (‘Mobil invites you to join it’).”

Correctness isn’t an issue here, as we wrote earlier this year in a posting about so-called “notional” (as opposed to formal) agreement.

“Reasonable people,” we said, “can disagree on notional versus formal agreement. Take the case of institutional and collective nouns and how they’re perceived in the US as opposed to the UK.

“In the US, the institution is regarded as a singular entity; in the UK, it’s regarded as a collection of individuals.”

That said, we have to emphasize that the tradition in American English is to treat organizations as singular nouns and to give them singular verbs. We say, “The family is waiting,” not “The family are waiting.”

An American who uses the British convention instead is going to sound like a pretentious copycat. Why not go all the way and say “lift” instead of “elevator” and “flat” instead of “apartment”?

Q: I find the indiscriminate use of “he” to be a nagging irritant. This occurs, especially, in sports broadcasting. For instance: “It was a close play at first, his toe touched the bag just as he caught it, and he called him out, and he is furious.” This involves the base runner, the first baseman, the umpire, and the manager that disagreed with the call. I would appreciate your comments.

A: It may be that in the heat of play-by-play, sportscasters find it easier to dispense with names, and as a result overuse pronouns.

And of course live broadcasts don’t give sports announcers much time to think about grammar and usage. (Pat has tripped over a word or two during her appearances on WNYC.)

But in any kind of speech or writing, a pronoun shouldn’t raise a question in the reader’s or listener’s mind. The person being referred to should be obvious.

Pat discusses this problem in her book Woe Is I. In a section called “Watch Out for Pronounitis,” she writes:

“When you write things like this, of course you know the cast of characters. It won’t be so clear to somebody else. Don’t make the reader guess. Here’s a possibility: Judy already regrets telling her boyfriend about her nose job, or so Fleur says. Or maybe this: Fleur says her boyfriend heard about her nose job from Judy, who already regrets telling him.”

Sometimes pronouns aren’t confusing, just too numerous.

For instance, there’s no need to repeat the subject when a sentence has a series of verbs, like this one: “He hit a high fly into left field, then [he] ran to first and [he] nearly made it to second, but [he] was called out.”

The pronouns in brackets aren’t grammatically incorrect, just unnecessary.

Then there’s the pronoun that duplicates the subject, definitely a grammatical no-no: “My brother he was born in Boston.” Cut the “he.”

Q: I was recently reminded, once again, that the captain of the H. M. S. Pinafore commands “a right good crew.” This led me to wonder about the many and varied senses of “right.” Do they all derive from a single core meaning?

A: In Act I of H.M.S. Pinafore, Captain Corcoran sings to the ship’s crew, “You’re very, very good, / And be it understood, / I command a right good crew.”

Sir William S. Gilbert used “right” in that lyric as an adverb meaning something like “very.” This sense of the word, used to modify an adjective, has been around since Old English, according to the Oxford English Dictionary.

Here’s a similar example, from an 1861 letter written by Edward FitzGerald: “He is a right good little Fellow, I do believe.”

You’re right (that is, correct) in saying that “right” has many and varied meanings. Let’s take a walk through them.

Adverb: We mentioned Captain Corcoran’s usage. “Right” used as an adverb can also mean directly or immediately (as in to “go right home”), vertically (“she sat right up”), appropriately (“make sure it’s done right”), or accurately (“If I remember right”).

Adjective: In British and Irish English, the adjective is sometimes used as an intensifier meaning, in the words of the OED, “complete, absolute, total, utter” (as in “I felt a right fool”).

Also, “right” can refer to one side of the body (as opposed to the left), a usage that the dictionary says was first recorded in Old English in the noun phrase “right hand.”

And “right” is used to indicate a direction, a sense that the OED says “probably referred originally to the perception that the right hand was the stronger and the more appropriate for most tasks.”

The political sense of “right” originated in revolutionary France, where the term le côté droit referred to the right-hand side of the Assembly. As the OED says, the term “right” was used “with reference to the seating of nobles and high clergy to the right of the Chair, and the third estate and lower-status clergy to the left.”

Interjection: “Right” has been an interjection of agreement since Shakespeare’s time. Today it’s also used ironically to express doubt, as in “Yeah, right.”

Noun: Since early Old English, “right” has been a noun meaning privilege or entitlement (as in “knowing one’s rights”). And for just as long, it’s had meanings related to fairness, goodness, justice, and the like (as in “He’s on the side of right and reason”).

In the 19th century, the plural “rights” was first used in the copyright sense. And of course in political and many other senses, the word is used as a noun: “extremists on the right” … “a right to the jaw” … “take a right at the next corner” … “set the kitchen to rights” … “as of right” … “do right by your brother” … “in her own right.”

Verb: As a verb, “right” can mean to correct, to straighten, to set upright, to recover one’s balance, to set back into place, to put back into order, to rectify, to repair, or to vindicate or avenge.

You asked if all these senses of “right” are derived from the same root. Yes, that’s what linguists believe. Here’s the story.

The word’s ancestor in English and the other Germanic languages is a prehistoric Proto-Germanic root that’s been reconstructed as rekhtaz. This in turn has been linked to an even earlier reconstructed source in Indo-European, reg (to move in a straight line).

The Indo-European reg is thought to be the ultimate source for “right,” not only in English and other Germanic languages but also in Latin (rectus, straight), Greek (orektos, stretched out), Old Irish (recht, law), Welsh (rhaith, law), Sanskrit (raji, straight), and Old Persian (rasta, straight).

Besides its relatives in foreign languages, “right” has many cousins in English—words that are derived from the same Indo-European source. These include “address,” “correct,” “direct,” “erect,” “guide,” “raj,” “rector,” “realm,” “regal,” “regime,” “regular,” “regent,” “regiment,” “royal,” “rule,” and more.

Q: Growing up, we learned that “shall,” indicating intention, should be used only with the first person. So it was correct to say “I shall go to school tomorrow,” but not to say, “He shall stay home.” Legal formulas, however, often use “shall” for the third person: “The party of the first part shall etc.” So, shall I ask you for a comment? Or shan’t I?

A: It’s interesting that although we’ve been writing this blog every day for five years, we’ve gotten only one request to explain the distinction between “shall” and “will.”

Call it a sign of the times. The old tradition that drew a strict line between “shall” and “will” has gone, unlamented, to the grammatical graveyard in the US and it’s on the way there in the UK.

Here’s the old tradition in a nutshell:

● When expressing a future tense, use “shall” with the first person (“I” and “we”) and “will” with the second and third persons (“you,” “he,” “she,” “they,” etc.).

● When expressing determination, permission, or obligation, use “will” with the first person and “shall” with the second and third persons.

Americans seldom use “shall” these days. However, “shall” is still common in legal usage, as you note, and in polite questions (“Shall we dance?” … “Shall we go?” … “Shall I freshen your coffee?”).

“Shall” is also heard in set expressions (“We shall see” … “We shall overcome”). And of course it’s familiar to many of us because of Gen. Douglas MacArthur’s vow, “I shall return!”

How did the old tradition come about?

Merriam-Webster’s Dictionary of English Usage says it was “first set down in the 17th century by John Wallis, a bishop and a well-known mathematician.”

Wallis, who’s credited with introducing the ∞ symbol for infinity, wrote about “shall” and “will” in an English grammar book written in Latin: Grammatica Linguae Anglicanae.

But it’s been pointed out that his rules didn’t reflect the practices of the preceding century. And even in his own time, M-W says, the “shall”/“will” distinction wasn’t consistently observed: “sometimes usages match the rules and sometimes they do not.”

As for usage today, Merriam-Webster’s observes: “It is clear that even in the English of England there has always been some deviance,” while in America “there has been considerable straying from the Wallis rules.”

“Our conclusion,” M-W adds, “is that the traditional rules about shall and will do not appear to have described real usage of these words very precisely at any time, although there is no question that they do describe the usage of some people some of the time and that they are more applicable in England than elsewhere.”

The Cambridge Grammar of the English Language agrees with Merriam-Webster’s that even in England “shall” isn’t universally used in the traditional way.

In the future tense, the editors write, “we must allow will as well as shall for the 1st person—and modern usage manuals recognise this. Will (including the contracted variant ’ll) is in fact very much more common.”

Q: We on this scepter’d isle wonder why you Yanks are so intent on replacing the s’s in our civilised spellings with z’s.

A: Not so fast! Are verbs ending in “-ise” really better bred than those ending in “-ize”?

The Oxford English Dictionary, which ought to know, says the “-ize” ending is actually the traditional one and the only proper one.

The first of these words to enter English, “baptize,” appeared in the 13th century with its z intact, and was later joined by “authorize” (14th century), “organize” (15th), “characterize” (16th), “civilize” (17th), and many others.

As we point out in our book about language myths, Origins of the Specious, the “-ise” spellings weren’t used much until the 18th century or later.

Whodunit? The culprits were Francophiles enamored of French verbs like civiliser, dramatiser, organiser, and so on.

But, as the OED says, “there is no reason why in English the special French spelling should be followed.”

Spelling aside, many language authorities have criticized the practice of creating new verbs by tacking the suffix “-ize” (or “-ise,” if you prefer) onto nouns, adjectives, and proper names.

Critics jumped on Noah Webster, for example, when he included “demoralize,” “Americanize,” and “deputize” in his 1828 dictionary.

Other words condemned in the 19th and 20th centuries were “jeopardize,” “accessorize,” “burglarize,” “prioritize,” “finalize,” and “theorize,” according to Merriam-Webster’s Dictionary of English Usage.

“If you are one of those persons of tender sensibilities whose nerves are grated by –ize, you would be better off learning to live with the problem,” Merriam-Webster’s says.

We agree that “-ize” words aren’t going away, but that doesn’t mean we have to use all of them, especially not the ones that irritate our tender sensibilities (“colorize,” “finalize,” “prioritize,” etc.).


Q: I enjoy reading on the Kindle because all I have to do is point to a word and the Oxford American Dictionary definition pops up. For example, I came across the word “spruce” (as in, he’s looking very spruce) and learned that it’s derived from “Prussia.”

A: Yes, there’s a link between “spruce” and “Prussia,” but the connection isn’t quite so neat as the dictionary seems to imply.

The noun “spruce” (the fir tree) is indeed derived from a now-obsolete term for Prussia that entered English in the 14th century. And the country’s connection with the other “spruce” seems likely, but it’s a bit more tentative.

From the 14th to the 17th centuries, Prussia was often referred to as “Spruce” or “Spruce-land” in English, though spellings differed widely: “Sprewse,” “Sprusse,” “Spruse,” and so on.

This “Spruce” and its variations evolved from the country’s name in post-classical Latin (Prussia), Anglo-Norman (Pruys, Pruz), and Middle French (Pruce, Prusse), according to the Oxford English Dictionary. (The German word for Prussia is Preußen.)

The dictionary notes that the country’s name comes from Prussi, a post-classical Latin name for the Prussian people.

It’s probable that “spruce,” meaning the fir tree or its wood, came into English before the adjective meaning neat. A word spelled “spruse” was used for the wood of the spruce fir as far back as 1412, according to the Chambers Dictionary of Etymology.

But “spruce” (spelled various ways) was sometimes used ambiguously in the 15th and 16th centuries.

In phrases like “spruce board,” “spruce ell,” “spruce chest,” “spruce coffer,” and even “spruce tree,” the OED says, the adjective could have meant “brought or obtained from Prussia,” or “in some instances” could have implied the spruce fir.

Here’s one of the examples the OED finds ambiguous: “A maste of a spruce tree … bought for the foremast of the seid ship” (from Naval Accounts and Inventories of Henry VII, 1497).

There’s no ambiguity, however, in this citation from the OED, which is definitely a reference to the spruce fir: “For masts, &c., those of Prussia, which we call Spruce … are the best” (from Sylva, John Evelyn’s 1670 book about trees).

The other “spruce” (the neat one) was first recorded in the late 16th century in Richard Harvey’s Plaine Perceuall (1589): “neat, nimble, spruse Artificer.”

And Ben Jonson used the word a decade later in his comedy Every Man Out of His Humor: “A Neat, spruce, affecting Courtier, one that weares clothes well, and in Fashion.”

The English verb “spruce” (to neaten) showed up in writing at roughly the same time. The OED’s earliest citation is from The Terrors of the Night (1594), a discourse on apparitions. The author, Thomas Nashe, uses “spunging & sprucing” to mean cleaning up, apparently in the ghost-busting sense.

Etymologists believe that the neatness sense of the word may have come from the term “spruce leather,” first recorded in 1464 and meaning a kind of Prussian leather.

As Chambers says, jerkins made of spruce leather were “a popular style in the 1400s made in Prussia and considered smart-looking.”

And John Ayto writes in the Dictionary of Word Origins, “spruce leather” or “Prussian leather” was “a particularly fine sort of leather, used for making jackets.”

The leather connection would explain how “spruce” came to mean dressy in the 16th century. The OED points readers to Thomas Dekker’s The Guls Horne-Booke (1609), which refers to “the neatest and sprucest leather.”

But the connection with Prussia here is less neat and tidy than the one between Prussia and the fir tree.

[Note: This post was updated on Aug. 16, 2015.]


Q: I’m a pastor in the United Methodist Church. During worship I suddenly realized how odd part of the communion liturgy sounds: “all honor and glory IS yours, almighty God, now and for ever.” Then I thought about it a little more. Every week we say in the Lord’s Prayer “for thine IS the kingdom and the power and the glory forever.” Hmmm. Is something amiss, or is an obscure rule at work?

A: No, there’s nothing amiss, and there’s no obscure rule involved. But subject-verb agreement can be tricky at times. It isn’t simple arithmetic. One and one in grammar doesn’t always equal two. Take that last sentence, for example.

What’s at work here is notional agreement, a subject that we’ve discussed several times on our blog, most recently in a posting last July.

Notional agreement, according to Merriam-Webster’s Dictionary of English Usage, is subject-verb agreement that’s based on meaning rather than on formal textbook grammar.

By meaning here, Merriam-Webster’s is referring to the meaning that an expression has to the writer or speaker.

So if pastors or worshipers consider “honor and glory” a single package, according to the usage guide, it’s proper to use a singular verb. (The early Latin version of the prayer uses the singular est, though the two texts differ quite a bit.)

The M-W editors give this example of notional agreement from Ecclesiastes: “time and chance happeneth to them all.” And here’s a more down-to-earth example of ours: “Macaroni and cheese is my favorite dish.”

As for the Lord’s Prayer, this is a simple case of textbook subject-verb agreement. The subject “thine” is a singular pronoun that takes a singular verb, no matter how many nouns follow the verb.

If you’re still confused, here’s a more worldly example: “She’s a wit and a beauty and an heiress. What a catch!”

Getting back to the issue of notional agreement, Merriam-Webster’s says that before the 18th century “no one seems to have worried whether two or more singular nouns joined by a copulative conjunction (and) took a plural or singular verb.”

“Writers of the 16th and 17th centuries used whatever verb sounded best and did not trouble themselves about grammatical agreement,” M-W adds.

Here are a couple of Shakespearean examples: “All disquiet, horror, and perturbation follows her,” from Much Ado About Nothing, and “art and practice hath enriched,” from Measure for Measure. (“Hath” is an archaic third-person singular of “have.”)

In the 18th century, the usage guide says, grammarians “undertook to prune the exuberant growth of English” and began insisting on formal subject-verb agreement.

“Modern grammarians are not so insistent,” M-W says, noting that George O. Curme and Randolph Quirk have recognized that when compound nouns form a “collective idea” the “singular verb is appropriate—notional agreement prevails.”

Although compound subjects usually take plural verbs today, the usage guide says, “The singular verb is appropriate when the nouns form a unitary notion or when they refer to a single person (as in ‘My friend and colleague says’).”

Q: I’m working on a project related to the term “generation gap.” The Web has led me to Jessica Pallington’s Lipstick, which says the phrase dates from 1925. But the book lacks references to substantiate this. I hope you can help.

A: The notion of a generation gap probably dates from the first two generations of humans to walk the earth.

And generations of authors have written about it, from Shakespeare (King Lear) to Turgenev (Fathers and Sons) to Arthur Miller (Death of a Salesman).

But you’re asking about the phrase, not the gap itself. And the remark that caught your eye in the 1998 Pallington book, Lipstick: A Celebration of the World’s Favorite Cosmetic, does seem to refer to the phrase:

“One of the first known references to the ‘generation gap’ came in 1925, when people referred to the gap between generations of mother and daughter being signified by one wearing lipstick and the other not.”

We’ve seen several other references on the Internet to 1925 and the expression “generation gap,” but all of them either cite the Pallington book or use similar language.

It may be that she knows something we don’t, but as far as we can tell the phrase “generation gap” first showed up during the early 1960s. [NOTE: See a 2013 update at the end of this post.]

The Oxford English Dictionary defines the expression as “a difference of attitudes and values between people of different generations, esp. parents and children, leading to a lack of understanding.”

The first published reference in the OED is from a July 28, 1962, headline in the Daily Record of Stroudsburg, PA: “Generation Gap Affects Parent-Child Relations.”

We searched several other databases—America’s Historical Newspapers, the New York Times archive, Google Timeline, etc.—and that headline was the earliest citation we could find.

We also searched for “generational gap,” but the earliest example we found was this one from the Sept. 9, 1964, issue of Punch: “The generational gap is even more extended at student level.”

We’ll end this with a Shakespearian flourish. Who can forget King Lear’s words about filial ingratitude: “How sharper than a serpent’s tooth it is / To have a thankless child!”

[Update: On May 31, 2013, a reader in the UK sent us an early sighting of the term “generation gap.” He says, “I’m in the course of reading Goodbye to All That (1929), the autobiography of Robert Graves, and thought you’d like to know that in the first paragraph of chapter 2 he writes: ‘I found the gap of two generations between my parents and me easier, in a way, to bridge than a single generation gap.’ So it looks like that 1925 citation may well be correct.” Graves has said that his autobiography grew from fragments he first began writing in 1916.]

Q: I’m shocked, shocked to have read this in the NY Times: “Hanging over the debt ceiling negotiations in Washington has been the threat that the United States could lose its AAA credit rating, a coveted measure of the federal government’s financial strength. But in corporate America, the top rating long ago became an anachronism.” Well, the US lost its AAA rating and maybe the NYT should too. That use of “anachronism” is just wrong. Someone has to uphold some standards. Perhaps you hold some sway with the wayward Times editors.

A: You’re right that the word “anachronism” was misused in that Aug. 2, 2011, article in the New York Times.

The reporter should have called the corporate AAA rating “a rarity,” as the headline writer did: “AAA Rating Is a Rarity in Business.”

But do we have any sway with Times editors? Fuggedaboutit! It’s been ages since we worked for the Gray Lady.

As for “anachronism,” English borrowed the word from French in the 17th century, but its roots are in the Greek words for backward and time.

The Oxford English Dictionary says the English word originally referred to an error in calculating time or fixing dates.

The earliest citation in the OED, dated sometime before 1646, is from Posthuma, a tract by the Orientalist John Gregory: “An error committed herein is called Anachronism.”

In the 19th century, the OED says, the word took on the sense that’s most common today: “Anything done or existing out of date; hence, anything which was proper to a former age, but is, or, if it existed, would be, out of harmony with the present.”

The dictionary’s first citation for this sense is from The Statesman’s Manual (1816), by the poet Samuel Taylor Coleridge: “If this one-eyed Experience does not seduce its worshipper into practical anachronisms.”

Here’s a more recent cite, from Mary McCarthy’s 1952 novel The Groves of Academe: “She herself was a smoldering anachronism, a throwback to one of those ardent young women of the Sixties, Turgenev’s heroines.”

Not all anachronisms, however, are throwbacks. Some of them go in the other direction—anachronistically forward, like a noticeable jet trail seen in a movie that’s set in the Old West.

We recently wrote a posting about anachronistic references in the television show Mad Men.

Q: I’ve read that the editors of the New Oxford American Dictionary once planted a fake word, “esquivalience,” as bait to catch lexicographers intent on stealing their material. Do you know of other examples of this?

A: “Esquivalience” is indeed a fake word. It was planted in the 2001 edition of the New Oxford American Dictionary to protect the copyright of the electronic version that came with most copies of the book, according to the editor-in-chief.

The definition: “the willful avoidance of one’s official responsibilities.”

In 2005, the humorist Henry Alford wrote an article for the New Yorker about the genesis of “esquivalience.” He even coined a term for fake entries in dictionaries and encyclopedias: “Mountweazels.”

His inspiration for the neologism was a fake biographical entry for “Lillian Virginia Mountweazel” in the 1975 edition of the New Columbia Encyclopedia.

The fictional Ms. Mountweazel, Alford says, was supposedly “a fountain designer turned photographer who was celebrated for a collection of photographs of rural American mailboxes titled ‘Flags Up!’ ”

The encyclopedia entry, Alford adds, indicates that she “was born in Bangs, Ohio, in 1942, only to die ‘at 31 in an explosion while on assignment for Combustibles magazine.’ ”

He quotes one of the encyclopedia’s editors as saying: “It was an old tradition in encyclopedias to put in a fake entry to protect your copyright. If someone copied Lillian, then we’d know they’d stolen from us.”

You asked whether we knew of other examples. As a matter of fact, we do.

Reference books aren’t the only repository of Mountweazels. Maps, too, have been known to contain fake names for streets and towns.

A 1978 Michigan Department of Transportation highway map shows two towns, Goblu and Beatosu, that don’t exist. They were fakes, based on University of Michigan football cheers, “Go blue!” and “Beat OSU!” (for Ohio State University).

Mountweazels have invaded the non-print media as well.

A few months ago, Google said it had rigged a number of fake search queries, using nonsense terms like “hiybbprqag” and “mbzrxpgjys,” which would turn up results with no relation at all to the search terms.

Why? Google said it was trying to catch Bing, Microsoft’s search engine, in the act of stealing its material.

The synthetic queries, Google explained, did not appear within the Web pages that came up, and there was no reason for any other search engine to return the faked results.

Google said the sting operation worked. It claimed, for instance, that a search on Bing for the term “hiybbprqag” turned up a planted page about seating at a theater in Los Angeles. Microsoft, however, denied Google’s allegations.

Q: A patron at the library where I work wants to know why some words (“pronounce” and “like,” for example) retain their silent “e” when adjectivized.

A: The word “pronounce” keeps its silent “e” in “pronounceable” for the same reason that many other words ending in “ce” do. When a suffix is added, the presence of the “e” influences the pronunciation of the preceding “c”—keeping it soft instead of hard.

If that “e” were dropped, we’d end up with “pronouncable.” In English, the letter combination “ca” is pronounced with a hard “c” (as in “cable”) instead of a soft one (as in “certain”). The middle syllable would be NOWNK instead of NOWNSE.

Same with “peaceable” and “noticeable.” If they were spelled “peacable” and “noticable,” one would be tempted to pronounce each “c” like a “k.”

This is also true of words ending in “ge,” like “marriage.” If the adjective were spelled “marriagable,” the “g” would look as if it were hard (as in “girl”) instead of soft (as in “judge”). The letter combination “ga” is hard but “ge” at the end is usually soft, as in “garage.”

So with many words ending in “ce” and “ge,” the silent “e” is generally retained in a suffixed form to keep the consonant soft—in other words, to keep the sound as “s” instead of “k,” or as “j” instead of a hard “g.”

In the case of “like” and many other words that end in a silent “e,” the “e” is often a signal that the preceding vowel is long instead of short. We’ve written about this phenomenon before on our blog.

For example, “dim” has a short “i” but “dime” has a long one; “hat” has a short “a,” but “hate” has a long one; “lob” has a short “o,” but “lobe” has a long one.

And with many of these words, the silent “e” is retained in a suffixed form to keep the vowel from changing (“hateful” instead of “hatful,” for example).

With “like,” there are two accepted spellings of the adjective form: “likeable” is more common in Britain and “likable” is more common in the US, but both are correct.

The British spelling makes more sense to us, since “likable” looks as though it should be pronounced LICK-able. Besides, the silent “e” is retained in other suffixed forms: “likely,” “likelihood,” “likeness,” “likewise.”

Q: It turns out that there may be linguistic benefits to my being a techie. Here’s a heads-up on the latest in virtual development: The word “gizmo” has a new meaning. It’s the frame by which you apply an effect or action on a virtual surface or solid. Add THAT to your Funk & Wagnall’s!

A: Thanks for the heads-up, but let’s wait a bit to see if “gizmo” 2.0 has staying power.

For the time being, we’ll stick with the traditional meaning (that is, if “traditional” is the proper adjective to describe a slang word that’s relatively new as these things go).

The Oxford English Dictionary defines “gizmo” as a US slang term meaning a gadget, a gimmick, or a thingamajig. (Don’t worry. We’ll return to “thingamajig” later.)

The first citation for “gizmo” in the OED is from the July 19, 1943, issue of Time magazine: “Gizmo—a term of universal significance, capable of meaning ‘gadget,’ ‘stuff,’ ‘thing,’ ‘whozis’ or almost anything else the speaker wants it to.”

The word is spelled “gismo” in all of the other OED citations, including this one from the Aug. 1, 1970, issue of the New Yorker: “Every gismo that made use of a clothes hanger will be demonstrated by its inventor.”

The OED entry spells the word “gismo” and lists “gizmo” as a variant, but the two standard dictionaries we consult the most reverse that order.

The American Heritage Dictionary of the English Language (4th ed.) and Merriam-Webster’s Collegiate Dictionary (11th ed.) have entries for “gizmo,” with “gismo” listed as a lesser-used variant.

Well, we promised to get back to “thingamajig,” so here goes. The OED defines it as a colloquial noun with the same meaning as “thingummy,” which is defined this way:

“A thing or (less commonly) person of which the speaker or writer cannot at the moment recall the name, or which he or she is unable to or does not care to specify precisely; a ‘whatchamacallit.’ Also used as the name of a person, place, etc., in place of the actual name (as Mr Thingummy, etc.).”

The dictionary’s earliest citation for “thingamajig” is from the June 1824 issue of the Casket, a literary monthly in Philadelphia: “I’d a lot of cousins, that ‘com’d all the way down from Varmount to larn the fashions, and to hear and see all the cute and curious thingumajigs of the Old Colony.’ ”

The word “thingummy” showed up a century earlier in an English translation of the works of Rabelais: “In Languedoc they call every Thing (estreé) Thingumy, that they must not name.”

The OED says “thingamajig” is apparently an extended version of “thingummy” or of the obsolete 17th-century “thingum” (a trifling detail) or of the parent of these slang or obsolete offspring, “thing.”

We’ll save the noun “thing,” one of the oldest words in English, for another day.

Q: So my wife opened a Snapple and, as she so often does, read the mini-fact printed under the cap: “Real Fact #916 / The scientific term for a sneezing is sternutation.” We both found the phrase “a sneezing” to be displeasing to the ears. Shouldn’t it be “a sneeze” or just “sneezing”? Side note: I really do enjoy that new word, “sternutation,” even if my spell check doesn’t like it.

A: “A sneezing” is not exactly incorrect, but it certainly is clumsy.

People are more familiar with “sneezing” used as a verbal adjective (“a sneezing session”) or as a gerund with “the” (“the sneezing was incessant”). A gerund, as you know, consists of the base form of a verb plus “-ing,” and is used as a noun.

The reason we don’t often see “a sneezing” is that “a sneeze” does the job a lot better. Perhaps the writer originally wrote “a sneeze,” then intended to use “sneezing” alone and forgot to delete the article “a.” Just a guess.

“Sternutation” is indeed a nifty word, though it’s not all that new. It means, according to the Oxford English Dictionary, “the action of sneezing” or “a sneeze.”

It was adopted into English in the 16th century from a medieval Latin word, sternutationem, which is related to the Greek word for “sneeze,” ptarnusthai.

Ultimately, all of these versions—Latin, Greek, and English—are probably echoic, or imitative of the sound the word symbolizes. Many words in English have echoic origins, including “cough,” “giggle,” “cuckoo,” “whimper,” “whistle,” and “tap.”

We’ve written before on our blog about “sneeze” and other words that begin with “sn” and have to do with the nose.

In Old English, the verb “sneeze” was fnesan or fneosan. In Middle English it became fnese, which was altered to snese and finally “sneeze.”

How did the alteration happen? In medieval manuscripts, the letters “f” and “s” were very similar, so the change could have been made unknowingly by scribes. But the new art of printing probably was an influence too.

In his Dictionary of Word Origins, John Ayto explains the evolution of the word’s spelling this way:

“Fnese had largely died out by the early 15th century, and it could well be that when printing got into full swing in the 1490s, with many old manuscript texts being reissued in printed form, printers unfamiliar with the old word fnese assumed it had the much more common initial consonant cluster sn.”

One more note. If you like “sternutation,” you’ll like a derivative, “sternutatory,” a word that describes substances that make you sneeze.

Next time you’re in a restaurant and the pepper guy (the peppier, in faux French) offers you an unwanted turn of the grinder, you’ll know what to say: “No thanks, pepper is so sternutatory.”

Q: I got a case of the giggles when the subject of ponging came up during Pat’s recent appearance on the Leonard Lopate Show. I couldn’t help thinking of an imaginary odor detector that pongs whenever a particularly pungent person enters a room. Ponging? Pungent? There must be a connection way back when!

A: Your suggestion about a possible link between “ponging” and “pungent” may be entirely off base. But then again, maybe not.

The Oxford English Dictionary says the use of the verb “pong” in the sense we’re speaking of (essentially, to stink) is of unknown origin.

To give the verb its full definition, the OED says it means “to smell strongly, esp. unpleasantly; to stink of something.” This usage has been part of British colloquial speech for more than a century.

The OED’s first citation comes from a British periodical, The Marvel (1906): “In its time many things had been tumbled into it, and each had left its flavour behind. ‘It pongs!’ said Mr. Histed.”

Here’s a more recent example, from Jonathan Gash’s mystery novel The Very Last Gambado (1991): “All barkers pong of armpit.”

The corresponding noun “pong”—defined by the OED as “a strong smell, usually unpleasant; a stink”—is a bit older than the verb.

The earliest OED citation is again from The Marvel (1900): “The pong of fride addocks.”

Here’s a mid-century example, from John Braine’s novel Room at the Top (1957): “ ‘What a pong,’ he said. ‘Don’t know how you stand it.’ ”

The word mentioned by a caller to the WNYC show—“ponging”—is a participial form of the verb “pong.”

The word came up, you may recall, in response to a discussion about literary neologisms.

Pat mentioned that Charles Dickens coined a word, “ponging,” to mean declaiming theatrically. That word, which the OED labels theatrical slang, is now rarely used.

Thanks to that caller, we now know that since Dickens’s time “ponging” has acquired a very different meaning.

Today if we described an actor “ponging on the stage,” we’d be talking about a pungent performance indeed!

Q: Do you “log in” or “log on” when you connect to a computer? Is “login” a recognized verb? A related question: do you log “in to” or “into” (“on to” or “onto”) a PC?

A: You can either “log in” or “log on” when you begin a computer session.

The American Heritage Dictionary of the English Language (4th ed.) defines the two verbal phrases this way: “To enter into a computer the information required to begin a session.”

(American Heritage defines “log off” or “log out” as to enter the command to end a computer session.)

Standard dictionaries don’t recognize “login” as a verb. In fact, only a few standard dictionaries even have entries for the noun “login.”

American Heritage defines the noun as the “process of identifying oneself to a computer, usually by entering one’s username and password.”

The Random House Webster’s Unabridged Dictionary adds that it can also refer to just the username and password that someone uses to log in.

The earliest published reference in the Oxford English Dictionary for either phrasal verb (“log in,” in this case) is from a 1963 M.I.T. programming guide: “When he next logs in, he should relieve the excess … by adequate deletions.”

As for your last question, you log “in to” or “on to” a computer, NOT “into” or “onto” it. (Here, you need “to” because you’re mentioning the object—the computer.)

As Pat says in Woe Is I, her grammar and usage book, you don’t combine “in” and “to” or “on” and “to” just because they happen to be next to each other.

“Into is for entering something (like a room or a profession), for changing the form of something (an ugly duckling, for instance), or for making contact (with a friend or a wall, perhaps),” she writes.

Here are some examples from the book: “Get into the coach before it turns into a pumpkin, and don’t bang into the door! Otherwise, use in to. Bring the guests in to me, then we’ll all go in to dinner.”

As for “on to/onto,” if you mean “on top of” or “aware of,” use “onto.”

Q: On the box of my Mr. Coffee, the “Easy to Clean Glass Carafe” is translated as Pot en Verre Facile à Nettoyer. Now you’d think “carafe” wouldn’t have to be translated! Maybe the translator charged by the word.

A: This is odd indeed, since English adopted “carafe” from French in the late 18th century.

If carafe is perfectly good French, why would a French translation use pot en verre (glass jar) instead? Have the French dropped the carafe, or what?

We asked our Parisian correspondent this question, and he thinks this is simply a bad translation. Here’s his reply:

“I have found commercial uses of the following for the carafe of a Mr. Coffee-type machine: (1) verseuse (from verser, to pour), a container with a spout of some sort; (2) carafe, for “carafe”; and (3) bocal, usually a jar (like a canning jar, a pickle jar, etc.) or a fishbowl, conveying the notion of a round glass container.

“Of these, verseuse is most common in the profession, while carafe is quite common in ordinary use. I myself would say carafe. A pot en verre or a pot de verre would be a bocal.

“So why not use carafe? Because most translations are crap. Businesses use machine translation, or they look around the world for the lowest bid on translation auction sites. You get what you pay for, and businesses that produce such basic stuff as consumer appliances don’t want to pay a lot.

“In addition, translations can go through strange channels. I could easily imagine that there was a first poor-quality translation from Mandarin to English that produced something like ‘glass bowl.’ That may have been corrected later on by the North American distributor, while the original bad translation went to a translator (perhaps a machine) that simply translated the original bad English pretty much word for word.

“I have had to translate into French some manuals that were written in such bad English that I hadn’t a clue as to what they were talking about. When it’s for electrical equipment, it makes me worry….

“In short: for a product description, the best choice would have been verseuse, while carafe would have been perfectly acceptable. It’s for that reason that I’m betting the French translation was done by a non-native speaker, based on an original bad English translation.”

That solves the translation mystery, and it explains why the instructions we’ve gotten with some appliances have been so perplexing. (Translation auction sites? Yikes!) Now for a bit of etymology.

“Carafe” first showed up in English in 1786 as a French borrowing, but it also has equivalents in Italian (caraffa), Spanish (garrafa), Portuguese (garrafa), and Sicilian (carrabba).

Some scholars, according to the Oxford English Dictionary, have also associated it with Persian (qarabah, a large flagon), and with Arabic (gharafa, to draw or lift water, and ghuruf, a little cup).

If you read novels of bygone days, you may have come across an older word, “carboy,” which means a large bottle covered with protective basket work. This word and “carafe” are probably distant relatives, since “carboy” is from the Persian qarabah and may also be related to the Arabic qirba (a large leather bottle).

But back to “carafe.” It entered English first in Scotland and later in England. Ever since, it has meant a glass bottle used for water—either at the table or at the bedside—or for wine.

The word’s association with coffee began in the early 20th century, when the OED says “carafe” came to mean “an insulated jug for serving beverages, esp. coffee; (hence, also) such a jug which is an integral part of a coffee-making apparatus.”

The OED labels this sense of the word as originally and chiefly North American. Oxford’s earliest citation is from a 1911 ad in the New York Times for “Vacuum carafes, $5.”

A more recent citation can be found in Nicholson Baker’s novel A Box of Matches (2003): “Then you rinse out the filter basket and the carafe.”

Q: I’m increasingly irritated by the growing use of “Really?” at the beginning or end of newspaper headlines (e.g., “Really? Squatter lives in Ann Curry’s townhome”). I suppose there’s nothing technically wrong with it, but it seems a slangy and childish construction. I hope this fad soon fades. Just what is “really,” really?

A: The word “really” really gets a workout in the news, not only in headlines but in broadcast journalism.

The writers of Saturday Night Live, for example, spoof this tendency in occasional Weekend Update segments called “REALLY!?!”

The headline example you cite (“Really? Squatter lives in Ann Curry’s townhome”) comes from the Orange County Register.

The California daily regularly uses the word as an apparent play on “real estate” (“Real estate news and views from around the globe that make you go, ‘Really?’ ”)

The New York Times regularly uses “Really?” with headlines on a column about health facts and fictions: “Really? The Claim: To Prevent Migraines, Drink More Water.”

But “really” shows up in a lot of other Times headlines. A search of the newspaper’s archive finds 181 “really” headlines over the last 12 months.

The adverb “really” may sound slangy or childish to you, but it’s been standard English for hundreds of years.

The Oxford English Dictionary defines the word, which entered English around 1425, this way: “In reality; in a real manner. Also: in fact, actually.”

In the 18th century, the OED says, the word took on the sense in the example you cite: “Interrogatively, expressing surprise or doubt.”

The OED’s first example of this usage is from Samuel Richardson’s novel Sir Charles Grandison (1753-54): “ ‘The Count of Belvedere. He was more earnest in his favour — ’ ‘Really?’ ‘Yes, really — than I thought he ought to be.’ ”

We’d describe the use of “really” in those headlines above as casual or informal, not slangy or childish.

But we agree with you that headline writers have been giving “really” a real workout lately. They really ought to give it a rest.

Q: I heard an interview with the historian Henry Louis Gates Jr. the other day and I swear he pronounced the first syllable of “chimera” like the beginning of “chicken.” Is it just me, or what? I must break off now and return to my chi-square calculations.

A: The word “chimera” begins with a “k” sound, as in words like “character,” “chasm,” and “Christian.” The accent is on the second syllable: ki-MIR-uh.

This is the only pronunciation given in standard dictionaries, as well as the Oxford English Dictionary.

The chimera, according to the Oxford English Dictionary, was “a fabled fire-breathing monster of Greek mythology, with a lion’s head, a goat’s body, and a serpent’s tail (or according to others with the heads of a lion, a goat, and a serpent).”

The word in Greek means “she-goat,” and the fact that it comes from Greek accounts for its pronunciation.

In Greek writing, the word begins with X (the letter chi), which is pronounced like “k.” In English words that come from Greek, the “ch” letter combination is usually pronounced like “k.”

This is why the words “Christ” and “Christmas,” for example, begin with a “k” sound (for the Greek X).

In translating manuscripts from Greek, medieval scribes often substituted “X” for “Christ” in words like “Christmas” (“Xmas”) and “Christian” (“Xian”), as we wrote in a posting a few years back.

The word “chimera” was first recorded in English (spelled “chymere”) in the Wycliffe Bible of 1382. Back then, it meant the monstrous creature of mythology.

Later, it was used more loosely to mean any grotesque monster or phantasm.

And in the 16th century, the OED says, “chimera” acquired its modern meaning:

“An unreal creature of the imagination, a mere wild fancy; an unfounded conception.”

Q: Long before Pink Floyd, people were confusing the “dark side” of the moon with the “far side.” The “dark side” is the one without sunlight, but many think it’s the side we can’t see—that is, the “far side.” It seems to me that the confusion began in the 20th century, but I wonder if you have any information on this.

A: The dark side of the moon, as you say, is the one that faces away from the sun at any given time. The far side is the one that faces away from the earth. Only during a full moon are the dark side and the far side the same side.

It’s understandable, however, that many people refer to the far side (the one we can’t see) as the “dark side.” The adjective “dark” has meant hidden from view or knowledge since Shakespeare’s day, according to the Oxford English Dictionary.

We don’t know exactly when the lunar confusion began, but a search of the America’s Historical Newspapers database suggests that it existed at least as far back as the early 19th century.

In fact, the earliest published reference for “the dark side of the moon” in the database, from the Oct. 1, 1810, issue of the Rural Visitor, a Burlington, NJ, newspaper, seems to be an example of confusion:

“Men may be found possessing great professional knowledge, much integrity, and yet be as utterly unnoticed as though they tenanted the dark side of the moon.”

Most of the 19th-century examples, however, are from articles about eclipses and other astronomical events, and the writers use the term “dark side” properly.

As you suggest, the confusion grew in the 20th century. The July 15, 1915, issue of the Kansas City Star, for instance, has an example in its serialization of Winnie Childs: The Shop Girl, a novel by C. N. and A. M. Williamson.

In the Williamsons’ work, “the dark side of the moon” is described as “the side about which people seldom troubled and never saw.”

As for the 1973 Pink Floyd album, The Dark Side of the Moon, it’s probably responsible for some of the recent confusion.

But from what we’ve read, the band chose the title as an allusion to the dark side of lunacy, not of Luna.

Q: When Pat was on WNYC, a caller suggested Charles Dickens as the source of “What the dickens!” Actually, it was Shakespeare. Here’s the exchange, from The Merry Wives of Windsor. “Mrs. Page: I cannot tell what the dickens his name is my husband had him of. What do you call your knight’s name, sirrah? Robin: Sir John Falstaff.”

A: Well (as Falstaff once said … we think), whaddya know!

OK, Shakespeare used the phrase more than two centuries before Charles Dickens saw the light of day. But the Bard wasn’t necessarily the first person to use it.

So who the dickens is responsible for all the exclamations that feature the word “dickens”?

The American Heritage Dictionary of the English Language (4th ed.) says “dickens” used in this sense is a euphemism for “devil,” influenced by the name Dickens.

So Charles wasn’t responsible for the usage, but the surname “Dickens” may have had something to do with it.

Q: You’re cruel to us economists in your 2007 posting about “monies.” What would we do without “monies,” particularly “near monies,” which is by now part of the standard definition of M2? And how else can the IMF refer to the “monies” of member nations, or the Bank for International Settlements refer to central-bank “monies”? In other words, what shall we call the various categories of money if not “monies”?

A: That entry about “monies” (as well as a later one in 2009) was obviously NOT aimed at economists, though it seems to us that they often use unnecessarily opaque language when communicating with lay people.

Of course “monies” is a legitimate word. As we note on our blog, it’s in standard dictionaries.

Obviously, as you point out, it has technical applications that aren’t germane to the layman. Our point was that plain old “money” should be used where possible.

Since you brought up near monies and M2, a brief gloss for any readers who aren’t economists or financial-news junkies: “near monies” are highly liquid assets, such as savings deposits, that can readily be converted into cash, and M2 is a measure of the money supply that includes cash, checking deposits, and near monies.

Q: I’m stumped by the following: “You would think a mind-warping full moon was/were beaming bright.” The word “was” sounds better to me, but “were” looks better. Which is correct?

A: “Was” is correct: “You would think a mind-warping full moon was beaming bright.”

“Would think” is the simple conditional form of the verb “think.” The auxiliary “would” is not a signal that the verb in the final clause should be in the subjunctive mood (“were” instead of “was”).

One might be tempted to use the subjunctive in a case like this because the subjunctive is the mood of the hypothetical—of speculation and possibility.

But your sentence doesn’t call for the subjunctive. We’ve written about the subjunctive several times on our blog, most recently in a posting last February.

As we say in those blog entries, the subjunctive is used today primarily for three purposes:

(1) To express a wish: “I wish a mind-warping full moon were beaming bright.”

(2) To express an “if” statement about a condition that’s contrary to fact: “If a mind-warping full moon were beaming bright, I’d howl at it.”

(3) To express that something is being asked, demanded, ordered, suggested, and so on: “A werewolf movie demands that there be a mind-warping full moon.” (The subjunctive “be” is used here instead of the indicative “is.”)

It’s worth noting, as we say in our blog entries, that the subjunctive was once much more commonly used in English than it is today.

Some archaic usages have survived, and these account for such phrases as “lest she forget” (instead of “forgets”), “God forbid” (instead of “forbids”), “come what may” (instead of “comes”), “the powers that be” (instead of “are”), “suffice it to say” (instead of “suffices”), “come spring, we’ll meet again” (instead of “comes”), and “long live the Queen” (instead of “lives”).

We’ve also noted on our blog that the subjunctive is losing ground in British English, though it’s holding its own (for now) in standard American English.