PULLUM ON THE PASSIVE.

For years, Geoff Pullum has been carrying on a war against the people who carry on a war against the passive voice without having the faintest idea what it is, and this piece for the Chronicle of Higher Education is a beautiful distillation of it. He quotes “a colleague and friend with an American doctoral degree” who read a draft he had written, “cast a disapproving eye on a couple of passive clauses (correctly identified, I should note), and stressed that she herself tried to avoid passives.” He is delighted: “an empirical claim by a self-identified passive avoider! My colleague, you see, has an excellent and well-written book to her name—a record that could be checked.” So check it he did, learning that “26 percent of the transitive verbs in that five-page preface [to her book] are in passive rather than active clauses,” versus an average of about 13 percent passives in newspapers and magazines: “And here we have double that percentage, in the writing of an academic who imagines that she avoids passives!”

But this is where modern American writing instruction has brought us. Totally unmotivated warnings against sentences that have nothing wrong with them are handed out by people who (unwittingly) often use such sentences more than the people they criticize. And the warnings are consumed by people who don’t know enough grammar to evaluate them (which is why the percentage of passives in published prose continues basically unchanged over time). The blind warning the blind about a danger that isn’t there.

Of course, the suggestion to try and avoid the passive where it is not really necessary is a good one. The passive all too easily becomes a stylistic mannerism, and weeding passives out with a practised eye can help improve sloppy writing. The problem arises when this handy suggestion is served up as an iron-clad rule of good writing.
It seems to me that people crave rules that they can teach, and follow, to make them into good writers. A couple of months ago a commenter on a Language Log post pointed out that many of his (or her) students used ‘as’ in the meaning of ‘because’, which he felt created confusion with ‘as’ in its temporal meaning. He therefore instructed his students to avoid ‘as’ in the meaning of ‘because’. Since the person in question appeared to be a writing instructor of some kind, it was easy to see that here was a new prescriptive rule in the making. A critical mass of students going out telling their students that ‘as=because’ is wrong could easily lead to us being saddled with another superfluous rule of English writing.

@Bathrobe, re “the suggestion to try and avoid the passive where it is not really necessary is a good one”: I’m really not sure about that. Students often try to write in a formal style without having a good sense of what a formal style really consists of, and therefore they end up sounding ridiculous when they mean to sound smart. Since they’ll have noticed that formal writing uses the passive voice in many situations where a casual, conversational style would not (since formal writing relies more heavily on sentence structure for focus/topicalization/foregrounding effects), some students may end up using the passive voice “hypercorrectly”, in situations where even formal writing would not; but I doubt that telling them, “try and avoid the passive voice where it is not really necessary” will actually help anything, because they will have no sense of when the passive voice is “really necessary” in the style they’re aiming for.
(I would venture a guess that in most cases, the problem is likely to be that the student simply hasn’t read enough. If so, then I doubt that generalities couched as writing tips can compensate for that lack.)

Of course Pullum deserves respect for his immense erudition and industry. I am less impressed with his attack mode, which he is often in. I draw the line at the mock death threats he has issued (mentioned at this blog), though others seem to appreciate the joke.
The topic here is use of the passive voice by a published academic writer who claims to deprecate it. Well found, and satisfyingly censured. But as I have also pointed out, here and elsewhere, Pullum’s strictures do not always fit with his own writing practice. Another of his bugbears is the “that–which” distinction for relative constructions, much followed in the US but with adherents everywhere. I favour it avidly, despite its supposedly “artificial” provenance. So what if it was elevated in guides to good writing from a mere tendency to a principle of lucid expression? It works as one, and has no stylistic or logical downside of which I am aware – used in a suitably nuanced way, as any principle of good writing ought to be.
Pullum rails against it again and again. But his own writing conforms very closely to it. Without surveying his whole oeuvre, I have found clear exceptions only in writing he does collaboratively (as with Huddleston in CGEL).
The ideologically committed refusing to acknowledge a valuable principle that is there. Sure, it is not enshrined as a universal feature of English grammar. But that’s a straw man; no careful writer on the subject suggests that it is such a feature.
O, and hi everyone!

no logical downside of which I am aware
Its logical downside is that it’s not how the word which is used by untold millions of native speakers, and that you’re elevating a (very recently invented) stylistic preference into a prescriptive rule.
I’m wholly with Pullum and the descriptivists. Why on earth anybody would want to eliminate a word choice that (or which!) has been used in English for more than a millennium without causing anyone the least confusion or bother is utterly beyond me.

Its logical downside is that it’s not how the word which is used by untold millions of native speakers
Can stylistic advice to distinguish that and which be construed as illogical on such grounds? Millions of people use phenomena as a singular; millions say being that X as opposed to seeing that X, or thusly when they might say instead thus, therefore, hence, or so. Millions of people write it’s instead of its …
Millions of people do such things. It is because a good proportion of those same millions want advice that guides to style are published. They sell well. Some are excellent and widely respected. If some are stupid, bear in mind that some firmly descriptivist linguists are not the brightest luminaries in the room, also. Human nature and capacities manifest themselves variously and circuitously. To prescribe against all that has been carelessly branded “prescriptivist” is not something automatically to be thought intelligent; on closer examination it may be judged hypocritical. Slogans are usually slanders.
Pullum and Huddleston’s Cambridge Grammar of the English Language (CGEL) is a superb grammar. It has its place in the firmament of great scholarly resources. But it is not the last word on how to write well, nor even the first. It doesn’t intend to be. Other works fill that role, and they too have their place. They will always outsell CGEL, because they are written to meet the genuine quotidian needs of millions, not the needs of the few thousand academics who are interested in what CGEL offers.

Why on earth anybody would want to eliminate a word choice that (or which!) has been used in English for more than a millennium without causing anyone the least confusion or bother is utterly beyond me.
Utterly beyond you? Your words, not mine. ☺
Who wants to eliminate anything? As far as I can see, many in the Pullum camp would want to eliminate the giving of sound advice – evidence of whose soundness is furnished by Pullum himself, who implicitly follows it in his writing. Go, read him. I’m not inventing this. As for causing confusion, I have shown examples in previous comments here. We can compare our close analyses sometime, if you like.

Specious arguments. The notion that restrictive which is comparable to singular usage of phenomena and other demotic grammar gaffes is ludicrous to me. The idea of “no which with restrictive clauses” was essentially invented out of whole cloth by H.W. Fowler in 1926. It has no connection with the best writing or writers in English, most of whom have demonstrably used which on occasion to introduce restrictive clauses. The only conclusion that can fairly be drawn from the fact that that is more frequently used is precisely that it “is more frequently used.” Nothing more.
I haven’t seen your “previous comments,” obviously, but am not particularly interested. (If they’re at all typical examples, any ambiguity disappears with a simple comma.) It’s my experience that for those for whom that/which is a bugbear no amount of evidence mustered will ever be sufficient – if Pullum doesn’t persuade you, I certainly won’t – so I won’t be drawn into an extended debate about an “issue” I consider completely fatuous.

As for causing confusion, I have shown examples in previous comments here.
Heh. A sentence out of context changes color like a lobster in hot water.

So what if it was elevated in guides to good writing from a mere tendency to a principle of lucid expression? It works as one, and has no stylistic or logical downside of which I am aware – used in a suitably nuanced way, as any principle of good writing ought to be.
Just so. There will always be those who want rules for everything, and others who reject even the semblance of control. Me, I’ll take any carefully cooked crustacean, even when it requires a bit of shelling.

A most conciliatory comment from a normally crabby Stu.
Emotionally my sympathies lie with laowai since I, too, find the unforced usage more congenial, but on the other hand I don’t agree with the animus shown towards Noetica for liking a clear ‘that/which’ distinction. Observing this distinction can contribute to clarity and discipline in writing, so it is not something to be dismissed as baseless and useless.
As for Ran’s observations, I’m not sure that wider reading is the answer, either. Wide reading can just as easily reinforce bad writing since people can become blinded to their own faults and habits. In order to make people more conscious of their style, it is helpful to point things out.
As for teaching youngsters the art of good writing, I’m afraid I have little experience in that field. My comments were more aimed at people who have already formed a style but need to improve their writing. For me, being told to look out for overuse of the passive voice is a useful reminder of how I could be cleaning up my prose.

It doesn’t intend to be.
I say, Noetika. Steady on. What’s wrong with the passive?
G: There will always be those who want rules for everything, and others who reject even the semblance of control.
Yes, that’s it. And there will also be those who won’t even be aware that someone’s set up some rules in another part of the woods.
Commenters get upset here and at Language Log because of a paradox: a guy writes a book of rules and then he gets cross when people start pointing out other rules. He says they aren’t as good as his rules, and they get very sad.

The “causing confusion” thing is just silly. If Noetica, or Strunk, or anyone else wants to follow a fairly arbitrary set of rules for “that” and “which,” if it makes them feel better about their prose, more power to them. We all have our preferred stylistic mannerisms. It’s when they start pointing the finger at other people for not following their arbitrary rules (and “stylistic advice” amounts to just that) that they lose my sympathy (and that of anyone with a scientific attitude toward language).

It is no argument against someone who argues for a freedom that they don’t exercise that freedom themselves. I, for example, think the possession and sale of marijuana should be entirely legal; it’s just irrelevant that I would never smoke it myself.
In addition, I would say that people don’t so much want advice as fear criticism, and that the people who sell style guides as an aid to writing well are basically selling snake oil.

the people who sell style guides as an aid to writing well are basically selling snake oil.
Is that really what you think about style guides? I think criticism of people’s writing style can be very useful. Bathrobe’s example of a class where everyone writes as instead of because – that would drive me crazy if I were their teacher, not that I claim to be a good teacher – is the teacher supposed to remain silent or shouldn’t she at least point out that there’s an alternative (a style guide)?

Pointing out an alternative is fine. Maybe even saying ‘I don’t like “as” for “because” so don’t use it in this class.’ But saying that a correct usage is wrong–that’s evil, particularly when the prejudice is backed up by nonsensical arguments. (For what it’s worth, that use of ‘as’ sounds fine to 62-year-old me.)

I pretty much agree with (the current state of) Wikipedia, which points out that the distinction of restrictive/non-restrictive clauses in spoken English isn’t one of pronoun but one of prosody.
I also agree that many people desire clearcut rules they can follow instead of having to use judgement.
The problem is that AFAIK no Anglophone country does a very good job of teaching English-speaking children how their (or any) language really works. The disconnect between traditional grammar and the real language (and the lack of foreign language instruction) means that most anglophones, even super-literates, can’t reasonably talk about issues of usage and ‘correctness’. Taught to distrust their excellent intuitions, they fall back on imaginary rules.
The upshot is that paying attention to style guides and what your high school English teacher said will not help you obtain ‘standard’ (that is, high prestige) English. You need to be born into it or find some other way to it. I think that’s how the system is supposed to work (sadly).

I taught college level remedial writing for a program that was close to 100% prescriptive. There would be test sentences like “Neither the two boys nor their sister [was / were] ….” etc. It was pretty much worthless, because the problem these kids had was that they had no ability to formulate, develop, and present their ideas, and often they were crippled by fear of making mistakes in a classroom context which they felt was hostile. (These were mostly white kids, as it happened). Some of them were capable of speaking very well in their local class dialect.
Prescriptivism is strongest in classrooms and in editorial offices of various sorts. Whether it does more good than harm I’m not sure. 400 page style manuals fill me with revulsion, though.
I think that one reason for the popularity of prescriptive grammar is that it’s the kind of thing that can be taught, whereas it’s damn hard to teach writing, and impossible in many cases.
Another aspect I’ve mentioned is that the elimination of local / class dialect forms is part of the process by which many young prospective teachers make it up into the genteel class, and they see their purpose as being helping their students to do the same as they did.

Style guides as a means of achieving uniformity in a publication with many authors, like a newspaper or magazine or scholarly journal, are not snake oil, though they frequently seem to be silly (the AP Style Guide certainly is; I have never read Chicago.) Nor is style guidance from a competent teacher useless. But I simply don’t believe that reading a style guide will improve your writing significantly. I wouldn’t recommend MW(C)DEU, for example, in order to improve anybody’s writing, though it has the virtue of being reality-based rather than fantastic.

The notion that restrictive which is comparable to singular usage of phenomena and other demotic grammar gaffes is ludicrous to me.
Depends on the purpose of the comparison. If it’s to portray restrictive whichs as having become no more standard than singular phenomena, then I (and I’ll bet Noetica, too) would agree; but that wasn’t the purpose of Noetica’s comparison. The question he raised was not of rightness or wrongness, or of standardization, but of the utility of stylistic advice. And the purpose of his comparison was to show that his question can’t be answered on the grounds of popular usage. A card-carrying descriptivist, I nonetheless agree with him.
Noetica might just be better versed in matters of punctuation and usage than anybody in these parts, and that’s saying a lot. His arguments in favor of the which/that distinction — in favor of its potential utility, that is, not a mindlessly enforced ukase — are more sophisticated than you assume, trust me. You say that you’re not interested in his previous comments, but you should be. An open mind is a good thing.

I have traditionally found people devoted to crackpottish that/which distinctions which do not exist in my idiolect (and which I’m skeptical really exist in their own idiolects in a fully internalized/automated way) to be most aggravating, but in the interests of open-mindedness, I will ask jamessal: granted that “utility” is a separate claim from grammaticality or empirically common practice, but stronger than a mere claim of subjective taste, how is the plausibility of a particular claim of utility to be evaluated? Separately, would the alleged potential utility of a rigid that/which distinction in practice require as a precondition a critical mass of readers who have been primed to expect it and who thus correctly interpret the signal that is intended to be transmitted?

granted that “utility” is a separate claim from grammaticality or empirically common practice, but stronger than a mere claim of subjective taste, how is the plausibility of a particular claim of utility to be evaluated?
With abundant example sentences which the distinction in question, if applied, would make more lucid.

would the alleged potential utility of a rigid that/which distinction in practice require as a precondition a critical mass of readers who have been primed to expect it and who thus correctly interpret the signal that is intended to be transmitted?
I think so. And I think that’s why Noetica often makes the point that the distinction’s provenance is less important than its prominence, if you follow me.
Just to be clear, I’m on your side of this debate. I don’t rely on the distinction in my own writing and I don’t generally prefer the writing of people who do. It’s just that having gone toe-to-toe with Noetica in the past over this issue, I’ve come to respect his position.
A side note: why did you invoke idiolects? We’re talking about writing, not speech. I ask because there might be underlying assumptions of interest.

Not all the old prescriptivist rules are taught any more. I didn’t even hear about passive voice from a teacher until AP English, 11th grade, and my mom was the only one to teach me about ending a sentence with a preposition. Granted, both rules are hogwash, the second more so than the first, but knowing them allows you to make more conscious decisions about usage and introduce more subtlety into how you portray yourself.
Someday maybe the schools will teach the rules of traditional Snootish, explain the linguistics behind local dialect, and offer courses on AAVE. That would be the ideal. It is not an improvement to limit knowledge, even if that knowledge is of artificial and sometimes clumsy conventions.

would the alleged potential utility of a rigid that/which distinction in practice require as a precondition a critical mass of readers who have been primed to expect it and who thus correctly interpret the signal that is intended to be transmitted?
Sorry, I’m pasting this again because I missed the bolded word earlier. Noetica isn’t advocating rigidity, just a principle “used in a suitably nuanced way, as any principle of good writing ought to be.” The answer to your question is still yes, however: people have to be aware of the distinction for it to increase clarity, but since the distinction is in fact prominent, that doesn’t matter much.

people have to be aware of the distinction for it to increase clarity, but since the distinction is in fact prominent, that doesn’t matter much.
Hmmm…. now that I’ve written this, I’m not sure if it’s true — or, more importantly, since I’m really just trying to do justice to Noetica’s argument, if he thinks it’s true. It might be his position that a writer’s consistent adherence to the distinction in question would add lucidity whether or not readers were aware of the “rule.”

Bathrobe: It’s funny, I pronounce my handle JAMES-SAL, equal stress, the SAL rhyming with MAL, as in malpractice; but my real name, James Salant, from which I thoughtlessly derived the handle years ago for my first email account, is pronounced JAMES suh-LAHNT — although most people call me Jim. Hey, that wasn’t so funny after all.

Yeah, that was a good one, Bathrobe. I just enjoyed rereading my own evisceration of Garner — an over-strong, none-too-humble verb, sure, but why can’t I be a little proud? It was a hell of a thread, and I see it was you who had the last word, somewhere around comment 750!

Noetica, so good to see you again!
When I first saw “Jamessal” I tended to pronounce it inwardly to myself as ja-MESS-al (I didn’t even think of “mess-all” until right now) while feeling that it was probably wrong. It took me a while to realize that the first word must be James.

“Different than” or “different from”?
Before Hat converted me to descriptivism, I forced “different than” out of my idiolect and “different from” in, and now I can’t go back, and I hate prescriptivism for it. “Different than” sounds strange to me, and it shouldn’t: it’s one fewer tool in my writer’s kit — it just about never makes its way into final drafts — and it’s a ubiquitous, perennial distraction in my reading. I made a similar point in that thread we’ve been discussing:

[W]ith their incessant ordering of an imagined language system, prescriptivists make it harder to read well. They make it harder to consider sense and euphony as opposed to their system, in which they pointlessly suppose only one lexical item should align with one concept… A sentence is not necessarily worse for including “inimicable” (a variant of “inimical”) or “proven” as the past participle of “prove,” but I’d immediately think it was if I’d been influenced by [Bryan Garner, author of Garner’s Modern American Usage], and it wouldn’t occur to me that the extra syllable might be felicitous.

“stylistic advice”… amounts to pointing the finger at… people for not following… arbitrary rules
Hat, I’m sorry I skipped over that before — I was so sure we saw eye to eye on prescriptivism, I found that line shocking. Surely it’s the general shoddiness of the whole damn de haut en bas prescriptive canon that we find despicable — the fatuity, the piss poor logic, the false appeals to science and ethics, the meanness, the bullying, etc., etc. — surely it’s Prescriptivism that we find despicable, and not the idea of stylistic advice itself? Surely Richard Lanham has written books of some value, and John Cowan’s revised S&W would do a student more good than harm? Please, Hat, reassure me that the noxious comma nazis haven’t made you so cynical you’ve given up on the whole enterprise of teaching style! Say it ain’t so!

Well, that’s a lot of commentary. Unfortunately some of it responds to points that were not made here, as if they had been made. I have already mentioned a straw man; I could do so half a dozen more times.
Hat, you write:

The “causing confusion” thing is just silly. If Noetica, or Strunk, or anyone else wants to follow a fairly arbitrary set of rules for “that” and “which,” if it makes them feel better about their prose, more power to them. We all have our preferred stylistic mannerisms. It’s when they start pointing the finger at other people for not following their arbitrary rules (and “stylistic advice” amounts to just that) that they lose my sympathy (and that of anyone with a scientific attitude toward language).

Now, I have just deleted a long reply to that. I gave up. It would have to descend into fault-finding that I did not come here to engage in (except against Pullum’s paradoxical tirades). I find too much in it that is palpably unfair and suggestive of misreading – too much that has no bearing on what I say, think, or do. I’ll leave it.
I love this blog, and I check it regularly; but I have a couple of reasons to stay away. I’m undisciplined, and likely to be drawn into some fantastical thread of unprecedented length if I stay too long. I like the people very much, and appreciate being liked by them.
Some other time, then.

fault-finding that I did not come here to engage in (except against Pullum’s paradoxical tirades)

I fail to see the paradox in GKP’s post. You can rail as much as you want against the enforcement of a rule, especially when framed as one of (deterministic) grammar rather than one of (probabilistic) readability, while not violating it yourself. John Cowan has given a flawless example.

[CGEL] is not the last word on how to write well, nor even the first. It doesn’t intend to be. Other works fill that role

It would be a good world where something like the “last word on how to write well” existed, but it would entail that we find a perfect algorithmic representation of our cognitive processing of language. I take it you’re not suggesting that’s the case.

Please, Hat, reassure me that the noxious comma nazis haven’t made you so cynical you’ve given up on the whole enterprise of teaching style! Say it ain’t so!
No, no, I got carried away by the exhilaration of the argument. Style advice is fine, of course, as long as it’s couched in terms of “this may make your writing more easily comprehensible” or “this will help you avoid irritating the reader” rather than “this is right and that is wrong.”

I find too much in it that is palpably unfair and suggestive of misreading – too much that has no bearing on what I say, think, or do. I’ll leave it.
I apologize for any offense I may have given; I hope you know that I and (as far as I can tell) everyone else here are extremely fond of you, and I trust you’re just leaving that particular exchange for the moment and not washing your hands of the whole business. Your contributions are deeply valued, and jamessal was quite right about your being “better versed in matters of punctuation and usage than anybody in these parts”; I’ve lost points trying to fence with you more than once.

If something is a style rule, one person may go one way while another goes the opposite way, without either one being wrong. Styles also fall into categories, e.g. plain vs. elegant, common vs. pretentious, NPR radio vs. country western radio, etc. They can be used to define fictional characters. Authors and speakers can mix styles for effect.
I always argue in these threads that prescriptive grammarians are almost always teaching a class style. On the one hand, this is elitist, implicitly rejecting the local class dialect/style. On the other hand, it can be useful for a student trying to go out into the wider world. I think that that’s the nub of it there. In a way, prescriptivists are showing their non-elite, upwardly mobile, self-made, schoolteacherish origins when they insist too strongly. It’s often the case that the upper classes deliberately break the rules that distinguish the parvenu middle classes from the lower classes.
It was perfectly normal around here to say “He don’t got none” when I was a kid, and maybe still, but that kind of speech would have done me a lot of harm as a freshman in college.

Here’s a funny prescriptive story (Attn: AJPC). One of my teachers in middle school insisted that “kids” were only young goats, not children. No one took her seriously, except maybe when writing formally. A while back I was doing business with a woman when she casually mentioned her children. I asked, politely I think, “How many kids do you have?” She did a little double take, gritted her teeth, and then said, “I have ten children, and about 40 kids”. She was a goat breeder, and she cared about the rule.

I used to know a couple who had two children, about 6 and 2 years old. They told me that one day they were discussing the possibility of raising goats, and among other advantages the woman said “we could eat the kids” – only to see a look of horror on the 6-year-old’s face.

Polish immigrant and Catholic. Her kids went to work young.
Normally “How many kids do you have?” is a polite throwaway line, like talking about the weather, but when someone has 10 kids, it’s potentially insulting, as though you think there are too many.

A while back I was doing business with a woman when she casually mentioned her children. I asked, politely I think, “How many kids do you have?” She did a little double take, gritted her teeth, and then said, “I have ten children, and about 40 kids”. She was a goat breeder, and she cared about the rule.
A prescriptivist child-fetishivist. I shudder.

Just in response to jamessal’s question, I suppose I was talking about both speech and writing. Obviously much of this advice-literature, whether beneficial, baleful or merely useless, is focused on writing and/or transmitted in a school context dealing with written work. But if systematic rules dictating whether to use that or which in a given context really promoted “lucidity,” surely they ought to be (non-coercively) urged for speech as well? I guess I use the word “idiolect” as equally applicable to someone’s native or natural mode of writing as well as speaking, in terms of what syntactic etc. rules their first-draft writing (adjusted for register etc.) will or will not comport with before they go back and try to self-consciously edit it to conform to some stylebook whose norms they have failed to assimilate/internalize. If idiolect is best used only to refer to that sort of thing (plus phonological stuff) in speech, I am open to education as to what the parallel term for an individual’s writing might be.

if systematic rules dictating whether to use that or which in a given context really promoted “lucidity,” surely they ought to be (non-coercively) urged for speech as well?
No, I don’t think so. It’s my lay impression that speech is generally far simpler syntactically than writing, such that instances of referential ambiguity — which adherence to a which/that distinction might clarify — rarely arise in spoken sentences. If you follow the link to the thread where Noetica and I argued about this earlier, you’ll see example sentences of the sort he thinks the distinction would improve, and it’s hard to imagine anyone but maybe Henry James speaking those sentences off the cuff.
More on speech vs. writing and idiolects later.

Thanks to LH (whose firm, luminous style I admire but cannot equal) for conciliatory words. I don’t mean to appear or to be primadonnic (prim-hedonic), among such friends. Exasperating, ugye? I suppose I should stay away exactly when the topic of pre- and descriptiveness comes up, because I think it is perennially mismanaged by those in either trench. I am assigned by some to one trench, by some to the other. I’m in neither; and in Somme, I think it’s all a huge misunderstanding. Pullum – let me say probably for the last time here – has no excuse for his own shovel-ready partisanship.
Soon I’ll email my favourite hatter to exchange what small intelligences we are happy to share; and I am grateful for other correspondence already received.
More another day.

“I always argue in these threads that prescriptive grammarians are almost always teaching a class style.”
I agree and disagree.
Agree: There is a specific register of English required for those who aspire to upward social mobility. This in and of itself is not a bad thing.
Disagree: Following the advice of English teachers (teaching anglophone students), style guides, and prescriptivists in general will not lead to mastery of said style. It will mark you as someone trying (and failing) to acquire said style. At present, for a variety of reasons, if you’re a native speaker of English your only realistic chances of gaining mastery of the prestige variety are (a) being born into it or (b) having better than average second language (or dialect) acquisition skills.
I am in favor of honesty in labelling, of saying “this is a kind of language you need to use in certain (not all) situations if you want to thrive materially”. But prescriptivists don’t say that; they say usage that doesn’t conform to said style is ‘ungrammatical’ or ‘incorrect’ (or worse).
Many years ago (I forget where) I read that, unlike the usual European example where normative grammars arose out of cultural identity issues, normative grammar in England (which was exported with the language itself) arose out of class issues. Namely, there was tension between those who wanted to acquire prestige speech to go with their upwardly mobile ambitions and those who already had prestige speech and wanted to make its acquisition by outsiders as difficult as possible.

Relating to the thread ‘A Draft of Mandelstam’, I just got back from Hohhot, where I bought a number of books related to Mongolian. One of them was a book of poetry from around the world translated into Mongolian, and one of the poems translated was our friend ‘Insomnia… Homer… taut sails’. Having offended read at the other thread with silly comments about ‘sleepy backward cultures’, it mortifies me to say that the book in question is (apparently) an Inner Mongolian edition of a book originally published in Ulaanbaatar. (It appears that quite a few books from Mongolia are reissued in Traditional Script editions in Inner Mongolia, and this is one of them.) Ironically, I was unable to find such a book in Ulaanbaatar itself, because once an edition has sold out in Mongolia it is often not republished and can be very hard to get hold of.
Once I’ve figured out how to read it (it takes me a lot of painstaking effort to understand the traditional script) I will put it up for people to see.

There is a specific register of English required for those who aspire to upward social mobility. This in and of itself is not a bad thing.
Why isn’t it a bad thing? It’s an arbitrary shibboleth, like wearing a suit:
“The dress was ancient formal court dress, shapeless tubular trousers, a silly jacket with a claw-hammer tail, both in black, and a chemise consisting of a stiff white breastplate, a “winged” collar, and a white bow tie. Bonforte’s chemise was all in one piece, because (I suppose) he did not use a dresser; correctly it should be assembled piece by piece and the bow tie should be tied poorly enough to show that it has been tied by hand — but it is too much to expect a man to understand both politics and period costuming.” —Robert Heinlein, Double Star

Yeah, the existence of a formal register is not a bad thing (and is inevitable anyway), but its being “required for those who aspire to upward social mobility” is indeed a bad thing. It cuts out lots of smart, capable people who would make far less of a mess of things than the idiots who get to run the world because of their irrelevant mastery of a formal register (plus, of course, their social connections).

Why isn’t it a bad thing? It’s an arbitrary shibboleth, like wearing a suit:
But to be fair, shibboleths aren’t only conservative and they don’t exclude just in one direction, they work both ways. Tony Blair and other Britons added a glottal stop to sound more user-friendly. Nowadays all the suits wear open-necked shirts. They’re feeble attempts to assimilate, win votes etc., though in my view the shirt thing is a move in the right direction. I’m not so sure about fake glottal stops, though political oratory today is generally less pompous than it was, say, in the 1960s.

I do like environments where wearing a suit is the norm; one less thing for people to overthink when meeting me. Similarly, I wish we had an international standard accent in English for much the same reason.

Learning to tie a bow-tie can be done much more easily than (past a certain age) learning to speak (esp. as to pronunciation) a prestige dialect you were not exposed to in childhood. Uniform-like dress codes may be good or bad for other reasons, but (assuming the necessary budget is not too astronomical) are the sort of “shibboleth” that can promote rather than retard social mobility.

Younger doctors dress much more informally nowadays, Aidan. I’ve even seen middle-aged consultants wearing jogging shoes in a (Norwegian) hospital.
Uniforms…promote rather than retard social mobility.
That’s what they always tell you at school in Britain: the uniforms are to help the poor kids blend in. I always thought it was a lie; it just allowed the school to set the hierarchy instead of the pupils doing it. They had special rugby and cricket blazers (really nice ones, actually, much nicer than the rest of us wore) and prefects’ ties and God knows what else to show who was better than everyone else.

To clarify, having a prestige dialect within a given society that needs to be learned and used if you want to get ahead is not especially onerous in and of itself.
The problem is that in English-speaking countries it’s unreasonably difficult for most people to properly learn the local prestige dialect if they weren’t born into it (being ‘exposed’ to it doesn’t count; I’ve been exposed to lots of British English and can no more use it than I can fly).
Some of the problems are practical. The unphonetic spelling system of English and the variety of vowel systems used by speakers mean that there’s no reliable way for people to systematically learn prestige speech. This is by no means the case in many other languages: in Polish, adopt a spelling pronunciation and you’re good to go.
There’s also the fact that a lot of the guides, official and unofficial, for teaching people the prestige dialect do no such thing. You can follow the rules given by prescriptivists and English teachers and style guides, and you won’t be using the real prestige dialect; you’ll be using a burlesque of it that marks you as an outsider just as much as ‘he don’t got none’ does.
Every indication that I can find says that society overall is just fine with this state of affairs. Otherwise it might change.
I can think of a few things that could/should happen to make it easier for Americans (the case I know most about) to acquire prestige speech and writing, if that’s what they want:
1. remodel the grammar taught to anglophone children along the lines of the grammar taught to ESL students (superior in every way to traditional models). Make it clear that the prestige form is not ‘better’ or ‘more correct’ than what they already speak, but a socially non-negotiable obligation for certain purposes.
2. include learning IPA (or something like it) and practice in transcription
3. make foreign language classes mandatory for students starting fairly young (nb, the goal here is less to make the children proficient in a foreign language than to increase their overall language awareness).
“I wish we had an international standard accent in English for much the same reason”
ugh, I would hate that. I’m not enthusiastic about international use of English. I care about American English but don’t much care about non-American usage (beyond the amusement factor). I realize I’m not necessarily in the majority with that.

What I meant to say is that prescriptive grammar has most of its power at exactly the point where a teacher who has struggled their way up into the middle class from below is dealing with students who have not yet made that rise. Often the teacher wants all of them to do so, but quickly gets angry at the ones who don’t try. The shibboleth settles on class dialect mostly because, as I said, elements of it can be taught (the shibboleths) and the actual goal hoped for is literally impossible. In a class society you can’t bring everyone up to the middle or above. At best the teacher is a gatekeeper who rescues a few individuals while abandoning the others to their fates.
Sometimes prescriptive grammar in the media might be an attempt to keep the lower orders down, but just as often it’s someone who’s just barely escaped the lower orders outing themself as a parvenu. And sometimes it’s someone who had a literary education grumbling about someone richer with a technical education.
Anyway, the fact of class dialect is pretty deeply rooted in the whole way of life.

In Canada most doctors and dentists don’t wear the traditional white lab coat. However, if I have to have some small piece of myself removed from my body, I would prefer that the doctor wear a freshly laundered white lab coat rather than a fuzzy sweater carrying God knows how many other patients’ DNA.

In fact surgical scrubs are no longer white, because blood splashes on white look too lurid, and because white coats in combination with bright operating lights can cause eyestrain. Surgeons therefore switched first to a pale green, and more recently to teal, which looks better on video (many surgeries are now videotaped so that students can observe them without being present).

marie-lucie, You don’t want anyone in a lab coat operating on you, at least not in the U.S. The lab coat is worn over scrubs outside of patient areas to keep the surgical scrubs clean, but it is probably more useful in preventing the spread of germs in the opposite direction, that is, from the patient bedside. The color of the surgical scrubs can also be used to identify the job title of the person wearing it, depending on the hospital. Short lab coats generally identify student doctors. The last time I worked in an OR, quite some time ago, there was also a special color of scrub for the OR that could not be worn in any other area of the hospital. You picked up the scrubs on your way into the OR dressing room and deposited them in the laundry on your way out. Once inside the OR, there was also a gowning procedure for anyone who was “sterile” – an operating room has “sterile” and “dirty” personnel – as well as an extensive handwashing and glove ritual for anyone entering the OR.

Dakota, I was not referring to a major operation in an OR, in which the personnel wears suits of soothing colours and the patient is put under total anesthetic, but to a minor procedure that can be done in a doctor’s office. That’s why I said “a small piece of myself”.

My wife, as it happens, is having a small piece of her breasts removed in her surgeon’s office next week. It’s a pair of blobs or bubbles that remained in the dead center when her breasts were reconstructed after a double mastectomy. They look extremely strange.
