Staci Bernard Roth's Thoughts on Words and the World

It’s been over a month since the 1145th anniversary of his coronation, but Alfred the Great is on my mind. It’s probably because my semester has just ended, and I’ve been reflecting on the past school year. I welcome any wisdom that can help me reflect, and where better to look for wisdom than in the rule of King Alfred?

According to Asser, Alfred possessed a “sapientiae desiderium” (“desire for wisdom”), particularly in the “liberalem . . . artem” (“liberal arts”). His mother, Osburh, encouraged his love of learning when she showed a book to Alfred and his brothers and promised it to whichever son could memorize it first. Alfred, the youngest, won. Years later, he passed down this tradition of learning by providing teachers for his own children and “omnibus pene totius regionis nobilibus infantibus et etiam multis ignobilibus” (“all the noble children in the entire region, and also many common ones”). Making education available to “common” children anticipates much later movements toward universal education.

The young Anglo-Saxon students read “utriusque linguae libri, Latinae scilicet et Saxonicae” (“books in both languages, namely Latin and Saxon [English]”). Alfred was painfully aware of how important both Latin and English were. Although he had read English from a young age, Latin came only later and with the help of “magistros” (“experts”) from “ultra mare” (“beyond the sea”)–whom he rewarded with “magna potestate” (“great power”)–as well as, ultimately, “divino instinctu” (“divine inspiration”).

He knew that he was not alone in his ignorance of Latin. In the preface to his translation of Pope Gregory’s Regula Pastoralis (Pastoral Rule, known more commonly as Pastoral Care), he famously wrote, “Swæ clæne heo wæs oðfeallenu on Angelcynne ðæt swiðe feawa wæron behionan Humbre ðe ðiora ðeninga cuðen understondan on Englisc, oððe furðum an ærendgewrit of Lædene on Englisc areccean; ond ic wene ðætte noht monige begiondan Humbre næren” (“So thoroughly had it [learning] declined in England that there were very few on this side of the Humber who could understand their divine services in English, or even translate a written letter from Latin to English; and I think that there were not many beyond the Humber”). After this demonstration of England’s state of educational disrepair, Alfred points out the dangers of an educational and religious system that relies on a little-known language to transmit knowledge and enjoins the reader to

“[g]eðenc hwelc witu us ða becomon for ðisse worulde, ða ða we hit nohwæðer ne selfe ne lufodon ne eac oðrum monnum ne lefdon” (“remember that punishment then befell us in this world, when we neither loved it [learning] ourselves nor passed it down to other men”). The “punishment” to which Alfred refers is undoubtedly the Viking attacks in which “hit eall forhergod wære ond forbærned” (“it all was ravaged and burned”) and before which “ða ciricean giond eall Angelcynn stodon maðma ond boca gefylda” (“the churches throughout all England stood filled with treasures and books”). Alfred points out that people had received “lytle fiorme” (“little benefit”) from those books, though, because the books “næron on hiora agen geðiode awritene” (“were not written in their own language”). The suggestion is, of course, that the inability to read and the ignorance it fostered were responsible for decades of suffering at the Vikings’ hands.

Although today most would not blame the Viking invasions on God’s anger at the state of education in Anglo-Saxon England, we can still admire Alfred’s prescience and intellect. His life and career reveal that learning was no less important than–and inextricably connected to–other aspects of kingship and citizenship, including military savvy and prowess. This lesson, certainly, is as valuable today as it was in the ninth century.

In a professional development workshop today, someone used the word “relevancy.” I don’t remember who said it or what the context was (thanks to last night’s insomnia), but I do remember the word. I don’t remember anyone ever telling me that “relevancy” is not a word, or that it is an unfortunate or undesirable choice for whatever reason, but I do know that I don’t like it. I much prefer “relevance.” Some sources, such as Grammarist and the folks at the English Language and Usage Stack Exchange, state that the forms are interchangeable, although the former claims that “relevance” is preferred. For what it’s worth, the OED notes the first use of “relevancy” in 1678 and that of “relevance” in 1787. Looking at “-ancy” and “-ance” in isolation, however, reveals more information. According to the OED, “-ancy” comes from the Latin “-ntia” and relates to a “quality or state.” On the other hand, “-ance”–which comes from Latin but via Old French–relates to action. It would seem, then, that if one were to take a prescriptive perspective, “relevancy” would be the preferable word. I guess the only thing left for me to do is to deal with it.

Are there words or forms of words that drive you crazy, even if they shouldn’t?

Over breakfast this morning, I was telling my daughter about an essay I’m writing. “I’ve written the sad part,” I said, “so it’s all downhill the rest of the way.” She was curious about my use of “downhill.”

“Isn’t that negative?” she asked.

“I suppose it can be.” After all, I explained, going downhill is easier than going uphill, but for many people, “down” has negative connotations. (Think: Hell.) The Oxford English Dictionary states that the word can be used figuratively as well as literally, but I wasn’t able to find a definition with positive connotations. The word does, however, appear on some lists of contranyms (or auto-antonyms or Janus words), words that possess two opposite meanings. Recently, in the Chronicle of Higher Education’s Lingua Franca blog, Anne Curzan speculates on whether there’s a difference between “going downhill” and “all downhill from here/there” and reminds us of the importance of context.

Despite some people’s inclination to label “downhill” a contranym (or an auto-antonym or a Janus word), Curzan points out that this label is not quite accurate; after all, “easy” and “bad” are not quite opposites. For now, if I need a label, I’ll refer to “downhill” as “one of those interesting linguistic things.”

This evening, I read an article profiling Deborah Davis, whose business, PhotoAbility, provides stock images of people who happen to have disabilities. The people in the photos are doing the same sorts of things that people without disabilities do, and the photos can illustrate articles on just about anything. Ideally, the use of these images in mainstream media will “normalize” people with disabilities. (Full disclosure: I’ve been living with multiple sclerosis for twenty years, so I have a vested interest in this topic.)

The article was fine, but I wish the reporter had followed Ms. Davis’ example and used people-first language. People-first language is what it sounds like. Instead of writing “disabled people,” for example, one would write “people with disabilities.” The difference may seem negligible, but think about it: Which should we prioritize–one’s humanity or one’s diagnosis, which is but one facet of that humanity? In the article, where Davis mentions “someone with a disability,” the reporter refers to “images of disabled people.” In the next paragraph, Davis brings up “a person with a disability,” shortly after which the reporter mentions “photos of disabled people.”

Not every individual or organization agrees with using people-first language. For example, many in the Blind, Deaf, and Autistic communities equate it with feelings of shame and/or attempts to distance the person from the disability. I respect these views and realize that it is up to people to define themselves. Because Davis used people-first language, though, the reporter should have followed suit.

Yesterday my Facebook news feed revealed a meme comparing Hillary Clinton to Adolf Hitler. (I will not dignify the meme by linking to it.) Suffice it to say that I told the poster what I thought.

That an American politician could do anything to approximate the horrors perpetrated by Hitler is highly unlikely, to say the least. More insidious than the suggestion itself is the trivialization of Nazi atrocities that such statements involve.

These comparisons go beyond simple hyperbole. The problem is neither the logical fallacy nor the lack of imagination needed to commit it; rather, it is the aforementioned trivialization. If refusing to bake a cake for a couple is as bad as a Nazi policy that contributed to the deaths of millions of people, then it follows that Nazi policies that contributed to the deaths of millions of people were no worse than refusing to bake a cake. The dwindling number of Shoah survivors and World War II veterans distances us from the atrocities they endured and witnessed. We must keep their memories alive.

Several years ago, one of my students thought it would be funny to sieg heil me. (A classmate, to his credit, tried to stop him.) The young man in question was not evil, or even bad. He was immature, and he just didn’t get it. His dad got it; I could feel the man’s mortification over the phone when I told him what had happened. We are surrounded by words and phrases like “feminazi” and “Grammar Nazi” that dilute the reality of the Shoah, though, so it is not surprising that a teenager would not understand the gravity of the situation. As adults, we must set a good example by not taking the easy way out with shabby rhetoric, and we must set others on the correct path when they stray. When one of my students–or colleagues–tosses around the word “Nazi”–or “slavery,” or “rape”–I explain the idea and danger of trivialization and have the student find a less damaging and more accurate word or phrase. I owe it to those whose voices have been lost.

Yes, I know the pun in this post’s title is horrible, but it will have to do until I can think of something better. My concern at the moment is grammar and usage. I have recently come across posts and articles about grammar that, in the “Comments” section, spiral into arguments about whether certain rules are actual rules and even whether such a thing as a rule exists in the realm of language. In various contexts, both print and digital, laypeople and linguists such as Columbia University’s John McWhorter discuss whether these rules (or “rules”) are fetters holding us to a version of English that never was or worn gates preventing us from tumbling into barbarism.

Of course, I see both sides’ points. I’m aware that:

1. Language evolves, and ultimately there is nothing we can do to stop it.

2. Many of our rules came about in the eighteenth century, when folks set out to standardize English, sometimes using Latin as their guide even though Latin is not only not English but not even Germanic.

3. Because of #2, Shakespeare’s plays commit all sorts of atrocities that we would not overlook in our students’ writing.

It is these students that I often think about when considering my views of English’s rules. I never correct my students’ speech (unless they try to instigate the “Let’s find fault with each other’s spoken English” game, which I win quickly so that we can get back to the task at hand). I do, however, teach on the conservative end of the grammar spectrum. For example, I do not accept “they” as a singular pronoun, and I differentiate between “lay” and “lie.” I do this not to strike a blow for civilization but rather to prepare my students for whatever professors will cross their paths next year. I see it as a question of dialect, and I explain that to the students. There is a time and a place for everything, and just as we naturally code-switch in daily life, so must we do so when writing in academia. Yes, the dialect of the Academy might be that of old (or dead) white men, but if they want to navigate the Academy, they must learn the lingo.

First off, I can’t believe that it’s been so long since I’ve posted here. My intent was to post at least once a week, even once school started, but clearly I’ve fallen short. I can’t change the past, of course, but I can recommit myself to my writing, including my blogging. Here it goes:

A couple of days ago, I discussed the Tower of Babel with some of my classes. As I was preparing the lesson, I wondered whether our word “babble” derives from “Babel.” I think I always assumed it did, but I didn’t know for sure. I was curious, and I also figured the students would ask me about the connection, so I quickly looked up the words’ etymologies in a nearby dictionary. As it turns out, the words are not related. “Babel” comes from the Akkadian babilu (“gate of God”), whereas “babble” derives from the Middle English babeln (“to chatter”). According to the Online Etymology Dictionary, babeln is related to the Swedish babbla and the Old French babillier.

This incident, which took only a few minutes before class, makes me think of false etymologies and notions of Edenic language. I plan to explore both ideas in future posts.

I just finished reading Arika Okrent’s 2009 In the Land of Invented Languages. (Continually grading papers keeps me one step behind in my reading.) Okrent is a linguist who has studied invented languages and the people who invented them. She discusses Esperanto and Klingon, as well as lesser-known languages, and she explores the inventors’ motivations and the reasons behind each language’s success or lack thereof. Her conversational tone and respect for the languages’ creators and learners kept me reading well into the night.