Arrant Pedantry

15% Off All T-Shirts

First, I apologize for not blogging in so long. It’s been a crazy summer, complete with a new baby (yay!), a new job (yay!), and moving to a new house (boo!). I’ve got a few posts in the works and hope to have something done soon.

Second, it’s time for another sale! Now through September 2, get 15 percent off all T-shirts in the Arrant Pedantry Store. Just use the code SHIRTS15 at checkout.

Do Usage Debates Make You Nauseous?

Several days ago, the Twitter account for the Chicago Manual of Style tweeted, “If you’re feeling sick, use nauseated rather than nauseous. Despite common usage, whatever is nauseous induces nausea.” The relevant entry in Chicago reads,

Whatever is nauseous induces a feeling of nausea—it makes us feel sick to our stomachs. To feel sick is to be nauseated. The use of nauseous to mean nauseated may be too common to be called error anymore, but strictly speaking it is poor usage. Because of the ambiguity of nauseous, the wisest course may be to stick to the participial adjectives nauseated and nauseating.

Though it seems like a straightforward usage tip, it’s based on some dubious motives and one rather strange assumption about language. It’s true that nauseous once meant causing nausea and that it has more recently acquired the sense of having nausea, but causing nausea wasn’t even the word’s original meaning in English. The word was first recorded in the early 17th century in the sense of inclined to nausea or squeamish. So you were nauseous not if you felt sick at the moment but if you had a sensitive stomach. This sense became obsolete in the late 17th century, supplanted by the causing nausea sense. The latter sense is the one that purists cling to, but it too is going obsolete.

I searched for nauseous in the Corpus of Contemporary American English and looked at the first 100 hits. Of those 100 hits, only one was used in the sense of causing nausea: “the nauseous tints and tinges of corruption.” The rest were all clearly used in the sense of having nausea—“I was nauseous” and “it might make you feel a little nauseous” and so on. Context is key: when nauseous is used with people, it means that they feel sick, but when it’s used with things, it means they’re sickening. And anyway, if nauseous is ambiguous, then every word with multiple meanings is ambiguous, including the word word, which has eleven main definitions as a noun in Merriam-Webster’s Collegiate. So where’s this ambiguity that Chicago warns of?

The answer is that there really isn’t any. In this case it’s nothing more than a red herring. Perhaps it’s possible to concoct a sentence that, lacking sufficient context, is truly ambiguous. But the corpus search shows that it just isn’t a problem, and thus fear of ambiguity can’t be the real reason for avoiding nauseous. Warnings of ambiguity are often used not to call attention to a real problem but to signal that a word has at least two senses or uses and that the author does not like one of them. Bryan Garner (the author of the above entry from Chicago), in his Modern American Usage, frequently warns of such “skunked” words and usually recommends avoiding them altogether. This may sound like sensible advice, but it seems to me to be motivated by a kind of jealousy—if the word can’t mean what the advice-giver wants it to mean, then no one can use it.
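Incidentally, the context test described above is easy to mechanize. Here’s a toy Python sketch that tags a hit as the “having nausea” sense whenever nauseous appears with a personal subject or a form of feel; the lines are illustrative stand-ins echoing the hits quoted above, since COCA itself isn’t freely downloadable.

import re

# Illustrative stand-ins for real corpus hits, not actual COCA data
hits = [
    "I was nauseous",
    "it might make you feel a little nauseous",
    "the nauseous tints and tinges of corruption",
]

# A personal subject or a form of "feel" signals the "having nausea" sense
PERSONAL = re.compile(r"\b(i|you|he|she|we|they|feel\w*|felt)\b", re.IGNORECASE)

having_nausea = [h for h in hits if PERSONAL.search(h)]
print(f"{len(having_nausea)} of {len(hits)} hits read as 'having nausea'")

On these three lines it prints 2 of 3, classifying the quoted examples the same way a human reader would.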

But the truly strange assumption is that words have meaning that is somehow independent of their usage. If 99 percent of the population uses nauseous in the sense of having nausea, then who’s to say that they’re wrong? Who has the authority to declare this sense “poor usage”? And yet Garner says, rather unequivocally, “Whatever is nauseous induces a feeling of nausea.” How does he know this is what nauseous means? It’s not as if there is some platonic form of words, some objective true meaning from which a word must never stray. After all, language changes, and an earlier form is not necessarily better or truer than a newer one. As Merriam-Webster editor Kory Stamper recently pointed out on Twitter, stew once meant “whorehouse”, and this sense dates to the 1300s. The food sense arose four hundred years later, in the 1700s. Is this poor usage because it’s a relative upstart supplanting an older established sense? Of course not.

People stopped using nauseous to mean “inclined to nausea” several hundred years ago, and so it no longer means that. Similarly, most people no longer use nauseous to mean “causing nausea”, and so that meaning is waning. In another hundred years, it may be gone altogether. For now, it hangs on, but this doesn’t mean that the newer and overwhelmingly more common sense is poor usage. The new sense is only poor usage inasmuch as someone says it is. In other words, it all comes down to someone’s opinion. As I’ve said before, pronouncements on usage that are based simply on someone’s opinion are ultimately unreliable, and any standard that doesn’t take into account near-universal usage by educated speakers in edited writing is doomed to irrelevance.

So go ahead and use nauseous. The “having nausea” sense is now thoroughly established, and it seems silly to avoid a perfectly good word just because a few peevers dislike it. Even if you stick to the more traditional “causing nausea” sense, you’re unlikely to confuse anyone, because context will make the meaning clear. Just be careful about people who make unsupported claims about language.

Celebrate T-Shirt Day with 15% Off

T-Shirt Day is June 21st, and in preparation for the big day, Spreadshirt is offering 15 percent off all T-shirts when you use the coupon code MYSHIRT2014 between now and June 10th. If you met me at the annual conferences of the American Copy Editors Society and liked my shirts, now’s a good chance to get one for yourself. Go check out what’s available in the Arrant Pedantry Store.

And if you’re not the word-nerd-T-shirt-buying type, don’t worry—a new post is coming soon.

Mother’s Day

Today is officially Mother’s Day, and as with other holidays with possessive or plural endings, there’s a lot of confusion about what the correct form of the name is. The creator of Mother’s Day in the United States, Anna Jarvis, specifically stated that it should be a singular possessive to focus on individual mothers rather than mothers in general. But as sociolinguist Matt Gordon noted on Twitter, “that logic is quite peccable”; though it’s a nice sentiment, it’s grammatical nonsense.

English has a singular possessive and a plural possessive; it does not have a technically-plural-but-focusing-on-the-singular possessive. Though Jarvis may have wanted everyone to focus on their respective mothers, the fact is that the day still celebrates all mothers. If I told you that tomorrow was Jonathon’s Day, you’d assume that it’s my day, not that it’s a day for all Jonathons who happen to celebrate separately. That’s simply not how grammatical number works in English. If you have more than one thing, it’s plural, even if you’re considering those things individually.

This isn’t the only holiday that employs some grammatically suspect reasoning in its official spelling—Veterans Day officially has no apostrophe because the day doesn’t technically belong to veterans. But this is silly—apostrophes are used for lots of things beyond simple ownership.

It could be worse, though. The US Board on Geographic Names discourages possessives altogether, though it allows the possessive s without an apostrophe. The peak named for Pike is Pikes Peak, which is worse than grammatical nonsense—it’s an officially enshrined error. The worst part is that there isn’t even a reason given for this policy, though presumably it’s because the board doesn’t want to indicate private ownership of geographical features. (Again, the apostrophe doesn’t necessarily show ownership.) But in this case you can’t even argue that Pikes is a plural attributive noun, because there’s only one Pike for whom the peak is named.

The sad truth is that the people in charge of deciding where or whether to put apostrophes in things don’t always have the best grasp of grammar, and they don’t always think to consult someone who does. But even if the grammar of Mother’s Day makes me roll my eyes, I can still appreciate the sentiment. In the end, arguing about the placement of an apostrophe is a quibble. What matters most is what the day really means. And this day is for you, Mom.

Over Has Always Meant More Than. Get Over It.

Last month, at the yearly conference of the American Copy Editors Society, the editors of the AP Stylebook announced that over in the sense of more than was now acceptable. For decades, newspaper copy editors had been changing constructions like over three hundred people to more than three hundred people; now, with a word from AP’s top editors, that rule was being abandoned.

According to Merriam-Webster editor Peter Sokolowski, who was in attendance, the announcement was met with gasps. Editors quickly took to Twitter and to blogs to express their approval or dismay. Some saw it as part of the dumbing-down of the language or as a tacit admission that newspapers no longer have the resources to maintain their standards. Others saw it as the banishment of a baseless superstition that has wasted copy editors’ time without improving the text.

The argument had been that over must refer to spatial relationships and that numerical relationships must use more than. But nobody objects to other figurative uses of over, such as over the weekend or get over it or in over your head or what’s come over you? The rule forbidding the use of over to mean more than was first codified in the 1800s, but over can be found in this sense going back a thousand years or more, in some of the earliest documents written in English.

Not only that, but parallel uses can be found in other Germanic languages, including German, Dutch, and Swedish. (Despite all its borrowings from French, Latin, and elsewhere, English is considered a Germanic language.) There’s nothing wrong with the German Kinder über 14 Jahre (children over 14 years) (to borrow an example from the Collins German-English Dictionary) or the Swedish Över femhundra kom (more than five hundred came). This means that this use of over actually predates English and must have been inherited from the common ancestor of all the Germanic languages, Proto-Germanic, some two thousand years ago.

Mignon Fogarty, aka Grammar Girl, wrote that “no rationale exists for the ‘over can’t mean more than’ rule.” And in a post on the Merriam-Webster Unabridged blog, Sokolowski gave his own debunking, concluding that “we just don’t need artificial rules that do not promote the goal of clarity.” But none of this was good enough for some people. AP’s announcement caused a rift in the editing staff at Mashable, who debated the rule on the lifestyle blog.

Alex Hazlett argued that the rule “was an arbitrary style decision that had nothing to do with grammar, defensible only by that rationale of last resort: tradition.” Megan Hess, though, took an emotional and hyperbolic tack, claiming that following rules like this prevents the world from slipping into “a Lord of the Flies-esque dystopia.” From there her argument quickly becomes circular: “The distinction is one that distinguishes clean, precise language and attention to detail — and serves as a hallmark of a proper journalism training.” In other words, editors should follow the rule because they’ve been trained to follow the rule, and the rule is simply a mark of clean copy. And how do you know the copy is clean? Because it follows rules like this. As Sokolowski says, this is nothing more than a shibboleth—the distinction serves no purpose other than to distinguish those in the know from everyone else.

It’s also a perfect example of a mumpsimus. The story goes that an illiterate priest in the Middle Ages had learned to recite the Latin Eucharist wrong: instead of sumpsimus (Latin for “we have taken”), he said mumpsimus, which is not a Latin word at all. When someone finally told him that he’d been saying it wrong and that it should be sumpsimus, he responded that he would not trade his old mumpsimus for this person’s new sumpsimus. He didn’t just refuse to change—he refused to recognize that he was wrong and had always been wrong.

But so what if everyone’s been using over this way for longer than the English language has existed? Just because everyone does it doesn’t mean it’s right, right? Well, technically, yes, but let’s flip the question around: what makes it wrong to use over to mean more than? The fact that the over-haters have had such an emotional reaction is telling. It’s surprisingly easy to talk yourself into hating a particular word or phrase and to start judging everyone who allegedly misuses it. And once you’ve developed a visceral reaction to a perceived misuse, it’s hard to be persuaded that your feelings aren’t justified.

We editors take a lot of pride in our attention to language—which usually means our attention to the usage and grammar rules that we’ve been taught—so it can seem like a personal affront to be told that we were wrong and have always been wrong. Not only that, but it can shake our faith in other rules. If we were wrong about this, what else might we have been wrong about? But perhaps rather than priding ourselves on following the rules, we should pride ourselves on mastering them, which means learning how to tell the good rules from the bad.

Learning that you were wrong simply means that now you’re right, and that can only be a good thing.

Book Review: Schottenfreude

German is famous for its compound words. While languages like English are content to use whole phrases to express an idea, German can efficiently pack the same idea into a single word, like Schadenfreude, which means a feeling of joy from watching or hearing of someone else’s miseries. Well, in Schottenfreude: German Words for the Human Condition, Ben Schott has decided to expand on German’s compounding ability and create words that should exist.

Every right-hand page lists three made-up German compounds, along with their pronunciation, their English translation, and a more literal gloss. On the facing left-hand pages are explanatory notes discussing the concepts in more depth. For example, the first word is Herbstlaubtrittvergnügen (autumn-foliage-strike-fun), meaning “kicking through piles of autumn leaves”. The explanatory notes talk about self-reported rewarding events and the metaphorical connection between fallen leaves and human souls in literature.

The rest of the book proceeds much the same way, with funny and surprising insights into the insecurities, frailties, and joys of human life. Who hasn’t at some time or another experienced Deppenfahrerbeäugung (“the urge to turn and glare at a bad driver you’ve just overtaken”), Sommerferienewigkeitsgefühl (“childhood sensation that the summer vacation will last forever”), or Gesprächsgemetzel (“moments when, for no good reason, a conversation suddenly goes awry”)?

You don’t have to be a German speaker to appreciate this book, but it certainly helps. There are a few puns that you can only appreciate if you have a knowledge of both English and German, such as Besserwinzer (“one of those people who pretend to know more about wine than they do”), which is a play on Besserwisser, meaning “know-it-all”, and Götzengeschwätz (“praying to a god you don’t believe in”), which literally means “idol chatter”. And knowing German will certainly help you pronounce the words better; I found the provided pronunciations somewhat unintuitive, and there’s no key. The words also don’t seem to be in any particular order, so it can be a little difficult to find one again, even though there is an index.

Overall, though, it’s a thoroughly enjoyable little book, great for flipping through when you have a few idle minutes. Word lovers—and especially German lovers—are sure to find a lot of treasures inside.

Full disclosure: I received a free review copy of this book from the publisher. My apologies to the author and publisher for the lateness of this review.

Why Teach Grammar?

Today is National Grammar Day, and I’ve been thinking a lot lately about what grammar is and why we study it. Last week in the Atlantic, Michelle Navarre Cleary wrote that we should do away with diagramming sentences and other explicit grammar instruction. Her argument, in a nutshell, is that grammar instruction not only doesn’t help students write better, but it actually teaches them to hate writing.

It’s really no surprise—as an editor and a student of language, I’ve run into a lot of people who never learned the difference between a preposition and a participle and are insecure about their writing or their speech. I once had a friend who was apparently afraid to talk to me because she thought I was silently correcting everything she said. When I found out about it, I reassured her that I wasn’t; not only had I never noticed anything wrong with the way she talked, but I don’t worry about correcting people unless they’re paying me for it. But I worried that this was how people saw me: a know-it-all jerk who silently judged everyone else for their errors. I love language, and it saddened me to think that there are people who find it not fascinating but frustrating.

But given the state of grammar instruction in the United States today, it’s not hard to see why a lot of people feel this way. I learned hardly any sentence diagramming until I got to college, and my public school education in grammar effectively stopped in eighth or ninth grade when I learned what a prepositional phrase was. In high school, our grammar work consisted of taking sentences like “He went to the store” and changing them to “Bob went to the store” (because you can’t use he without an antecedent; never mind that such a sentence would not occur in isolation and would surely make sense in context).

Meanwhile, many students are marked down on their papers for supposed grammar mistakes (which are usually matters of spelling, punctuation, or style): don’t use contractions, don’t start sentences with conjunctions, don’t use any form of the verb be, don’t write in the first person, don’t refer to yourself in the third person, don’t use the passive voice, and on and on. Of course most students are going to come out of writing class feeling insecure. They’re punished for failing to master rules that don’t make sense.

And it doesn’t help that there’s often a disconnect between what the rules say good writing is and what it actually is. Good writing breaks these rules all the time, and following all the rules does little if anything to make bad writing good. We know the usual justifications: students have to master the basics before they can become experts, and once they become experts, they’ll know when it’s okay to break the rules.

But these justifications presuppose that teaching students not to start a sentence with a conjunction or not to use the passive voice has something to do with good writing, when it simply doesn’t. I’ve said before that we don’t consider whether we’re giving students training wheels or just putting sticks in their spokes. Interestingly, Cleary uses a similar argument in her Atlantic piece: “Just as we teach children how to ride bikes by putting them on a bicycle, we need to teach students how to write grammatically by letting them write.”

I’m still not convinced, though, that learning grammar has much at all to do with learning to write. Having a PhD in linguistics doesn’t mean you know how to write well, and being an expert writer doesn’t mean you know anything about syntax and morphology beyond your own native intuition. And focusing on grammar instruction may distract from the more fundamental writing issues of rhetoric and composition. So why worry about grammar at all if it has nothing to do with good writing? Language Log’s Mark Liberman said it well:

We don’t put chemistry into the school curriculum because it will make students better cooks, or even because it might make them better doctors, much less because we need a relatively small number of professional chemists. We believe (I hope) that a basic understanding of atoms and molecules is knowledge that every citizen of the modern world should have.

It may seem like a weak defense in a world that increasingly focuses on marketable skills, but it’s maybe the best justification we have. Language is amazing; no other animal has the capacity for expression that we do. Language is so much more than a grab-bag of peeves and strictures to inflict on freshman writing students; it’s a fundamental part of who we are as a species. Shouldn’t we expect an educated person to know something about it?

So yes, I think we should teach grammar, not because it will help people write better, but simply because it’s interesting and worth knowing about. But we need to recognize that it doesn’t belong in the same class as writing or literature; though it certainly has connections to both, linguistics is a separate field and should be treated as such. And we need to teach grammar not as something to hate or even as something to learn as a means to an end, but as a fascinating and complex system to be discovered and explored for its own sake. In short, we need to teach grammar as something to love.

Lynne Truss and Chicken Little

Lynne Truss, author of the bestselling Eats, Shoots & Leaves: The Zero Tolerance Approach to Punctuation, is at it again, crying with her characteristic hyperbole and lack of perspective that the linguistic sky is falling because she got a minor bump on the head.

As usual, Truss hides behind the it’s-just-a-joke-but-no-seriously defense. She starts by claiming to have “an especially trivial linguistic point to make” but then claims that the English language is doomed, and it’s all linguists’ fault. According to Truss, linguists have sat back and watched while literacy levels have declined—and have profited from doing so.

What exactly is the problem this time? That some people mistakenly write some phrases as compound words when they’re not, such as maybe for may be or anyday for any day. (This isn’t even entirely true; anyday is almost nonexistent in print, even in American English, according to Google Ngram Viewer.) I guess from anyday it’s a short, slippery slope to complete language chaos, and then “we might as well all go off and kill ourselves.”
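If you want to check this sort of claim yourself, the Ngram Viewer can be queried programmatically. A minimal Python sketch, assuming the unofficial JSON endpoint that backs the viewer (it’s undocumented, so parameter and corpus names may change):

import requests

resp = requests.get(
    "https://books.google.com/ngrams/json",
    params={
        "content": "anyday,any day",
        "year_start": 1900,
        "year_end": 2019,
        "corpus": "en-US-2019",  # American English; corpus names may change
        "smoothing": 3,
    },
    timeout=30,
)
resp.raise_for_status()

for series in resp.json():
    # Each series has an "ngram" label and a "timeseries" of yearly frequencies
    print(series["ngram"], max(series["timeseries"]))

Comparing the peak frequencies of the two series makes the disparity plain: anyday barely registers against any day.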

But it’s not clear what her complaint about erroneous compound words has to do with literacy levels. If the only problem with literacy is that some people write maybe when they mean may be, then it seems to be, as she originally says, an especially trivial point. Yes, some people deviate from standard orthography. While this may be irritating and may occasionally cause confusion, it’s not really an indication that people don’t know how to read or write. Even educated people make mistakes, and this has always been the case. It’s not a sign of impending doom.

But let’s consider the analogies she chose to illustrate linguists’ supposed negligence. She says that we’re like epidemiologists who simply catalog all the ways in which people die from diseases or like architects who make notes while buildings collapse. (Interestingly, she makes two remarks about how well paid linguists are. Of course, professors don’t actually make that much, especially those in the humanities or social sciences. And it smacks of hypocrisy from someone whose book has sold 3 million copies.)

Perhaps there is a minor crisis in literacy, at least in the UK. This article says that 16–24-year-olds in the UK are lagging behind many of their counterparts in other first-world countries. (The headline suggests that they’re trailing the entire world, but the study only looked at select countries in Europe and East Asia.) Wikipedia, however, says that the UK has a 99 percent literacy rate. Maybe young people are slipping a bit, and this is certainly something that educators should address, but it doesn’t appear that countless people are dying from an epidemic of slightly declining literacy rates or that our linguistic structures are collapsing. This is simply not the linguistic apocalypse that Truss makes it out to be.

Anyway, even if it were, why would it be linguists’ job to do something about it? Literacy is taught in primary and secondary school and is usually the responsibility of reading, language arts, or English teachers—not linguists. Why not criticize English professors for sitting back and collecting fat paychecks for writing about literary theory while our kids struggle to read? Because they’re not her ideological enemy, that’s why. Linguists often oppose language pedants like Truss, and so Truss finds some reason—contrived though it may be—to blame them. Though some applied linguists do in fact study things like language acquisition and literacy, most linguists hew to the more abstract and theoretical side of language—syntax, morphology, phonology, and so on. Blaming descriptive linguists for children’s illiteracy is like blaming physicists for children’s inability to ride bikes.

And maybe the real reason why linguists are unconcerned about the upcoming linguistic apocalypse is that there simply isn’t one. Maybe linguists are like meteorologists who observe that, contrary to the claims of some individuals, the sky is not actually falling. In studying the structure of other languages and the ways in which languages change, linguists have realized that language change is not decay. Consider the opening lines from Beowulf, an Old English epic poem over a thousand years old:

HWÆT, WE GAR-DEna in geardagum,
þeodcyninga þrym gefrunon,
hu ða æþelingas ellen fremedon!

Only two words are instantly recognizable to modern English speakers: we and in. The changes from Old English to modern English haven’t made the language better or worse—just different. Some people maintain that they understand that language changes but say that they still oppose certain changes that seem to come from ignorance or laziness. They fear that if we’re not vigilant in opposing such changes, we’ll lose our ability to communicate. But the truth is that most of those changes from Old English to modern English also came from ignorance or laziness, and we seem to communicate just fine today.

Languages can change very radically over time, but contrary to popular belief, they never devolve into caveman grunting. This is because we all have an interest in both understanding and being understood, and we’re flexible enough to adapt to changes that happen within our lifetime. And with language, as opposed to morality or ethics, there is no inherent right or wrong. Correct language is, in a nutshell, what its users consider to be correct for a given time, place, and audience. One generation’s ignorant change is sometimes the next generation’s proper grammar.

It’s no surprise that Truss fundamentally misunderstands what linguists and lexicographers do. She even admits that she was “seriously unqualified” for linguistic debate a few years back, and it seems that nothing has changed. But that probably won’t stop her from continuing to prophesy the imminent destruction of the English language. Maybe Truss is less like Chicken Little and more like the boy who cried wolf, proclaiming disaster not because she actually sees one coming, but rather because she likes the attention.

The Pronunciation of Smaug

With the recent release of the new Hobbit movie, The Desolation of Smaug, a lot of people have been talking about the pronunciation of the titular dragon’s name. The inclination for English speakers is to pronounce it like smog, but Tolkien made clear in his appendixes to The Lord of the Rings that the combination au was pronounced /au/ (“ow”), as it is in German. A quick search on Twitter shows that a lot of people are perplexed or annoyed by the pronunciation, with some even declaring that they refuse to see the movie because of it. Movie critic Eric D. Snider joked, “I’m calling him ‘Smeowg’ now. Someone please Photoshop him to reflect the change, thanks.” I happily obliged.

[Image: the Photoshopped “Smeowg”, captioned “I can haz desolashun?”]

So what is it about the pronunciation of Smaug that makes people so crazy? Simply put, it doesn’t fit modern English phonology. Phonology is the pattern of sounds in language (or the study of those patterns), including things like syllable structure, word stress, and permissible sound combinations. In my undergraduate phonology class, my professor once gave us an exercise: think of all the consonants that can follow /au/, and give an example of each. The first several came easily, but we started to run out quickly: out, house (both as a noun with /s/ and as a verb with /z/), owl, mouth (both as a noun with /θ/ and as a verb with /ð/), down, couch, hour, and gouge. What these sounds all have in common is that they’re coronal consonants, or those made with the front of the tongue.

The coronal consonants in modern Standard English are /d/, /t/, /s/, /z/, /ʃ/ (as in shoe), /ʒ/ (as in measure), /tʃ/ (as in church), /dʒ/ (as in judge), /l/, /r/, and /n/. As far as I know, only two coronal consonants are missing from the list of consonants that can follow /au/—/ʃ/ and /ʒ/, the voiceless and voiced postalveolar fricatives. By contrast, /g/ is a dorsal consonant, pronounced with the back of the tongue. There are some nonstandard dialects (such as Cockney and African American English) that change /θ/ to /f/ and thus pronounce words like mouth as /mauf/, but in Standard English the pattern holds; there are no words with /aup/ or /aum/ or /auk/. (The only exception I know of, howf, is a rare Scottish word that was apparently borrowed from Dutch, and it could be argued that it appears rarely enough in Standard English that it shouldn’t be considered a part of it. It appears not at all in the Corpus of Contemporary American English and only once in the Corpus of Historical American English, but it’s in scare quotes. I only know it as an occasionally handy Scrabble word.)
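If you’d like to verify this pattern for yourself, it’s easy to survey a machine-readable pronouncing dictionary. Here’s a minimal Python sketch using NLTK’s copy of the CMU Pronouncing Dictionary (American English; this assumes you have nltk installed and have run nltk.download("cmudict") once):

from collections import Counter

from nltk.corpus import cmudict

# ARPAbet symbols for the coronal consonants listed above
CORONALS = {"T", "D", "S", "Z", "SH", "ZH", "CH", "JH", "TH", "DH", "L", "R", "N"}

followers = Counter()
for word, prons in cmudict.dict().items():
    for phones in prons:
        for i in range(len(phones) - 1):
            nxt = phones[i + 1]
            # AW0/AW1/AW2 spell the /au/ diphthong; consonant symbols carry no stress digit
            if phones[i].startswith("AW") and not nxt[-1].isdigit():
                followers[nxt] += 1

for phone, count in followers.most_common():
    print(phone, count, "coronal" if phone in CORONALS else "NON-CORONAL")

The counts should come out overwhelmingly coronal, and any stray non-coronal entries that turn up should mostly be proper names and borrowings, in keeping with the howf caveat above.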

And this isn’t simply a case like orange or silver, where nothing happens to rhyme with them. Through the accidents of history, the /aug/ combination simply does not occur in modern English. Before the Great Vowel Shift, Middle English /au/ turned into /ɔ:/ (as in caught today). (Note: the : symbol here denotes that a vowel is long.) During the Great Vowel Shift, /u:/ turned into a new /au/, but apparently this /u:/ never occurred before non-coronal consonants. This means that in Middle English, either /u/ lengthened before coronals or /u:/ shortened before non-coronals; I’m not sure which. But either way, it left us with the unusual pattern we see in English today.

What all this technical gibberish means is that, in the absence of a clear pronunciation guide, readers will assume that the “au” in Smaug is pronounced as it is in other English words, which today is almost always /ɔ:/ or /ɑ:/. Thus most Americans will rhyme it with smog. (I can’t speak with authority about other varieties of English, but they would probably opt for one of those vowels or something similar, but not the diphthong /au/.) It’s not surprising that many readers will feel annoyed when told that their pronunciation clashes with the official pronunciation, which they find unintuitive and, frankly, rather non-English.

One final note: Michael Martinez suggests in this post that /smaug/ is not actually Tolkien’s intended pronunciation. After all, he says, the appendixes are a guide to the pronunciation of Elvish, and Smaug’s name is not Elvish. Martinez quotes one of Tolkien’s letters regarding the origin of the name: “The dragon bears as name—a pseudonym—the past tense of the primitive Germanic verb Smugan, to squeeze through a hole: a low philological jest.” He seems to take this as evidence against the pronunciation /smaug/, but this is probably because Tolkien was not as clear as he could have been. Smugan is the infinitive form; the past tense is—surprise—smaug.

Note: the definition given for the Proto-Germanic form doesn’t quite match Tolkien’s, though it appears to be the same verb; the Old English form, also with the infinitive smugan, is defined as “to creep, crawl, move gradually”. The astute student of language will notice that the past tense of the verb in Old English had the form smēag in the first and third person. This is because the Proto-Germanic /au/ became /ēa/ in Old English and /i:/ or /ai/ in modern English; compare the German Auge ‘eye’ and the English eye. This demonstrates once again that English lost the combination /aug/ quite some time ago while its sister languages hung on to it.

So yes, it appears that Tolkien really did intend Smaug to be pronounced /smaug/, with that very un-English (but very Germanic) /aug/ combination at the end. He was a linguist and studied several languages in depth, particularly old Germanic languages such as Old English, Old Norse, and Gothic. He was certainly well aware of the pronunciation of the word, even if he didn’t make it clear to his readers. You can find the pronunciation silly if you want, you can hate it, and you can even threaten to boycott the movie, but you can’t call it wrong.

Now on Visual Thesaurus: “Electrocution: A Shocking Misuse?”

I have a new post up on Visual Thesaurus about the use, misuse, and history of the word electrocute. Some usage commentators today insist that it be used only to refer to death by electric shock; that is, you can’t say you’ve been electrocuted if you lived to tell the tale. But the history, unsurprisingly, is more complicated: there have been disputes about the word since its birth.

As always, the article is for subscribers only, but a subscription costs a paltry $2.95 a month or $19.95 a year (and would make a great gift for the word lover in your life). Check it out.