Arrant Pedantry

Sneak Peek: “There Are a Number of Agreement Problems”

Unless you’re a subscriber to the Copyediting newsletter, you don’t get the chance to read my “Grammar on the Edge” column. But now you can get a sneak peek of my most recent entry, “There Are a Number of Agreement Problems,” on Copyediting’s website.

You’ll still have to subscribe to get the whole thing, but maybe this will whet your appetite. (And a year’s subscription is only $79.) You’ll also get lots of great content from Erin Brenner, Mark Farrell, Katharine O’Moore-Klopf, and others. Check it out!

New Post on Visual Thesaurus: Less Usage Problems

I have a new post on Visual Thesaurus, and this one’s open to non-subscribers:

The distinction between less and fewer is one of the most popular rules in the peevers’ arsenal. It’s a staple of lists of grammar rules that everyone supposedly gets wrong, and sticklers have pressured stores into changing their signs from “10 items or less” to “10 items or fewer.” Students have it drilled into their heads that fewer is for things you can count while less is for things you can’t. But there’s a problem: the rule as it’s commonly taught is wrong, and it’s dulling our sense of what’s actually right.

Go here to read the rest.

Is the Oxford Comma Ungrammatical?

Few language issues inspire as much fervent debate as the question of whether you need a comma before the last item in a series, also known as the Oxford, Harvard, or serial comma. This is the comma that you sometimes see before and in lists, such as “I need you to go to the store and get bread, milk, and butter.” The Chicago Manual of Style, which is used by many book publishers and some academic journals, requires the comma. The AP Stylebook, which is used by newspapers and many magazines, omits the comma unless it’s necessary to avoid ambiguity.

It seems like such a trifling thing, yet it inspires impassioned debate among editors and writers on both sides of the issue. Last year the satirical news site the Onion joked about violence between the AP and Chicago gangs and wrote that “an innocent 35-year-old passerby who found himself caught up in a long-winded dispute over use of the serial, or Oxford, comma had died of a self-inflicted gunshot wound.”

But more recently, Walt Hickey at the FiveThirtyEight blog decided to approach the argument in a more scientific fashion, by polling readers on their preference. What he found was that readers are fairly split: 57 percent prefer the serial comma, while 43 percent dislike it. So while there’s a preference for the serial comma, at least among FiveThirtyEight readers, it’s not an overwhelming one.

Roy Peter Clark at the Poynter Institute followed up by arguing that journalists should adopt the serial comma, after which Sam Kirkland posted a poll on Poynter asking if the Associated Press should make the switch. Surprisingly, a whopping 71 percent of respondents said yes.

But none of this was good enough for Poynter blogger Andrew Beaujon. He took to Twitter to lay out his arguments against the serial comma. The first argument—that the serial comma is simply ungrammatical—is actually the easiest to refute. He says that since you can’t write “My wife, and I drove to work” (note the comma before the “and”), you can’t write “Bob, my wife, and I drove to work.” But he’s simply presupposing that there’s a rule that you can never have a comma before a conjunction in a list, which is obviously not true.

He’s also presupposing that lists of two items behave exactly like lists of three or more items, but this is also untrue. I can’t write “Bob, my wife drove to work” (at least not with the intended meaning), but I can certainly write “Bob, my wife and I drove to work.” By Beaujon’s logic, the fact that you add a third item doesn’t magically obviate the rule that you can’t use a comma to join two things together. (Of course, this is done all the time in headline style, but it’s not allowed in normal prose.) It’s clear that the structure of lists changes a bit when you have three items or more, but it is not clear that there is any grammatical rule forbidding commas.

His next point, that “the sentences people employ to show the need for a serial comma are usually ridiculous”, is weak at best. Yes, it’s true that most contrived example sentences are a little ridiculous, but that’s just a problem with contrived example sentences, not with the argument for the serial comma. And consider this example: “The highlights of his global tour include encounters with Nelson Mandela, an 800-year-old demigod and a dildo collector.” It may be a ridiculous sentence, but it’s real.

Beaujon’s final argument is that the serial comma arises from an urge to overpunctuate because we believe readers are too stupid to figure things out on their own. He says that “prescriptivism is not for [the readers'] benefit; its purpose is to make those of us in the publishing game to feel important and necessary”, but how is his own prescriptivism any different? He’s instructing writers and editors in comma rules and telling them that following his rule means they’re good writers. In other words, my writing is so clear that I don’t need a crutch like the serial comma; if you disagree, it’s only because you don’t trust readers and want to make yourself feel important.

But consider this: most people in both the FiveThirtyEight and Poynter polls prefer the serial comma. That means most readers prefer it. If they find it helpful, who are we to argue that it’s some sort of crutch of bad writers or source of job security for copy editors? And AP style does in fact use the serial comma to prevent ambiguity (though apparently not in the above example regarding the late Mr. Mandela), so what’s the harm in using it all the time?

Because the fact is that I often stumble over sentences that lack the serial comma. Even though I’m well aware that AP and other styles omit the comma before the “and”, I still tend to read “bacon and eggs” in “He made muffins, bacon and eggs” as a single item, not the final two items in a list. The sudden end of the list after “eggs” throws me because I was expecting something to follow it.

I suppose you could conclude that I’m an idiot who can’t make sense of writing on his own, but you could just as easily (and much more charitably) conclude that the serial comma really is helpful because it signals something about the structure of the sentence. In speech, we can rely on a speaker’s prosody—the rise and fall of pitch—to tell us where the syntactic units begin and end. In writing, we have to rely on punctuation marks to serve as signposts.

You can claim all you want that your writing is so clear that it can do without these signposts, but if you leave out too many, your readers may feel lost, wandering through meandering sentences without knowing where they’re going. Did your reader immediately understand what you wrote, or did they stumble, backtrack, and read it again before they got your message?

This isn’t to say that there’s only one right way to punctuate and that it’s to use the serial comma. As we saw from the polls above, opinion on its use is still fairly divided. But rather than accusing your opponents of distrusting readers or being self-aggrandizing, you could take them at their word. Maybe there really are legitimate reasons to prefer the serial comma, just as there are legitimate reasons to prefer omitting it.

I find the arguments in favor of including the serial comma stronger than the arguments in favor of leaving it out, but I don’t pretend that my preference is an ironclad grammatical law or proof of my superiority. It’s just that—a preference. You are free to choose for yourself.

15% Off All T-Shirts

First, I apologize for not blogging in so long. It’s been a crazy summer, complete with a new baby (yay!), a new job (yay!), and a move to a new house (boo!). I’ve got a few posts in the works and hope to have something done soon.

Second, it’s time for another sale! Now through September 2, get 15 percent off all T-shirts in the Arrant Pedantry Store. Just use the code SHIRTS15 at checkout.

Do Usage Debates Make You Nauseous?

Several days ago, the Twitter account for the Chicago Manual of Style tweeted, “If you’re feeling sick, use nauseated rather than nauseous. Despite common usage, whatever is nauseous induces nausea.” The relevant entry in Chicago reads,

Whatever is nauseous induces a feeling of nausea—it makes us feel sick to our stomachs. To feel sick is to be nauseated. The use of nauseous to mean nauseated may be too common to be called error anymore, but strictly speaking it is poor usage. Because of the ambiguity of nauseous, the wisest course may be to stick to the participial adjectives nauseated and nauseating.

Though it seems like a straightforward usage tip, it’s based on some dubious motives and one rather strange assumption about language. It’s true that nauseous once meant causing nausea and that it has more recently acquired the sense of having nausea, but causing nausea wasn’t even the word’s original meaning in English. The word was first recorded in the early 17th century in the sense of inclined to nausea or squeamish. So you were nauseous not if you felt sick at the moment but if you had a sensitive stomach. This sense became obsolete in the late 17th century, supplanted by the causing nausea sense. The latter sense is the one that purists cling to, but it too is going obsolete.

I searched for nauseous in the Corpus of Contemporary American English and looked at the first 100 hits. Of those 100 hits, only one was used in the sense of causing nausea: “the nauseous tints and tinges of corruption.” The rest were all clearly used in the sense of having nausea—“I was nauseous” and “it might make you feel a little nauseous” and so on. Context is key: when nauseous is used with people, it means that they feel sick, but when it’s used with things, it means they’re sickening. And anyway, if nauseous is ambiguous, then every word with multiple meanings is ambiguous, including the word word, which has eleven main definitions as a noun in Merriam-Webster’s Collegiate. So where’s this ambiguity that Chicago warns of?

The answer is that there really isn’t any. In this case it’s nothing more than a red herring. Perhaps it’s possible to concoct a sentence that, lacking sufficient context, is truly ambiguous. But the corpus search shows that it just isn’t a problem, and thus fear of ambiguity can’t be the real reason for avoiding nauseous. Warnings of ambiguity are often used not to call attention to a real problem but to signal that a word has at least two senses or uses and that the author does not like one of them. Bryan Garner (the author of the above entry from Chicago), in his Modern American Usage, frequently warns of such “skunked” words and usually recommends avoiding them altogether. This may sound like sensible advice, but it strikes me as motivated by a kind of jealousy—if the word can’t mean what the advice-giver wants it to mean, then no one can use it.

But the truly strange assumption is that words have meaning that is somehow independent of their usage. If 99 percent of the population uses nauseous in the sense of having nausea, then who’s to say that they’re wrong? Who has the authority to declare this sense “poor usage”? And yet Garner says, rather unequivocally, “Whatever is nauseous induces a feeling of nausea.” How does he know this is what nauseous means? It’s not as if there is some platonic form of words, some objective true meaning from which a word must never stray. After all, language changes, and an earlier form is not necessarily better or truer than a newer one. As Merriam-Webster editor Kory Stamper recently pointed out on Twitter, stew once meant “whorehouse”, and this sense dates to the 1300s. The food sense arose four hundred years later, in the 1700s. Is this poor usage because it’s a relative upstart supplanting an older established sense? Of course not.

People stopped using nauseous to mean “inclined to nausea” several hundred years ago, and so it no longer means that. Similarly, most people no longer use nauseous to mean “causing nausea”, and so that meaning is waning. In another hundred years, it may be gone altogether. For now, it hangs on, but this doesn’t mean that the newer and overwhelmingly more common sense is poor usage. The new sense is only poor usage inasmuch as someone says it is. In other words, it all comes down to someone’s opinion. As I’ve said before, pronouncements on usage that are based simply on someone’s opinion are ultimately unreliable, and any standard that doesn’t take into account near-universal usage by educated speakers in edited writing is doomed to irrelevance.

So go ahead and use nauseous. The “having nausea” sense is now thoroughly established, and it seems silly to avoid a perfectly good word just because a few peevers dislike it. Even if you stick to the more traditional “causing nausea” sense, you’re unlikely to confuse anyone, because context will make the meaning clear. Just be careful about people who make unsupported claims about language.

Celebrate T-Shirt Day with 15% Off

T-Shirt Day is June 21st, and in preparation for the big day, Spreadshirt is offering 15 percent off all T-shirts when you use the coupon code MYSHIRT2014 between now and June 10th. If you met me at the annual conferences of the American Copy Editors Society and liked my shirts, now’s a good chance to get one for yourself. Go check out what’s available in the Arrant Pedantry Store.

And if you’re not the word-nerd-T-shirt-buying type, don’t worry—a new post is coming soon.

Mother’s Day

Today is officially Mother’s Day, and as with other holidays with possessive or plural endings, there’s a lot of confusion about what the correct form of the name is. The creator of Mother’s Day in the United States, Anna Jarvis, specifically stated that it should be a singular possessive to focus on individual mothers rather than mothers in general. But as sociolinguist Matt Gordon noted on Twitter, “that logic is quite peccable”; though it’s a nice sentiment, it’s grammatical nonsense.

English has a singular possessive and a plural possessive; it does not have a technically-plural-but-focusing-on-the-singular possessive. Though Jarvis may have wanted everyone to focus on their respective mothers, the fact is that the day still celebrates all mothers. If I told you that tomorrow was Jonathon’s Day, you’d assume that it’s my day, not that it’s a day for all Jonathons who just happen to be celebrating separately. That’s simply not how grammatical number works in English. If you have more than one thing, it’s plural, even if you’re considering those things individually.

This isn’t the only holiday that employs some grammatically suspect reasoning in its official spelling—Veterans Day officially has no apostrophe because the day doesn’t technically belong to veterans. But this is silly—apostrophes are used for lots of things beyond simple ownership.

It could be worse, though. The US Board on Geographic Names discourages possessives altogether, though it allows the possessive s without an apostrophe. The peak named for Pike is Pikes Peak, which is worse than grammatical nonsense—it’s an officially enshrined error. The worst part is that there isn’t even a reason given for this policy, though presumably it’s because they don’t want to indicate private ownership of geographical features. (Again, the apostrophe doesn’t necessarily show ownership.) But in this case you can’t even argue that Pikes is a plural attributive noun, because there’s only one Pike for whom the peak is named.

The sad truth is that the people in charge of deciding where or whether to put apostrophes in things don’t always have the best grasp of grammar, and they don’t always think to consult someone who does. But even if the grammar of Mother’s Day makes me roll my eyes, I can still appreciate the sentiment. In the end, arguing about the placement of an apostrophe is a quibble. What matters most is what the day really means. And this day is for you, Mom.

Over Has Always Meant More Than. Get Over It.

Last month, at the yearly conference of the American Copy Editors Society, the editors of the AP Stylebook announced that over in the sense of more than was now acceptable. For decades, newspaper copy editors had been changing constructions like over three hundred people to more than three hundred people; now, with a word from AP’s top editors, that rule was being abandoned.

According to Merriam-Webster editor Peter Sokolowski, who was in attendance, the announcement was met with gasps. Editors quickly took to Twitter and to blogs to express their approval or dismay. Some saw it as part of the dumbing-down of the language or as a tacit admission that newspapers no longer have the resources to maintain their standards. Others saw it as the banishment of a baseless superstition that has wasted copy editors’ time without improving the text.

The argument had been that over must refer to spatial relationships and that numerical relationships must use more than. But nobody objects to other figurative uses of over, such as over the weekend or get over it or in over your head or what’s come over you? The rule forbidding the use of over to mean more than was first codified in the 1800s, but over can be found in this sense going back a thousand years or more, in some of the earliest documents written in English.

Not only that, but parallel uses can be found in other Germanic languages, including German, Dutch, and Swedish. (Despite all its borrowings from French, Latin, and elsewhere, English is considered a Germanic language.) There’s nothing wrong with the German Kinder über 14 Jahre (“children over 14 years”, to borrow an example from the Collins German-English Dictionary) or the Swedish Över femhundra kom (“more than five hundred came”). This means that this use of over actually predates English and must have been inherited from the common ancestor of all the Germanic languages, Proto-Germanic, spoken some two thousand years ago.

Mignon Fogarty, aka Grammar Girl, wrote that “no rationale exists for the ‘over can’t mean more than’ rule.” And in a post on the Merriam-Webster Unabridged blog, Sokolowski gave his own debunking, concluding that “we just don’t need artificial rules that do not promote the goal of clarity.” But none of this was good enough for some people. AP’s announcement caused a rift in the editing staff at Mashable, who debated the rule on the lifestyle blog.

Alex Hazlett argued that the rule “was an arbitrary style decision that had nothing to do with grammar, defensible only by that rationale of last resort: tradition.” Megan Hess, though, took an emotional and hyperbolic tack, claiming that following rules like this prevents the world from slipping into “a Lord of the Flies-esque dystopia.” From there her argument quickly becomes circular: “The distinction is one that distinguishes clean, precise language and attention to detail — and serves as a hallmark of a proper journalism training.” In other words, editors should follow the rule because they’ve been trained to follow the rule, and the rule is simply a mark of clean copy. And how do you know the copy is clean? Because it follows rules like this. As Sokolowski says, this is nothing more than a shibboleth—the distinction serves no purpose other than to distinguish those in the know from everyone else.

It’s also a perfect example of a mumpsimus. The story goes that an illiterate priest in the Middle Ages had learned to recite the Latin Eucharist wrong: instead of sumpsimus (Latin for “we have taken”), he said mumpsimus, which is not a Latin word at all. When someone finally told him that he’d been saying it wrong and that it should be sumpsimus, he responded that he would not trade his old mumpsimus for this person’s new sumpsimus. He didn’t just refuse to change—he refused to recognize that he was wrong and had always been wrong.

But so what if everyone’s been using over this way for longer than the English language has existed? Just because everyone does it doesn’t mean it’s right, right? Well, technically, yes, but let’s flip the question around: what makes it wrong to use over to mean more than? The fact that the over-haters have had such an emotional reaction is telling. It’s surprisingly easy to talk yourself into hating a particular word or phrase and to start judging everyone who allegedly misuses it. And once you’ve developed a visceral reaction to a perceived misuse, it’s hard to be persuaded that your feelings aren’t justified.

We editors take a lot of pride in our attention to language—which usually means our attention to the usage and grammar rules that we’ve been taught—so it can seem like a personal affront to be told that we were wrong and have always been wrong. Not only that, but it can shake our faith in other rules. If we were wrong about this, what else might we have been wrong about? But perhaps rather than priding ourselves on following the rules, we should pride ourselves on mastering them, which means learning how to tell the good rules from the bad.

Learning that you were wrong simply means that now you’re right, and that can only be a good thing.

Book Review: Schottenfreude

German is famous for its compound words. While languages like English are content to use whole phrases to express an idea, German can efficiently pack the same idea into a single word, like Schadenfreude, which means a feeling of joy from watching or hearing of someone else’s miseries. Well, in Schottenfreude: German Words for the Human Condition, Ben Schott has decided to expand on German’s compounding ability and create words that should exist.

Every right-hand page lists three made-up German compounds, along with their pronunciation, their English translation, and a more literal gloss. On the facing left-hand pages are explanatory notes discussing the concepts in more depth. For example, the first word is Herbstlaubtrittvergnügen (autumn-foliage-strike-fun), meaning “kicking through piles of autumn leaves”. The explanatory notes talk about self-reported rewarding events and the metaphorical connection between fallen leaves and human souls in literature.

The rest of the book proceeds much the same way, with funny and surprising insights into the insecurities, frailties, and joys of human life. Who hasn’t at some time or another experienced Deppenfahrerbeäugung (“the urge to turn and glare at a bad driver you’ve just overtaken”), Sommerferienewigkeitsgefühl (“childhood sensation that the summer vacation will last forever”), or Gesprächsgemetzel (“moments when, for no good reason, a conversation suddenly goes awry”)?

You don’t have to be a German speaker to appreciate this book, but it certainly helps. There are a few puns that you can only appreciate if you have a knowledge of both English and German, such as Besserwinzer (“one of those people who pretend to know more about wine than they do”), which is a play on Besserwisser, meaning “know-it-all”, and Götzengeschwätz (“praying to a god you don’t believe in”), which literally means “idol chatter”. And knowing German will certainly help you pronounce the words better; I found the provided pronunciations somewhat unintuitive, and there’s no key. The words also don’t seem to be in any particular order, so it can be a little difficult to find one again, even though there is an index.

Overall, though, it’s a thoroughly enjoyable little book, great for flipping through when you have a few idle minutes. Word lovers—and especially German lovers—are sure to find a lot of treasures inside.

Full disclosure: I received a free review copy of this book from the publisher. My apologies to the author and publisher for the lateness of this review.

Why Teach Grammar?

Today is National Grammar Day, and I’ve been thinking a lot lately about what grammar is and why we study it. Last week in the Atlantic, Michelle Navarre Cleary wrote that we should do away with diagramming sentences and other explicit grammar instruction. Her argument, in a nutshell, is that grammar instruction not only doesn’t help students write better, but it actually teaches them to hate writing.

It’s really no surprise—as an editor and a student of language, I’ve run into a lot of people who never learned the difference between a preposition and a participle and are insecure about their writing or their speech. I once had a friend who was apparently afraid to talk to me because she thought I was silently correcting everything she said. When I found out about it, I reassured her that I wasn’t; not only had I never noticed anything wrong with the way she talked, but I don’t worry about correcting people unless they’re paying me for it. But I worried that this was how people saw me: a know-it-all jerk who silently judged everyone else for their errors. I love language, and it saddened me to think that there are people who find it not fascinating but frustrating.

But given the state of grammar instruction in the United States today, it’s not hard to see why a lot of people feel this way. I learned hardly any sentence diagramming until I got to college, and my public school education in grammar effectively stopped in eighth or ninth grade when I learned what a prepositional phrase was. In high school, our grammar work consisted of taking sentences like “He went to the store” and changing them to “Bob went to the store” (because you can’t use he without an antecedent; never mind that such a sentence would not occur in isolation and would surely make sense in context).

Meanwhile, many students are marked down on their papers for supposed grammar mistakes (which are usually matters of spelling, punctuation, or style): don’t use contractions, don’t start a sentence with a conjunction, don’t use any form of the verb be, don’t write in the first person, don’t refer to yourself in the third person, don’t use the passive voice, and on and on. Of course most students are going to come out of writing class feeling insecure. They’re punished for failing to master rules that don’t make sense.

And it doesn’t help that there’s often a disconnect between what the rules say good writing is and what it actually is. Good writing breaks these rules all the time, and following all the rules does little if anything to make bad writing good. We know the usual justifications: students have to master the basics before they can become experts, and once they become experts, they’ll know when it’s okay to break the rules.

But these justifications presuppose that teaching students not to start a sentence with a conjunction or not to use the passive voice has something to do with good writing, when it simply doesn’t. I’ve said before that we don’t consider whether we’re giving students training wheels or just putting sticks in their spokes. Interestingly, Cleary uses a similar argument in her Atlantic piece: “Just as we teach children how to ride bikes by putting them on a bicycle, we need to teach students how to write grammatically by letting them write.”

I’m still not convinced, though, that learning grammar has much at all to do with learning to write. Having a PhD in linguistics doesn’t mean you know how to write well, and being an expert writer doesn’t mean you know anything about syntax and morphology beyond your own native intuition. And focusing on grammar instruction may distract from the more fundamental writing issues of rhetoric and composition. So why worry about grammar at all if it has nothing to do with good writing? Language Log’s Mark Liberman said it well:

We don’t put chemistry into the school curriculum because it will make students better cooks, or even because it might make them better doctors, much less because we need a relatively small number of professional chemists. We believe (I hope) that a basic understanding of atoms and molecules is knowledge that every citizen of the modern world should have.

It may seem like a weak defense in a world that increasingly focuses on marketable skills, but it’s maybe the best justification we have. Language is amazing; no other animal has the capacity for expression that we do. Language is so much more than a grab-bag of peeves and strictures to inflict on freshman writing students; it’s a fundamental part of who we are as a species. Shouldn’t we expect an educated person to know something about it?

So yes, I think we should teach grammar, not because it will help people write better, but simply because it’s interesting and worth knowing about. But we need to recognize that it doesn’t belong in the same class as writing or literature; though it certainly has connections to both, linguistics is a separate field and should be treated as such. And we need to teach grammar not as something to hate or even as something to learn as a means to an end, but as a fascinating and complex system to be discovered and explored for its own sake. In short, we need to teach grammar as something to love.