Arrant Pedantry

Book Review: The Sense of Style

Full disclosure: I received an advance review copy of this book from the publisher, Viking.

I was intrigued when I first heard that Steven Pinker, the linguist and cognitive scientist, was writing a book on style. I’ve really enjoyed some of his other books, such as The Stuff of Thought, but wasn’t this the guy who had dedicated an entire chapter of The Language Instinct to bashing prescriptivists, calling them a bunch of “kibitzers and nudniks” who peddle “bits of folklore that originated for screwball reasons several hundred years ago”? But even though it can be satisfying to bash nonsensical grammar rules, I’ve long felt that linguists could offer valuable insight into the field of writing. I was hopeful that Pinker would have some interesting things to say about writing, and he didn’t disappoint me.

I should be clear, though, that this is not your ordinary book on writing advice. It isn’t a quick reference book full of rules and examples of what to do and what not to do (for which I recommend Joseph Williams’s excellent Style). It’s something deeper and more substantial than that—it’s a thorough examination of what makes good writing good and why writing well is so hard.

Pinker starts by reverse-engineering some of his favorite passages of prose, taking them apart piece by piece to see what makes them tick. It’s an interesting exercise, though it gets a little tedious at times. His point, however, is valuable: good writing can only come from good reading, which means not only reading a lot but engaging with what you read.

He then explores classic style, which he calls “an antidote for academese, bureaucratese, corporatese, legalese, officialese, and other kinds of stuffy prose.” Classic style starts with the assumption that the writer has seen something that they want to show to the reader, so the writer engages in a conversation with the reader to help direct their gaze. It’s not suitable for every kind of writing—for example, a user manual needs just a straightforward list of instructions, not a dialogue—but it works well for academic writing and other kinds of writing in which an author explains a new idea to the reader.

Then Pinker tackles perhaps the most difficult challenge in writing—overcoming the curse of knowledge. The cause of much bad writing, he says, is that the author is so close to the subject that they don’t know how to explain it to someone who doesn’t already know what the author knows. They forget how they came by their knowledge and thus unthinkingly skip key pieces of explanation or use jargon that is obscure or opaque to outsiders. And to make things worse, even being aware of the curse of knowledge isn’t enough to ensure that you’ll write more clearly; that is, you can’t simply tell someone, “Keep the reader in mind!” and expect them to do so. The best solution, Pinker says, is to have test readers or editors who can tell you where something doesn’t make sense and needs to be revised.

The next chapters provide a crash course on syntax and a guide to creating greater textual coherence, and though they occasionally get bogged down in technical details, they’re full of good advice. For example, Pinker uses syntax tree diagrams to illustrate both the cause of and solution to problems like misplaced modifiers. Tree diagrams are much more intuitive than other diagramming methods like Reed-Kellogg, so you don’t need to be an expert in linguistics to see the differences between two example sentences. And though the guide to syntax is helpful, the chapter on coherence is even better. Pinker explains why seemingly well-written text is sometimes so hard to understand: because even though the sentences appear to hang together just fine, the ideas don’t. The solution is to keep consistent thematic strings throughout a piece, tying ideas together and making the connections between them clear.

The last and by far the longest chapter—it occupies over a third of the book—is essentially a miniature grammar and usage guide prefaced by a primer on the supposed clash between prescriptivism and descriptivism. It’s simultaneously the most interesting and most disappointing chapter in the book. Though it starts rather admirably by explaining the linguistics behind particular usage issues (something I try to do on this blog), it ends with Pinker indulging in some peevery himself. Ironically, some of the usage rules he endorses are no more valid than the ones he debunks, and he gives little justification for his preferences, often simply stating that one form is classier. At least it’s clear that these are his personal preferences and not universal laws. Still, the bulk of the chapter is a lucid guide to some common grammar and usage issues. (And yes, he does get in a little prescriptivist bashing.)

Despite some occasional missteps, The Sense of Style is full of valuable advice and is a welcome addition to the genre of writing guides.

Interview at Grammarist

Forgive me if you’ve already seen this, but I was interviewed a couple of weeks ago at Grammarist.com. Find out what got me into language blogging, what my greatest accomplishment in the world of language is, and why you should care more about language. Check it out!

Sneak Peek: “There Are a Number of Agreement Problems”

Unless you’re a subscriber to the Copyediting newsletter, you don’t get the chance to read my “Grammar on the Edge” column. But now you can get a sneak peek of my most recent entry, “There Are a Number of Agreement Problems,” on Copyediting’s website.

You’ll still have to subscribe to get the whole thing, but maybe this will whet your appetite. (And a year’s subscription is only $79.) You’ll also get lots of great content from Erin Brenner, Mark Farrell, Katharine O’Moore-Klopf, and others. Check it out!

New Post on Visual Thesaurus: Less Usage Problems

I have a new post on Visual Thesaurus, and this one’s open to non-subscribers:

The distinction between less and fewer is one of the most popular rules in the peevers’ arsenal. It’s a staple of lists of grammar rules that everyone supposedly gets wrong, and sticklers have pressured stores into changing their signs from “10 items or less” to “10 items or fewer.” Students have it drilled into their heads that fewer is for things you can count while less is for things you can’t. But there’s a problem: the rule as it’s commonly taught is wrong, and it’s dulling our sense of what’s actually right.

Go here to read the rest.

Is the Oxford Comma Ungrammatical?

Few language issues inspire as much fervent debate as the question of whether you need a comma before the last item in a series, also known as the Oxford, Harvard, or serial comma. This is the comma that you sometimes see before and in lists, such as “I need you to go to the store and get bread, milk, and butter.” The Chicago Manual of Style, which is used by many book publishers and some academic journals, requires the comma. The AP Stylebook, which is used by newspapers and many magazines, omits the comma unless it’s necessary to avoid ambiguity.

It seems like such a trifling thing, yet it inspires impassioned debate among editors and writers on both sides of the issue. Last year the satirical news site the Onion joked about violence between the AP and Chicago gangs and wrote that “an innocent 35-year-old passerby who found himself caught up in a long-winded dispute over use of the serial, or Oxford, comma had died of a self-inflicted gunshot wound.”

But more recently, Walt Hickey at the FiveThirtyEight blog decided to approach the argument in a more scientific fashion, by polling readers on their preference. What he found was that readers are fairly split: 57 percent prefer the serial comma, while 43 percent dislike it. So while there’s a preference for the serial comma, at least among FiveThirtyEight readers, it’s not an overwhelming one.

Roy Peter Clark at the Poynter Institute followed up by arguing that journalists should adopt the serial comma, after which Sam Kirkland posted a poll on Poynter asking if the Associated Press should make the switch. Surprisingly, a whopping 71 percent of respondents said yes.

But none of this was good enough for Poynter blogger Andrew Beaujon. He took to Twitter to lay out his arguments against the serial comma. The first argument—that the serial comma is simply ungrammatical—is actually the easiest to refute. He says that since you can’t write “My wife, and I drove to work” (note the comma before the “and”), you can’t write “Bob, my wife, and I drove to work.” But he’s simply presupposing that there’s a rule that you can never have a comma before a conjunction in a list, which is obviously not true.

He’s also presupposing that lists of two items behave exactly like lists of three or more items, but this is also untrue. I can’t write “Bob, my wife drove to work” (at least not with the intended meaning), but I can certainly write “Bob, my wife and I drove to work.” By Beaujon’s logic, the fact that you add a third item doesn’t magically obviate the rule that you can’t use a comma to join two things together. (Of course, this is done all the time in headline style, but it’s not allowed in normal prose.) It’s clear that the structure of lists changes a bit when you have three items or more, but it is not clear that there is any grammatical rule forbidding commas.

His next point, that “the sentences people employ to show the need for a serial comma are usually ridiculous”, is weak at best. Yes, it’s true that most contrived example sentences are a little ridiculous, but that’s just a problem with contrived example sentences, not with the argument for the serial comma. And consider this example: “The highlights of his global tour include encounters with Nelson Mandela, an 800-year-old demigod and a dildo collector.” It may be a ridiculous sentence, but it’s real.

Beaujon’s final argument is that the serial comma arises from an urge to overpunctuate because we believe readers are too stupid to figure things out on their own. He says that “prescriptivism is not for [the readers’] benefit; its purpose is to make those of us in the publishing game to feel important and necessary”, but how is his own prescriptivism any different? He’s instructing writers and editors in comma rules and telling them that following his rule means they’re good writers. In other words, my writing is so clear that I don’t need a crutch like the serial comma; if you disagree, it’s only because you don’t trust readers and want to make yourself feel important.

But consider this: most people in both the FiveThirtyEight and Poynter polls prefer the serial comma. That means most readers prefer it. If they find it helpful, who are we to argue that it’s some sort of crutch of bad writers or source of job security for copy editors? And AP style does in fact use the serial comma to prevent ambiguity (though apparently not in the above example regarding the late Mr. Mandela), so what’s the harm in using it all the time?

Because the fact is that I often stumble over sentences that lack the serial comma. Even though I’m well aware that AP and other styles omit the comma before the “and”, I still tend to read “bacon and eggs” in “He made muffins, bacon and eggs” as a single item, not the final two items in a list. The sudden end of the list after “eggs” throws me because I was expecting something to follow it.

I suppose you could conclude that I’m an idiot that can’t work out writing on his own, but you could just as easily (and much more charitably) conclude that the serial comma really is helpful because it signals something about the structure of the sentence. In speech, we can rely on a speaker’s prosody—the rise and fall of pitch—to tell us where the syntactic units begin and end. In writing, we have to rely on punctuation marks to serve as signposts.

You can claim all you want that your writing is so clear that it can do without these signposts, but if you leave out too many, your readers may feel lost, wandering through meandering sentences without knowing where they’re going. Did your reader immediately understand what you wrote, or did they stumble, backtrack, and read it again before they got your message?

This isn’t to say that the serial comma is the one right way to punctuate. As we saw from the polls above, opinion on its use is still fairly divided. But rather than accusing your opponents of distrusting readers or being self-aggrandizing, you could take them at their word. Maybe there really are legitimate reasons to prefer the serial comma, just as there are legitimate reasons to prefer omitting it.

I find the arguments in favor of including the serial comma stronger than the arguments in favor of leaving it out, but I don’t pretend that my preference is an ironclad grammatical law or proof of my superiority. It’s just that—a preference. You are free to choose for yourself.

15% Off All T-Shirts

First, I apologize for not blogging in so long. It’s been a crazy summer, complete with a new baby (yay!), a new job (yay!), and moving to a new house (boo!). I’ve got a few posts in the works and hope to have something done soon.

Second, it’s time for another sale! Now through September 2, get 15 percent off all T-shirts in the Arrant Pedantry Store. Just use the code SHIRTS15 at checkout.

Do Usage Debates Make You Nauseous?

Several days ago, the Twitter account for the Chicago Manual of Style tweeted, “If you’re feeling sick, use nauseated rather than nauseous. Despite common usage, whatever is nauseous induces nausea.” The relevant entry in Chicago reads,

Whatever is nauseous induces a feeling of nausea—it makes us feel sick to our stomachs. To feel sick is to be nauseated. The use of nauseous to mean nauseated may be too common to be called error anymore, but strictly speaking it is poor usage. Because of the ambiguity of nauseous, the wisest course may be to stick to the participial adjectives nauseated and nauseating.

Though it seems like a straightforward usage tip, it’s based on some dubious motives and one rather strange assumption about language. It’s true that nauseous once meant causing nausea and that it has more recently acquired the sense of having nausea, but causing nausea wasn’t even the word’s original meaning in English. The word was first recorded in the early 17th century in the sense of inclined to nausea or squeamish. So you were nauseous not if you felt sick at the moment but if you had a sensitive stomach. This sense became obsolete in the late 17th century, supplanted by the causing nausea sense. The latter sense is the one that purists cling to, but it too is going obsolete.

I searched for nauseous in the Corpus of Contemporary American English and looked at the first 100 hits. Of those 100 hits, only one was used in the sense of causing nausea: “the nauseous tints and tinges of corruption.” The rest were all clearly used in the sense of having nausea—“I was nauseous” and “it might make you feel a little nauseous” and so on. Context is key: when nauseous is used with people, it means that they feel sick, but when it’s used with things, it means they’re sickening. And anyway, if nauseous is ambiguous, then every word with multiple meanings is ambiguous, including the word word, which has eleven main definitions as a noun in Merriam-Webster’s Collegiate. So where’s this ambiguity that Chicago warns of?

The answer is that there really isn’t any. In this case it’s nothing more than a red herring. Perhaps it’s possible to concoct a sentence that, lacking sufficient context, is truly ambiguous. But the corpus search shows that it just isn’t a problem, and thus fear of ambiguity can’t be the real reason for avoiding nauseous. Warnings of ambiguity are often used not to call attention to a real problem but to signal that a word has at least two senses or uses and that the author does not like one of them. Bryan Garner (the author of the above entry from Chicago), in his Modern American Usage, frequently warns of such “skunked” words and usually recommends avoiding them altogether. This may seem like sensible advice, but it seems to me to be motivated by a sense of jealousy—if the word can’t mean what the advice-giver wants it to mean, then no one can use it.

But the truly strange assumption is that words have meaning that is somehow independent of their usage. If 99 percent of the population uses nauseous in the sense of having nausea, then who’s to say that they’re wrong? Who has the authority to declare this sense “poor usage”? And yet Garner says, rather unequivocally, “Whatever is nauseous induces a feeling of nausea.” How does he know this is what nauseous means? It’s not as if there is some platonic form of words, some objective true meaning from which a word must never stray. After all, language changes, and an earlier form is not necessarily better or truer than a newer one. As Merriam-Webster editor Kory Stamper recently pointed out on Twitter, stew once meant “whorehouse”, and this sense dates to the 1300s. The food sense arose four hundred years later, in the 1700s. Is this poor usage because it’s a relative upstart supplanting an older established sense? Of course not.

People stopped using nauseous to mean “inclined to nausea” several hundred years ago, and so it no longer means that. Similarly, most people no longer use nauseous to mean “causing nausea”, and so that meaning is waning. In another hundred years, it may be gone altogether. For now, it hangs on, but this doesn’t mean that the newer and overwhelmingly more common sense is poor usage. The new sense is only poor usage inasmuch as someone says it is. In other words, it all comes down to someone’s opinion. As I’ve said before, pronouncements on usage that are based simply on someone’s opinion are ultimately unreliable, and any standard that doesn’t take into account near-universal usage by educated speakers in edited writing is doomed to irrelevance.

So go ahead and use nauseous. The “having nausea” sense is now thoroughly established, and it seems silly to avoid a perfectly good word just because a few peevers dislike it. Even if you stick to the more traditional “causing nausea” sense, you’re unlikely to confuse anyone, because context will make the meaning clear. Just be careful about people who make unsupported claims about language.

Celebrate T-Shirt Day with 15% Off

T-Shirt Day is June 21st, and in preparation for the big day, Spreadshirt is offering 15 percent off all T-shirts when you use the coupon code MYSHIRT2014 between now and June 10th. If you met me at the annual conferences of the American Copy Editors Society and liked my shirts, now’s a good chance to get one for yourself. Go check out what’s available in the Arrant Pedantry Store.

And if you’re not the word-nerd-T-shirt-buying type, don’t worry—a new post is coming soon.

Mother’s Day

Today is officially Mother’s Day, and as with other holidays with possessive or plural endings, there’s a lot of confusion about what the correct form of the name is. The creator of Mother’s Day in the United States, Anna Jarvis, specifically stated that it should be a singular possessive to focus on individual mothers rather than mothers in general. But as sociolinguist Matt Gordon noted on Twitter, “that logic is quite peccable”; though it’s a nice sentiment, it’s grammatical nonsense.

English has a singular possessive and a plural possessive; it does not have a technically-plural-but-focusing-on-the-singular possessive. Though Jarvis may have wanted everyone to focus on their respective mothers, the fact is that the day still celebrates all mothers. If I told you that tomorrow was Jonathon’s Day, you’d assume that it’s my day, not a day for all Jonathons that each of them celebrates separately. That’s simply not how grammatical number works in English. If you have more than one thing, it’s plural, even if you’re considering those things individually.

This isn’t the only holiday that employs some grammatically suspect reasoning in its official spelling—Veterans Day officially has no apostrophe because the day doesn’t technically belong to veterans. But this is silly—apostrophes are used for lots of things beyond simple ownership.

It could be worse, though. The US Board on Geographic Names discourages possessives altogether, though it allows the possessive s without an apostrophe. The peak named for Pike is Pikes Peak, which is worse than grammatical nonsense—it’s an officially enshrined error. The worst part is that there isn’t even a reason given for this policy, though presumably it’s because they don’t want to indicate private ownership of geographical features. (Again, the apostrophe doesn’t necessarily show ownership.) But in this case you can’t even argue that Pikes is a plural attributive noun, because there’s only one Pike for whom the peak is named.

The sad truth is that the people in charge of deciding where or whether to put apostrophes in things don’t always have the best grasp of grammar, and they don’t always think to consult someone who does. But even if the grammar of Mother’s Day makes me roll my eyes, I can still appreciate the sentiment. In the end, arguing about the placement of an apostrophe is a quibble. What matters most is what the day really means. And this day is for you, Mom.

Over Has Always Meant More Than. Get Over it.

Last month, at the yearly conference of the American Copy Editors Society, the editors of the AP Stylebook announced that over in the sense of more than was now acceptable. For decades, newspaper copy editors had been changing constructions like over three hundred people to more than three hundred people; now, with a word from AP’s top editors, that rule was being abandoned.

According to Merriam-Webster editor Peter Sokolowski, who was in attendance, the announcement was met with gasps. Editors quickly took to Twitter and to blogs to express their approval or dismay. Some saw it as part of the dumbing-down of the language or as a tacit admission that newspapers no longer have the resources to maintain their standards. Others saw it as the banishment of a baseless superstition that has wasted copy editors’ time without improving the text.

The argument had been that over must refer to spatial relationships and that numerical relationships must use more than. But nobody objects to other figurative uses of over, such as over the weekend or get over it or in over your head or what’s come over you? The rule forbidding the use of over to mean more than was first codified in the 1800s, but over can be found in this sense going back a thousand years or more, in some of the earliest documents written in English.

Not only that, but parallel uses can be found in other Germanic languages, including German, Dutch, and Swedish. (Despite all its borrowings from French, Latin, and elsewhere, English is considered a Germanic language.) There’s nothing wrong with the German Kinder über 14 Jahre (children over 14 years) (to borrow an example from the Collins German-English Dictionary) or the Swedish Över femhundra kom (more than five hundred came). This means that this use of over actually predates English and must have been inherited from the common ancestor of all the Germanic languages, Proto-Germanic, some two thousand years ago.

Mignon Fogarty, aka Grammar Girl, wrote that “no rationale exists for the ‘over can’t mean more than’ rule.” And in a post on the Merriam-Webster Unabridged blog, Sokolowski gave his own debunking, concluding that “we just don’t need artificial rules that do not promote the goal of clarity.” But none of this was good enough for some people. AP’s announcement caused a rift in the editing staff at Mashable, who debated the rule on the lifestyle blog.

Alex Hazlett argued that the rule “was an arbitrary style decision that had nothing to do with grammar, defensible only by that rationale of last resort: tradition.” Megan Hess, though, took an emotional and hyperbolic tack, claiming that following rules like this prevents the world from slipping into “a Lord of the Flies-esque dystopia.” From there her argument quickly becomes circular: “The distinction is one that distinguishes clean, precise language and attention to detail — and serves as a hallmark of a proper journalism training.” In other words, editors should follow the rule because they’ve been trained to follow the rule, and the rule is simply a mark of clean copy. And how do you know the copy is clean? Because it follows rules like this. As Sokolowski says, this is nothing more than a shibboleth—the distinction serves no purpose other than to distinguish those in the know from everyone else.

It’s also a perfect example of a mumpsimus. The story goes that an illiterate priest in the Middle Ages had learned to recite the Latin Eucharist wrong: instead of sumpsimus (Latin for “we have taken”), he said mumpsimus, which is not a Latin word at all. When someone finally told him that he’d been saying it wrong and that it should be sumpsimus, he responded that he would not trade his old mumpsimus for this person’s new sumpsimus. He didn’t just refuse to change—he refused to recognize that he was wrong and had always been wrong.

But so what if everyone’s been using over this way for longer than the English language has existed? Just because everyone does it doesn’t mean it’s right, right? Well, technically, yes, but let’s flip the question around: what makes it wrong to use over to mean more than? The fact that the over-haters have had such an emotional reaction is telling. It’s surprisingly easy to talk yourself into hating a particular word or phrase and to start judging everyone who allegedly misuses it. And once you’ve developed a visceral reaction to a perceived misuse, it’s hard to be persuaded that your feelings aren’t justified.

We editors take a lot of pride in our attention to language—which usually means our attention to the usage and grammar rules that we’ve been taught—so it can seem like a personal affront to be told that we were wrong and have always been wrong. Not only that, but it can shake our faith in other rules. If we were wrong about this, what else might we have been wrong about? But perhaps rather than priding ourselves on following the rules, we should pride ourselves on mastering them, which means learning how to tell the good rules from the bad.

Learning that you were wrong simply means that now you’re right, and that can only be a good thing.