Arrant Pedantry

The “Only” Comma, pt. 1

A little while ago, one of my coworkers came to me with a conundrum. She had come across a sentence like “Ryan founded the company with his brother Scott” in something she was editing, and she couldn’t figure out if “brother” should be followed by a comma. She’d already spent quite a bit of time trying to answer the question, but she was coming up empty-handed.

The problem? She didn’t know how many brothers Ryan had.

If you’re a little baffled by the relationship between commas and how many brothers someone has, you’ve probably never heard of restrictive and nonrestrictive appositives. An appositive is a word or phrase that follows another and modifies it or provides additional information. In this case, the name “Scott” is an appositive for “brother”; it tells you more about the brother’s identity.

Sometimes an appositive provides information that you need in order to understand the sentence, but sometimes it just provides information that’s helpful but not strictly necessary. The Chicago Manual of Style gives these two examples in section 5.23 (the appositives are “the poet” in the first and “Robert Burns” in the second):

Robert Burns, the poet, wrote many songs about women named Mary.
The poet Robert Burns wrote many songs about women named Mary.

In the first sentence, “the poet” simply provides extra information about Robert Burns, and it could be deleted without affecting the meaning of the sentence. But in the second, “Robert Burns” is necessary. If you cut it out, you wouldn’t know who “the poet” referred to. The former kind of appositive is often called nonrestrictive, while the latter is called restrictive. The second appositive restricts the reference of “the poet” to Robert Burns—that is, it specifies which poet we’re talking about. The first one doesn’t do that, so it’s called nonrestrictive.

The general rule, as it’s presented in The Chicago Manual of Style and elsewhere, is that if there’s more than one thing that the noun could refer to, then the appositive should be restrictive. That is, the appositive needs to specify which of the possible things we’re talking about. If there’s only one thing to which the appositive might refer, then it’s nonrestrictive.

For example, there’s been more than one poet in the history of the earth, so we need a restrictive appositive to tell us that the one in question is Robert Burns. Therefore, going back to my coworker’s problem, if Ryan has more than one brother, then his brother’s name should be restrictive to tell us which of his several brothers we’re talking about, but if he has only one brother, then it should be a nonrestrictive appositive (because there’s only one person that “his brother” could refer to, so the name is just extra information). For this reason, in his book Dreyer’s English, Benjamin Dreyer calls the comma before a nonrestrictive appositive the “only” comma. That is, a comma before “Scott” would tell you that he’s Ryan’s only brother. (Though if “Scott” appears in the middle of a sentence, as in “Ryan and his brother, Scott, founded a company”, then you would need commas on both sides of the appositive to set it off.)

The problem is that this forces editors to waste time doing genealogy work when we really should just be editing. My coworker had already spent who knows how long trying to figure out how many brothers Ryan had, but she couldn’t find anything definitive. So should she put in a comma or not?

I gave her a controversial opinion: I would leave the comma out, because it simply doesn’t matter how many brothers Ryan has. If it were relevant, why wouldn’t the writer have made it more explicit, as in “Ryan founded the company with his only brother, Scott”?

I’m not sure what my coworker ended up doing, but she didn’t seem happy with my heretical opinion on commas. Afterwards, I took to Twitter to voice my opinion that worrying about these commas is a waste of time. The ensuing discussion prompted a friend and fellow editor, Iva Cheung, to make the following cartoon, which she dedicated to me:

[Cartoon by Iva Cheung]

It may indeed sound ridiculous, but my coworker is far from the only editor or writer to have grappled with this problem. In a New Yorker piece on the magazine’s famously assiduous fact-checking, John McPhee writes about a similar dilemma. In a book draft, he had written, “Penn’s daughter Margaret fished in the Delaware.” But was that right? He writes, “Should there be commas around Margaret or no commas around Margaret? The presence or absence of commas would, in effect, say whether Penn had one daughter or more than one. The commas—there or missing there—were not just commas; they were facts.”

But as Jan Freeman, a former copyeditor, asked in a column for the Boston Globe, “Were they important facts?” She continues, “How much time should you spend finding the answer—commas or no commas—to a question nobody’s asking?”

That is, is any reader asking how many daughters William Penn had or how many brothers Ryan had? Or, to be more specific, is anyone thinking, “I wonder if the number of brothers Ryan has is exactly equal to one or is some unspecified number greater than one”? And even if they are, are they expecting that information to be communicated via a comma or the lack thereof? I suspected that most people who aren’t editors aren’t reading as much into those commas as we think we’re putting into them, so I turned to Facebook to ask my friends and family members. The results were pretty surprising.

I provided the following sentences and asked what people thought the difference was:

Frank and his brother Steve started a company.
Frank and his brother, Steve, started a company.

Some people said that you use the first sentence if the reader doesn’t know Steve and the second one if they do. Some people said that the latter was always correct and that the former is incorrect or at least more casual. But someone else said that the first sentence looked correct and that the second looked overpunctuated. Another person said that the second sentence gives more emphasis to Frank’s brother. Someone else said that the second implied that the name of Frank’s brother was being provided for the first time and possibly that it’s his only brother, while the first implied that we already know the name of Frank’s brother. But someone else said that she’d use commas if she went into business with one of her brothers, but she’d use no commas if she went into business with her one and only husband. A couple of people said that they thought the issue had to do with whether or not the information in the appositive was needed as a qualifier—that is, whether the sentence makes sense without it. Someone else thought that you don’t need commas if the appositive is short but that you do if it’s longer. Another commenter said that the rule probably varied from one style guide to another. But a few people said they’d read no difference between the two, and one friend responded simply with this gif:

I-DENTICAL!

Out of more than two dozen respondents, only a few answered with the editorially sanctioned explanation: that the first implies that Frank has multiple brothers, while the second implies that he has only one. One person posted this comment: “If a writer wants to convey that Frank has one brother or more, this is an awful way of sneaking in that information. If the information is irrelevant, then I think most readers will not notice the presence or absence of a comma, or conclude anything on that basis, and that’s just fine.”

I think that there are two connected issues here: what the comma means and whether it’s important to communicate that an appositive is the only thing in its class or one of multiple things in its class. And both of them are essentially questions of pragmatics.

Most people think of meaning as something that is simply inherent in words (or punctuation marks) themselves. Put in a comma, and the sentence means one thing. Leave it out, and it means something else. But meaning is a lot messier than this. It depends a lot on what the speaker or writer intends and on how the listener or reader receives it.

In other words, there are really three aspects to meaning: the basic meaning of the utterance itself, known as the locution; the intent of the writer or speaker, known as the illocution; and the way in which the listener or reader interprets the message, known as the perlocution. That is, meaning isn’t found only in the utterance itself; it’s found in the entire exchange between writer and reader.

As I explained in a previous post, sometimes there’s a mismatch between the intended meaning and the form of the utterance itself. For example, if I ask, “Do you know what time it is?”, I’m not literally just checking to see if you have knowledge of the time. I’m asking you to tell me the time, but I’m doing it in a slightly indirect way, because sometimes that’s more polite—maybe I don’t know if you have a watch or phone handy, so I don’t want to presume. In this case, we could say that the illocution (my intent) is “Tell me the time”, even though the locution itself is literally just asking if you know the time, not asking you to tell me the time. Even though my utterance has the form of a yes-or-no question, you’d probably only answer “Yes, I know what time it is” if you were trying to be a smart alec. But people are usually pretty good at reading each other’s intent, so the perlocution—the message you receive—is “Jonathon wants me to tell him the time.”

The comma example is supposedly straightforward. If the writer or editor intends for a comma to indicate that Ryan has only one brother, and if it’s an established convention that that comma indicates that the thing that comes after it is the only thing that the preceding noun could refer to, and if the reader gleans from that comma that Ryan has only one brother, then everything works just as it’s supposed to. But if, for example, the writer intends to communicate that someone has only one spouse but they leave out the comma, then sometimes smart-alecky readers or editors ignore the writer’s obvious intent and insist on an incorrect reading based on the absence of the comma. That is, they ignore the obvious illocution and deliberately misread the text based on a convention that may not be shared by everyone. They’re essentially pretending that meaning comes only from the locution and not from the writer’s intent.

For instance, I remember one time in my basic copyediting course in college when my professor pointed out a book dedication that read something like “To my wife Mary”. She said that the lack of a comma clearly means that the author is a polygamist. I think I was the only one in the class who didn’t laugh at the joke. I just thought it was stupid, because obviously we know that the author isn’t a polygamist. First off, polygamy isn’t legal in the US, so it’s a pretty safe assumption that the author has only one wife. Second, if he had really meant to dedicate the book to one of his multiple wives, he probably would have written something like “To my third wife, Mary”. Pretending to misunderstand someone based on a rule that most readers don’t even know just makes you look like a jerk.

And, judging from the responses I got on Facebook, it appears that most readers are indeed unfamiliar with the rule. Many of them don’t know what the comma is supposed to mean or even that it’s supposed to mean something. Whether the comma has no inherent meaning or has an unclear meaning, there’s a problem with the locution itself. The “only” comma simply isn’t an established convention for most readers.

But there’s a problem with the illocution too, and here’s where the other question of pragmatics comes into play. Conversation—even if it’s just the sort of one-way conversation that happens between a writer and a hypothetical reader—is generally guided by what linguists call the cooperative principle. And part of this principle is the idea that our contribution to the conversation will be relevant and will be communicated in an understandable manner.

As one of my commenters said, “If a writer wants to convey that Frank has one brother or more, this is an awful way of sneaking in that information.” So we end up with two pragmatic problems: editors are inserting irrelevant information into the text, but readers don’t even pick up on that information because they’re unaware of the convention or don’t anticipate what the editor is trying to communicate. Even when they try to guess the editor’s intent (because it’s almost always the editor putting in or taking out the comma, not the writer), they often guess wrong, because it’s not obvious why someone would be trying to sneak in information like “Ryan has only one brother” in this manner. In effect, the two problems cancel out, and all we’ve done is waste time and possibly annoy our writers and waste their time as well.

And because so few of our readers understand the purpose of the “only” comma, I think it falls firmly into what John McIntyre calls “dog-whistle editing”, which he defines as “attention to distinctions of usage”—or, in this case, punctuation—“that only other copy editors can hear.”

And, as Jan Freeman showed in her Boston Globe column, there’s evidence that this rule is a relatively recent invention. No wonder readers don’t know what the “only” comma means—it’s a convention that editors just made up. And, for the record, I’m not saying that the whole restrictive/nonrestrictive distinction is bunk, but I do think that the “only” comma is the result of an overly literal interpretation of that distinction. (But I’ll save the exploration of the rule’s origins for a future post.)

For now, I think that the solution, as I told my coworker, is to just stop worrying about it. It almost never matters whether someone is someone else’s only brother or daughter or friend or whether a book is someone’s only book, and it’s certainly not worth the time we spend trying to track down that information. Editing is fundamentally about helping the writer communicate with the reader, and I don’t think this rule serves that purpose. Let’s put the dog whistle away and worry about things that actually matter.

For Whomever the Bell Tolls

A couple of weeks ago, Ben Yagoda wrote a post on Lingua Franca in which he confessed to being a whomever scold. He took a few newspapers to task for messing up and using whomever where whoever was actually called for, and then he was taken to task himself by Jan Freeman. He said that “Whomever Mr. Trump nominates will inherit that investigation” should have the subject form, whoever, while she said that the object form, whomever, was indeed correct. So what’s so tricky about whoever that even experts disagree about how to use it?

To answer that, we need to back up and explain why who trips so many people up. Who is an interrogative and relative pronoun, which means that it’s used to ask questions and to form relative clauses. One feature of both questions and relative clauses is that they cause movement—that is, the pronoun moves from where it would normally be in a declarative sentence to a position at the beginning of the clause. For example, a sentence like You gave it to him becomes Who did you give it to? when made into a question, with the personal pronoun him being changed to who and moved to the front. Or a pair of sentences like I gave it to the woman. I met her at the conference becomes I gave it to the woman who I met at the conference. Again, the personal pronoun her is replaced with who and moved up.

Technically, both of these examples should use whom, because in both cases it’s replacing an object pronoun, and whom is the object form of who. But we often have trouble keeping track of the syntactic role of who(m) when it moves, so many people just use who regardless of whether it’s syntactically a subject or object. Sometimes people overcorrect and use whom where it’s syntactically a subject, as in Whom may I say is calling?

Whoever adds another layer of complexity. It’s what we call a fused relative pronoun—it functions as both the relative pronoun and its own antecedent. Let’s go back to our example above: I gave it to the woman who I met at the conference. The antecedent of who is the woman. But we can replace both with whoever: I gave it to whoever I met at the conference.

Because a fused relative functions as its own antecedent, it fills roles in two different clauses—the main clause and the relative clause. And whereas a simple relative like who is always just a subject or an object in the relative clause, whoever can be both a subject and an object simultaneously thanks to its dual roles. There are four possible combinations:

  1. Subject of main clause, subject of relative clause: Whoever ate the last cookie is in trouble.
  2. Object in main clause, subject of relative clause: I’ll give the last cookie to whoever wants it.
  3. Subject of main clause, object in relative clause: Whoever you gave the last cookie to is lucky.
  4. Object in main clause, object in relative clause: I’ll give the last cookie to whoever I like the most.

So if whoever can fill two different roles in two different clauses, how do we decide whether to use the subject or object form? Which role wins out?

The traditional rule is that the role in the relative clause wins. If it’s the subject of the relative clause, use whoever. If it’s the object of the relative clause, use whomever. This means that the prescribed forms in the sentences above would be (1) whoever, (2) whoever, (3) whomever, and (4) whomever.

The rationale for this rule is that the relative clause as a whole functions as the subject or as an object within the main clause. That is, the relative clause is treated as a separate syntactic unit, and that unit is then slotted into the main clause. Thus it doesn’t matter if whoever follows a verb or a preposition—the only thing that matters is its role in the relative clause.

I think this is easier to understand with sentence diagrams. Note that in the diagram below, whoever occupies a place in two different structures—it’s simultaneously the complement of the preposition to and the subject of the relative clause. Syntax diagrams normally branch, but in this case they converge because whoever fuses those two roles together.
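For readers who find brackets easier than trees, sentence (2) from the list above can be sketched roughly like this, with __ marking the spot where whoever is understood inside the relative clause:

I’ll give the last cookie to [whoever [__ wants it]]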

Grammatical case is governed by the word on which the pronoun is dependent, so we can think of case assignment as coming down from the verb or preposition to the pronoun. In the diagram above, the case assignment for whoever (represented by the arrows) comes from its role in the relative clause. Normally the preposition to would assign case to its complement, but in this situation it’s blocked, because case has already been assigned at the level of the relative clause.

Of course, case in English has been a mess ever since the Norman Conquest. English went from being a highly inflected language that marked case on all nouns, pronouns, and adjectives to a minimally inflected language that marks case only on a small handful of pronouns. Our internal rules governing pronoun case seem to have broken down to some extent, leading to a number of constructions where subject and object forms are in alternation, such as between you and I or me and her went to the store. The Oxford English Dictionary has examples of whomever being used for whoever going all the way back to John Wyclif in 1380 and examples of whoever being used for whomever going back to Shakespeare in 1599.

Which brings us back to Yagoda’s original post. The sentence that brought correction from Jan Freeman was “Whomever Mr. Trump nominates will inherit that investigation.” Yagoda said it should be whoever; Freeman said it was correct as is. Yagoda eventually conceded that he was wrong and that the Times sentence was right, but not before a side trip into let him who. Freeman linked to this post by James Harbeck, in which he explains that constructions like let him who is without sin don’t work quite the same way as let whoever is without sin.

A lot of people have learned that the clause with whoever essentially functions as the object of let, but many people then overextend that rule and say that the entire construction he who is without sin is the object of let. To understand why it’s not, let’s use another syntax diagram.
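In the same rough bracket notation, the structure is something like this:

Let [him [who is without sin]] cast the first stone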

Note the differences between this diagram and the previous one. Him is the object of the verb let, and who is . . . is a relative clause that modifies him. But, crucially, him is not part of that clause; it’s merely the antecedent to the relative pronoun. Its case assignment comes from the verb let, while the case assignment of who comes from its role in the relative clause.

For he to be the correct form here, its case would have to be controlled by the verb in a relative clause that it’s not even a part of. Case assignment essentially flows downhill from the structure above the pronoun; it doesn’t flow uphill to a structure above it.

But apparently Harbeck’s post wasn’t enough to convince Yagoda. While admitting that he didn’t understand Harbeck’s argument, he nevertheless said he disagreed with it and declared that he was on team “Let he who is without sin . . .”

Some of the commenters on Yagoda’s post, though, had an elegant test to show that Yagoda was wrong without resorting to syntax trees or discussions of case assignment: simply remove the relative clause. In Let him cast the first stone, it’s clear that him is an object. The relative clause may add extra information about who exactly is casting the first stone, but it’s grammatically optional and thus shouldn’t affect the case of its antecedent.

In conclusion, case in English is a bit of a mess and a syntactic analysis can help, but sometimes the simplest solutions are best.

Prescriptivism and Language Change

Recently, John McIntyre posted a video to the Baltimore Sun’s Facebook page in which he defended the unetymological use of decimate. When he shared it to his own Facebook page, a lively discussion ensued, including this comment:

Putting aside all the straw men, the ad absurdums, the ad hominems and the just plain sillies, answer me two questions:
1. Why are we so determined that decimate, having once changed its meaning to a significant portion of the population, must be used to mean obliterate and must never be allowed to change again?
2. Is your defence of the status quo on the word not at odds with your determination that it is a living language?
3. If the word were to have been invented yesterday, do you really think “destroy” is the best meaning for it?
…three questions!

Putting aside all the straw men in these questions themselves, let’s get at what he’s really asking, which is, “If decimate changed once before from ‘reduce by one-tenth’ to ‘reduce drastically’, why can’t it change again to the better, more etymological meaning?”

I’ve seen variations on this question pop up multiple times over the last few years when traditional rules have been challenged or debunked. It seems that the notions that language changes and that such change is normal have become accepted by many people, but some of those people then turn around and ask, “So if language changes, why can’t we change it in the way I want?” For example, some may recognize that the that/which distinction is an invention that’s being forced on the language, but they may believe that this is a good change that increases clarity.

On the surface, this seems like a reasonable question. If language is arbitrary and changeable, why can’t we all just decide to change it in a positive way? After all, this is essentially the rationale behind the movements that advocate bias-free or plain language. But whereas those movements are motivated by social or cognitive science and have measurable benefits, this argument in favor of old prescriptive rules is just a case of motivated reasoning.

The bias-free and plain language movements are based on the premises that people deserve to be treated equally and that language should be accessible to its audience. Arguing that decimated really should mean “reduced by one-tenth” is based on a desire to hang on to rules that one was taught in one’s youth. It’s an entirely post hoc rationale, because it’s only employed to defend bad rules, not to determine the best meaning for or use of every word. For example, if we really thought that narrower etymological senses were always better, shouldn’t we insist that cupboard only be used to refer to a board on which one places cups?

This argument is based in part on a misunderstanding of what the descriptivist/prescriptivist debate is all about. Nobody is insisting that decimate must mean “obliterate”, only observing that it is used in the broader sense far more often than the narrower etymological sense. Likewise, no one is insisting that the word must never be allowed to change again, only noting that it is unlikely that the “destroy one-tenth” sense will ever be the dominant sense. Arguing against a particular prescription is not the same as making the opposite prescription.

But perhaps more importantly, this argument is based on a fundamental misunderstanding of how language change works. As Allan Metcalf said in a recent Lingua Franca post, “It seems a basic principle of language that if an expression is widely used, that must be because it is widely useful. People wouldn’t use a word if they didn’t find it useful.” And as Jan Freeman has said, “we don’t especially need a term that means ‘kill one in 10.’” That is, the “destroy one-tenth” sense is not dominant precisely because it is not useful.

The language changed when people began using the word in a more useful way, or to put it more accurately, people changed the language by using the word in a more useful way. You can try to persuade them to change back by arguing that the narrow meaning is better, but this argument hasn’t gotten much traction in the 250 years since people started complaining about the broader sense. (The broader sense, unsurprisingly, dates back to the mid-1600s, meaning that English speakers were using it for a full two centuries before someone decided to be bothered by it.)

But even if you succeed, all you’ll really accomplish is driving decimate out of use altogether. Just remember that death is also a kind of change.

Book Review: Perfect English Grammar

Disclosure: I received a free review PDF of this book from Callisto Media.

Grant Barrett, cohost of the public radio program A Way with Words, recently published a book called Perfect English Grammar: The Indispensable Guide to Excellent Writing and Speaking. In it, Barrett sets out to help writers like himself who may not have gotten the best education in grammar or composition in school, ranging from middle-school students to “business professionals and community leaders who need a refresher on grammar points they last thought about decades ago.”

The book is designed as a reference book, something to be pulled out and consulted in those moments when you can’t remember the difference between a present perfect and a past perfect or between an initialism and a conjunction. The book is well organized, with chapters like “Verbs” broken down into topics like person, number, mood, linking verbs, and so on. The different topics are also very clearly marked, with bold colors and clear headings that make it easy to flip through in case you’d rather browse than use the table of contents or index.

Barrett starts with some general principles of writing like writing for your audience rather than yourself, avoiding using a thesaurus to learn fancy new words, and sticking to whichever style guide is appropriate in your field. He then moves on to the basics of composition, with a reminder to be aware of register and some good tips for getting started if you’re feeling stuck.

One weak spot in the chapter on composition was the section on paragraph and essay structure. Though Barrett says that paragraphs don’t have to be a certain length, he says that a paragraph should have a topic sentence, supporting sentences, and a conclusion sentence, and he explains that the classic five-paragraph essay has a similar structure. I’ve never been a fan of the five-paragraph essay as a way to teach composition. Perhaps it’s a necessary stepping-stone on the way to better composition, but to me it always felt more like a straitjacket, designed to keep students from hurting themselves and their teachers. But the chapter ends with some good advice on writing transitions, avoiding common mistakes, and having your work edited.

The later chapters on parts of speech, spelling and style, and sentence structure provide helpful introductions or refreshers to the topics, and I like that Barrett uses more current linguistic terminology. For example, he talks about verb tense and aspect rather than just tense (though I think the explanation of aspect could have been a little clearer), and he groups articles, possessives, quantifiers, and others under determiners. He also defends the passive voice, saying, “Both active and passive voices are essential to everyday writing and speaking. Broadside suggestions that you should avoid the passive voice are misguided and should be ignored.”

Though his treatment of various aspects of grammar is sometimes a little brief, he uses grammar mostly as a way to talk about frequent problem areas for novice writers, and this is where the book is most valuable. You have to have at least a basic understanding of what an independent clause is before you can identify a comma splice, and you have to be able to identify a subject and verb and be aware of some common tricky areas before you can identify a subject-verb agreement problem.

However, I found a few pieces of usage advice a little less helpful. For instance, Barrett advocates the singular they (which I was happy to see) but warns against sentential hopefully—even though it is, as he says, fully grammatical—because some people have been erroneously taught to dislike it. He also recommends following the rule requiring the strict placement of only, which Jan Freeman (among others) addressed here. In that column, published in 2009, Freeman asked for readers to send her examples of truly ambiguous onlys. I was apparently the first person to send her such an example, nearly five years after her column was published.

Most of the usage advice, though, is solid, and some of it is even quite refreshing, like this passage in which he addresses the usual advice about avoiding adverbs: “There is nothing whatsoever intrinsically wrong with adverbs. In fact, avoiding them leads to bland, forgettable writing. You can and should use adverbs.” My biggest complaint with the chapter on usage and style is simply that it is too short; there are many more usage items that a novice writer may need help with that aren’t covered here.

Despite these quibbles, I think the book is full of good advice that will be helpful to both novices and more experienced writers who may need a refresher on basic topics of grammar, usage, and style.

Why Is It “Woe Is Me”?

I recently received an email asking about the expression woe is me, namely what the plural would be and why it’s not woe am I. Though the phrase may strike modern speakers as bizarre if not downright ungrammatical, there’s actually a fairly straightforward explanation: it’s an archaic dative expression. Strange as it may seem, the correct form really is woe is me, not woe am I or woe is I, and the first-person plural would simply be woe is us. I’ll explain why.

Today English only has three cases—nominative (or subjective), objective, and genitive (or possessive)—and these cases only apply to personal pronouns and who. Old English, on the other hand, had four cases (and vestiges of a fifth), and they applied to all nouns, pronouns, and adjectives. Among these four were two different cases for objects: accusative and dative. (The forms that we now think of simply as object pronouns actually descend from the dative pronouns, though they now cover the functions of both the accusative and dative.) These correspond roughly to direct and indirect objects, respectively, though they could be used in other ways too.

For instance, some prepositions took accusative objects, and some took dative objects (and some took either depending on the meaning). Nouns and pronouns in the accusative and dative cases could also be used in ways that seem strange to modern speakers. The dative, for example, could be used in places where we would normally use to and a pronoun. In some constructions we still have the choice between a pronoun or to and a pronoun—think of how you can say either I gave her the ball or I gave the ball to her—but in Old English you could do this to a much greater degree.

In the phrase woe is me, woe is the subject and me is a dative object, something that isn’t allowed in English today. It really means woe is to me. Today the phrase woe is me is pretty fixed, but some past variations on the phrase make the meaning a little clearer. Sometimes it was used with a verb, and sometimes woe was simply followed by a noun or prepositional phrase. In the King James Bible, we find “If I be wicked, woe unto me” (Job 10:15). One example from Old English reads, “Wa biþ þonne þæm mannum” (woe be then [to] those men).

So “woe is I” is not simply a fancy or archaic way of saying “I am woe” and is thus not parallel to constructions like “it is I”, where the nominative form is usually prescribed and the objective form is proscribed. In “woe is me”, “me” is not a subject complement (also known as a predicative complement) but a type of dative construction.

Thus the singular is is always correct, because it agrees with the singular mass noun woe. And though we don’t have distinct dative pronouns anymore, you can still use any pronoun in the object case, so woe is us would also be correct.

Addendum: Arika Okrent, writing at Mental Floss, has also just posted a piece on this construction. She goes into a little more detail on related constructions in English, German, and Yiddish.

And here are a couple of articles by Jan Freeman from 2007, specifically addressing Patricia O’Conner’s Woe Is I and a column by William Safire on the phrase:

Woe Is Us, Part 1
Woe Is Us, Continued

12 Mistakes Nearly Everyone Who Writes About Grammar Mistakes Makes

There are a lot of bad grammar posts in the world. These days, anyone with a blog and a bunch of pet peeves can crank out a click-bait listicle of supposed grammar errors. There’s just one problem—these articles are often full of mistakes of one sort or another themselves. Once you’ve read a few, you start noticing some patterns. Inspired by a recent post titled “Grammar Police: Twelve Mistakes Nearly Everyone Makes”, I decided to make a list of my own.

1. Confusing grammar with spelling, punctuation, and usage. Many people who write about grammar seem to think that grammar means “any sort of rule of language, especially writing”. But strictly speaking, grammar refers to the structural rules of language, namely morphology (basically the way words are formed from roots and affixes), phonology (the system of sounds in a language), and syntax (the way phrases and clauses are formed from words). Most complaints about grammar are really about punctuation, spelling (such as problems with you’re/your and other homophone confusion), or usage (which is often about semantics). This post, for instance, spends two of its twelve points on commas and a third on quotation marks.

2. Treating style choices as rules. This article says that you should always use an Oxford (or serial) comma (the comma before and or or in a list) and that quotation marks should always follow commas and periods, but the latter is true only in most American styles (linguists often put the commas and periods outside quotes, and so do many non-American styles), and the former is only true of some American styles. I may prefer serial commas, but I’m not going to insist that everyone who doesn’t use them is making a mistake. It’s simply a matter of style, and style varies from one publisher to the next.

3. Ignoring register. There’s a time and a place for following the rules, but the writers of these lists typically treat English as though it had only one register: formal writing. They ignore the fact that following the rules in the wrong setting often sounds stuffy and stilted. Formal written English is not the only legitimate form of the language, and the rules of formal written English don’t apply in all situations. Sure, it’s useful to know when to use who and whom, but it’s probably more useful to know that saying To whom did you give the book? in casual conversation will make you sound like a pompous twit.

4. Saying that a disliked word isn’t a word. You may hate irregardless (I do), but that doesn’t mean it’s not a word. If it has its own meaning and you can use it in a sentence, guess what—it’s a word. Flirgle, on the other hand, is not a word—it’s just a bunch of sounds that I strung together in word-like fashion. Irregardless and its ilk may not be appropriate for use in formal registers, and you certainly don’t have to like them, but as Stan Carey says, “‘Not a word’ is not an argument.”

5. Turning proposals into ironclad laws. This one happens more often than you think. A great many rules of grammar and usage started life as proposals that became codified as inviolable laws over the years. The popular that/which rule, which I’ve discussed at length before, began as a proposal—not “everyone gets this wrong” but “wouldn’t it be nice if we made a distinction here?” But nowadays people have forgotten that a century or so ago, this rule simply didn’t exist, and they say things like “This is one of the most common mistakes out there, and understandably so.” (Actually, no, you don’t understand why everyone gets this “wrong”, because you don’t realize that this rule is a relatively recent invention by usage commentators that some copy editors and others have decided to enforce.) It’s easy to criticize people for not following rules that you’ve made up.

6. Failing to discuss exceptions to rules. Invented usage rules often ignore the complexities of actual usage. Lists of rules such as these go a step further and often ignore the complexities of those rules. For example, even if you follow the that/which rule, you need to know that you can’t use that after a preposition or after the demonstrative pronoun that—you have to use a restrictive which. Likewise, the less/fewer rule is usually reduced to statements like “use fewer for things you can count”, which leads to ugly and unidiomatic constructions like “one fewer thing to worry about”. Affect and effect aren’t as simple as some people make them out to be, either; affect is usually a verb and effect a noun, but affect can also be a noun (with stress on the first syllable) referring to the outward manifestation of emotions, while effect can be a verb meaning to cause or to make happen. Sometimes dumbing down rules just makes them dumb.

7. Overestimating the frequency of errors. The writer of this list says that misuse of nauseous is “Undoubtedly the most common mistake I encounter.” This claim seems worth doubting to me; I can’t remember the last time I heard someone say “nauseous”. Even if you consider it a misuse, it’s got to rate pretty far down the list in terms of frequency. This is why linguists like to rely on data for testable claims—because people tend to fall prey to all kinds of cognitive biases such as the frequency illusion.

8. Believing that etymology is destiny. Words change meaning all the time—it’s just a natural and inevitable part of language. But some people get fixated on the original meanings of some words and believe that those are the only correct meanings. For example, they’ll say that you can only use decimate to mean “to destroy one in ten”. This may seem like a reasonable argument, but it quickly becomes untenable when you realize that almost every single word in the language has changed meaning at some point, and that’s just in the few thousand years in which language has been written or can be reconstructed. And sometimes a new meaning is more useful anyway (which is precisely why it displaced an old meaning). As Jan Freeman said, “We don’t especially need a term that means ‘kill one in 10.’”

9. Simply bungling the rules. If you’re going to chastise people for not following the rules, you should know those rules yourself and be able to explain them clearly. You may dislike singular they, for instance, but you should know that it’s not a case of subject-predicate disagreement, as the author of this list claims—it’s an issue of pronoun-antecedent agreement, which is not the same thing. This list says that “‘less’ is reserved for hypothetical quantities”, but this isn’t true either; it’s reserved for noncount nouns, singular count nouns, and plural count nouns that aren’t generally thought of as discrete entities. Use of less has nothing to do with being hypothetical. And this one says that punctuation always goes inside quotation marks. In most American styles, it’s only commas and periods that always go inside. Colons, semicolons, and dashes always go outside, and question marks and exclamation marks only go inside sometimes.

10. Saying that good grammar leads to good communication. Contrary to popular belief, bad grammar (even using the broad definition that includes usage, spelling, and punctuation) is not usually an impediment to communication. A sentence like Ain’t nobody got time for that is quite intelligible, even though it violates several rules of Standard English. The grammar and usage of nonstandard varieties of English are often radically different from Standard English, but different does not mean worse or less able to communicate. The biggest differences between Standard English and all its nonstandard varieties are that the former has been codified and that it is used in all registers, from casual conversation to formal writing. Many of the rules that these lists propagate are really more about signaling to the grammatical elite that you’re one of them—not that this is a bad thing, of course, but let’s not mistake it for something it’s not. In fact, claims about improving communication are often just a cover for the real purpose of these lists, which is . . .

11. Using grammar to put people down. This post sympathizes with someone who worries about being crucified by the grammar police and then says a few paragraphs later, “All hail the grammar police!” In other words, we like being able to crucify those who make mistakes. Then there are the put-downs about people’s education (“You’d think everyone learned this rule in fourth grade”) and more outright insults (“5 Grammar Mistakes that Make You Sound Like a Chimp”). After all, what’s the point in signaling that you’re one of the grammatical elite if you can’t take a few potshots at the ignorant masses?

12. Forgetting that correct usage ultimately comes from users. The disdain for the usage of common people is symptomatic of a larger problem: forgetting that correct usage ultimately comes from the people, not from editors, English teachers, or usage commentators. You’re certainly entitled to have your opinion about usage, but at some point you have to recognize that trying to fight the masses on a particular point of usage (especially if it’s a made-up rule) is like trying to fight the rising tide. Those who have invested in learning the rules naturally feel defensive of them and of the language in general, but you have no more right to the language than anyone else. You can be restrictive if you want and say that Standard English is based on the formal usage of educated writers, but any standard that is based on a set of rules that are simply invented and passed down is ultimately untenable.

And a bonus mistake:

13. Making mistakes themselves. It happens to the best of us. The act of making grammar or spelling mistakes in the course of pointing out someone else’s mistakes even has a name, Muphry’s law. This post probably has its fair share of typos. (If you spot one, feel free to point it out—politely!—in the comments.)

This post also appears on Huffington Post.

What Descriptivism Is and Isn’t

A few weeks ago, the New Yorker published what is nominally a review of Henry Hitchings’ book The Language Wars (which I still have not read but have been meaning to) but which was really more of a thinly veiled attack on what its author, Joan Acocella, sees as the moral and intellectual failings of linguistic descriptivism. In what John McIntyre called “a bad week for Joan Acocella”, the whole mess was addressed multiple times by various bloggers and other writers.* I wanted to write about it at the time but was too busy, but then the New Yorker did me a favor by publishing a follow-up, “Inescapably, You’re Judged by Your Language”, which was equally off-base, so I figured that the door was still open.

I suspected from the first paragraph that Acocella’s article was headed for trouble, and the second paragraph quickly confirmed it. For starters, her brief description of the history and nature of English sounds like it’s based more on folklore than fact. A lot of people lived in Great Britain before the Anglo-Saxons arrived, and their linguistic contributions were effectively nil. But that’s relatively small stuff. The real problem is that she doesn’t really understand what descriptivism is, and she doesn’t understand that she doesn’t understand, so she spends the next five pages tilting at windmills.

Acocella says that descriptivists “felt that all we could legitimately do in discussing language was to say what the current practice was.” This statement is far too narrow, and not only because it completely leaves out historical linguistics. As a linguist, I think it’s odd to describe linguistics as merely saying what the current practice is, since it makes it sound as though all linguists study is usage. Do psycholinguists say what the current practice is when they do eye-tracking studies or other psychological experiments? Do phonologists or syntacticians say what the current practice is when they devise abstract systems of ordered rules to describe the phonological or syntactic system of a language? What about experts in translation or first-language acquisition or computational linguistics? Obviously there’s far more to linguistics than simply saying what the current practice is.

But when it does come to describing usage, we linguists love facts and complexity. We’re less interested in declaring what’s correct or incorrect than we are in uncovering all the nitty-gritty details. It is true, though, that many linguists are at least a little antipathetic to prescriptivism, but not without justification. Because we linguists tend to deal in facts, we take a rather dim view of claims about language that don’t appear to be based in fact, and, by extension, of the people who make those claims. And because many prescriptions make assertions that are based in faulty assumptions or spurious facts, some linguists become skeptical or even hostile to the whole enterprise.

But it’s important to note that this hostility is not actually descriptivism. It’s also, in my experience, not nearly as common as a lot of prescriptivists seem to assume. I think most linguists don’t really care about prescriptivism unless they’re dealing with an officious copyeditor on a manuscript. It’s true that some linguists do spend a fair amount of effort attacking prescriptivism in general, but again, this is not actually descriptivism; it’s simply anti-prescriptivism.

Some other linguists (and some prescriptivists) argue for a more empirical basis for prescriptions, but this isn’t actually descriptivism either. As Language Log’s Mark Liberman argued here, it’s just prescribing on the basis of evidence rather than personal taste, intuition, tradition, or peevery.

Of course, all of this is not to say that descriptivists don’t believe in rules, despite what the New Yorker writers think. Even the most anti-prescriptivist linguist still believes in rules, but not necessarily the kind that most people think of. Many of the rules that linguists talk about are rather abstract schematics that bear no resemblance to the rules that prescriptivists talk about. For example, here’s a rather simple one, the rule describing intervocalic alveolar flapping (in a nutshell, the process by which a word like latter comes to sound like ladder) in some dialects of English:

[Rule notation: intervocalic alveolar flapping]
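In a simplified, textbook-style notation (a rough sketch rather than a full analysis), the rule can be written something like this:

/t, d/ → [ɾ] / V́ __ V (where the second vowel is unstressed)

That is, an alveolar stop is pronounced as a flap when it falls between a stressed vowel and a following unstressed vowel, which is why latter and ladder sound alike for many speakers of American English.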

Rules like these constitute the vast bulk of the language, though they’re largely subconscious and unseen, like a sort of linguistic dark matter. The entire canon of prescriptions (my advisor has identified at least 10,000 distinct prescriptive rules in various handbooks, though only a fraction of these are repeated) seems rather peripheral and inconsequential to most linguists, which is another reason why we get annoyed when prescriptivists insist on their importance or identify standard English with them. Despite what most people think, standard English is not really defined by prescriptive rules, which makes it somewhat disingenuous and ironic for prescriptivists to call us hypocrites for writing in standard English.

If there’s anything disingenuous about linguists’ belief in rules, it’s that we’re not always clear about what kinds of rules we’re talking about. It’s easy to say that we believe in the rules of standard English and good communication and whatnot, but we’re often pretty vague about just what exactly those rules are. But that’s probably a topic for another day.

*A roundup of some of the posts on the recent brouhaha:

“Cheap Shot”, “A Bad Week for Joan Acocella”, “Daddy, Are Prescriptivists Real?”, and “Unmourned: The Queen’s English Society” by John McIntyre

“Rules and Rules” and “A Half Century of Usage Denialism” by Mark Liberman

“Descriptivists as Hypocrites (Again)” by Jan Freeman

“Ignorant Blathering at The New Yorker”, by Stephen Dodson, aka Languagehat

“Re: The Language Wars” and “False Fronts in the Language Wars” by Steven Pinker

“The New Yorker versus the Descriptivist Specter” by Ben Zimmer

“Speaking Truth about Power” by Nancy Friedman

“Sator Resartus” by Ben Yagoda

I’m sure there are others that I’ve missed. If you know of any more, feel free to make note of them in the comments.

Scriptivists Revisited

Before I begin: I know—it’s been a terribly, horribly, unforgivably long time since my last post. Part of it is that I’m often busy with grad school and work and family, and part of it is that I’ve been thinking an awful lot lately about prescriptivism and descriptivism and linguists and editors and don’t really know where to begin.

I know that I’ve said some harsh things about prescriptivists before, but I don’t actually hate prescriptivism in general. As I’ve said before, prescriptivism and descriptivism are not really diametrically opposed, as some people believe they are. Stan Carey explores some of the common ground between the two in a recent post, and I think there’s a lot more to be said about the issue.

I think it’s possible to be a descriptivist and prescriptivist simultaneously. In fact, I think it’s difficult if not impossible to fully disentangle the two approaches. The fact is that many or most prescriptive rules are based on observed facts about the language, even though those facts may be incomplete or misunderstood in some way. Very seldom does anyone make up a rule out of whole cloth that bears no resemblance to reality. Rules often arise because someone has observed a change or variation in the language and is seeking to slow or reverse that change (as in insisting that “comprised of” is always an error) or to regularize the variation (as in insisting that “which” be used for nonrestrictive relative clauses and “that” for restrictive ones).

One of my favorite language blogs, Motivated Grammar, declares “Prescriptivism must die!” but to be honest, I’ve never quite been comfortable with that slogan. Now, I love a good debunking of language myths as much as the next guy—and Gabe Doyle does a commendable job of it—but not all prescriptivism is a bad thing. The impulse to identify and fix potential problems with the language is a natural one, and it can be used for both good and ill. Just take a look at the blogs of John E. McIntyre, Bill Walsh, and Jan Freeman for examples of well-informed, sensible language advice. Unfortunately, as linguists and many others know, senseless language advice is all too common.

Linguists often complain about and debunk such bad language advice—and rightly so, in my opinion—but I think in doing so they often make the mistake of dismissing prescriptivism altogether. Too often linguists view prescriptivism as an annoyance to be ignored or as a rival approach that must be quashed, but either way they miss the fact that prescriptivism is a metalinguistic phenomenon worth exploring and understanding. And why is it worth exploring? Because it’s an essential part of how ordinary speakers—and even linguists—use language in their daily lives, whether they realize it or not.

Contrary to what a lot of linguists say, language isn’t really a natural phenomenon—it’s a learned behavior. And as with any other human behavior, we generally strive to make our language match observed standards. Or as Emily Morgan so excellently says in a guest post on Motivated Grammar, “Language is something that we as a community of speakers collectively create and reinvent each time we speak.” She says that this means that language is “inextricably rooted in a descriptive generalization about what that community does,” but it also means that it is rooted in prescriptive notions of language. Because when speakers create and reinvent language, they do so by shaping their language to fit listeners’ expectations.

That is, for the most part, there’s no difference in speakers’ minds between what they should do with language and what they do do with language. They use language the way they do because they feel as though they should, and this in turn reinforces the model that influences everyone else’s behavior. I’ve often reflected on the fact that style guides like The Chicago Manual of Style will refer to dictionaries for spelling issues—thus prescribing how to spell—but these dictionaries simply describe the language found in edited writing. Description and prescription feed each other in an endless loop. This may not be mathematical logic, but it is a sort of logic nonetheless. Philosophers love to say that you can’t derive an ought from an is, and yet people do nonetheless. If you want to fit in with a certain group, then you should behave in such a way as to be accepted by that group, and that group’s behavior is simply an aggregate of the behaviors of everyone else trying to fit in.

And at this point, linguists are probably thinking, “And people should be left alone to behave the way they wish to behave.” But leaving people alone means letting them decide which behaviors to favor and which to disfavor—that is, which rules to create and enforce. Linguists often criticize those who create and propagate rules, as if such rules are bad simply as a result of their artificiality, but, once again, the truth is that all language is artificial; it doesn’t exist until we make it exist. And if we create it, why should we always be coolly dispassionate about it? Objectivity might be great in the scientific study of language, but why should language users approach language the same way? Why should we favor “natural” or “spontaneous” changes and yet disfavor more conscious changes?

This is something that Deborah Cameron addresses in her book Verbal Hygiene (which I highly, highly recommend)—the notion that “spontaneous” or “natural” changes are okay, while deliberate ones are meddlesome and should be resisted. As Cameron counters, “If you are going to make value judgements at all, then surely there are more important values than spontaneity. How about truth, beauty, logic, utility?” (1995, 20). Of course, linguists generally argue that an awful lot of prescriptions do nothing to create more truth, beauty, logic, or utility, and this is indeed a problem, in my opinion.

But when linguists debunk such spurious prescriptions, they miss something important: people want language advice from experts, and they’re certainly not getting it from linguists. The industry of bad language advice exists partly because the people who arguably know the most about how language really works—the linguists—aren’t at all interested in giving advice on language. Often they take the hands-off attitude exemplified in Robert Hall’s book Leave Your Language Alone, crying, “Linguistics is descriptive, not prescriptive!” But in doing so, linguists are nonetheless injecting themselves into the debate rather than simply observing how people use language. If an objective, hands-off approach is so valuable, then why don’t linguists really take their hands off and leave prescriptivists alone?

I think the answer is that there’s a lot of social value in following language rules, whether or not they are actually sensible. And linguists, being the experts in the field, don’t like ceding any social or intellectual authority to a bunch of people that they view as crackpots and petty tyrants. They chafe at the idea that such ill-informed, superstitious advice—what Language Log calls “prescriptivist poppycock”—can or should have any value at all. It puts informed language users in the position of having to decide whether to follow a stupid rule so as to avoid drawing the ire of some people or to break the rule and thereby look stupid to those people. Arnold Zwicky explores this conundrum in a post titled “Crazies Win.”

Note something interesting at the end of that post: Zwicky concludes by giving his own advice—his own prescription—regarding the issue of split infinitives. Is this a bad thing? No, not at all, because prescriptivism is not the enemy. As John Algeo said in an article in College English, “The problem is not that some of us have prescribed (we have all done so and continue to do so in one way or another); the trouble is that some of us have prescribed such nonsense” (“Linguistic Marys, Linguistic Marthas: The Scope of Language Study,” College English 31, no. 3 [December 1969]: 276). As I’ve said before, the nonsense is abundant. Just look at this awful Reader’s Digest column or this article on a Monster.com site for teachers for a couple recent examples.

Which brings me back to a point I’ve made before: linguists need to be more involved in not just educating the public about language, but in giving people the sensible advice they want. Trying to kill prescriptivism is not the answer to the language wars, and truly leaving language alone is probably a good way to end up with a dead language. Exploring it and trying to figure out how best to use it—this is what keeps language alive and thriving and interesting. And that’s good for prescriptivists and descriptivists alike.
