Arrant Pedantry


Fifty Shades of Bad Grammar Advice

A few weeks ago, the folks at the grammar-checking website Grammarly wrote a piece about supposed grammar mistakes in Fifty Shades of Grey. Despite being a runaway hit, the book has frequently been criticized for its terrible prose, and Grammarly apparently saw an opportunity to fix some of the book’s problems (and probably sell its grammar-checking services along the way).

The first problem, of course, is that most of the errors Grammarly identified have nothing to do with grammar. The second is that most of their edits not only fail to fix the clunky prose but actually make it worse.

Mark Allen already took Grammarly to task in a post on the Copyediting blog, saying that their edits “lack restraint”, that “the list is full of style choices and non-errors”, and that “it fails to make a case for the value of proofreading, and, by association, . . . reflects poorly on the craft of copyediting.” I agreed and thought at the time that nothing more needed to be said.

But then Grammarly decided to go even further. In this infographic, they claim to have found “similar gaffes” in the works of authors ranging from Nicholas Sparks to Shakespeare.

The first edit suggests that Nicholas Sparks needs a comma in the sentence “I am a common man with common thoughts and I’ve led a common life.” It’s true that this is a compound sentence, and such sentences typically require a comma between the two independent clauses. But The Chicago Manual of Style says that the comma can be omitted when the clauses are short and closely related. This isn’t an error so much as a style choice.

Incidentally, Grammarly says that “E. L. James is not the first author to include a comma in her work when a semi-colon would be more appropriate, or vice versa.” But the supposed error here isn’t that James used a comma when she should have used a semicolon; it’s that she didn’t use a comma at all. (Also note that “semicolon” is not spelled with a hyphen and that the comma before “or vice versa” is not necessary.)

Error number 2 is comma misuse (which is somehow different from error number 1, which is also comma misuse). Grammarly says, “Many writers forget to include a comma when one is necessary, or include a comma when it is not necessary.” (By the way, the comma before “or include a comma when it is not necessary” is not necessary.) The supposed offender here is Hemingway, who wrote, “We would be together and have our books and at night be warm in bed together with the windows open and the stars bright.” Grammarly suggests putting a comma after “at night”, but that would be a mistake.

The sentence has a compound predicate with three verb phrases strung together with ands. Hemingway says that “We would (1) be together and (2) have our books and (3) at night be warm in bed together with the windows open and the stars bright.” You don’t need a comma between the parts of a compound predicate, and if you want to set off the phrase “at night”, then you need commas on both sides: “We would be together and have our books and, at night, be warm in bed together with the windows open and the stars bright.” But that destroys the rhythm of the sentence and interferes with Hemingway’s signature style.

Error number 3 is wordiness, and the offender is Edith Wharton, who wrote, “Each time you happen to me all over again.” Grammarly suggests axing “all over”, leaving “Each time you happen to me again”. But this edit doesn’t fix a wordy sentence so much as it kills its emphasis. This is dialogue; shouldn’t dialogue sound like the way people talk?

Error number 4, colloquialisms, is not even an error by Grammarly’s own admission—it’s a stylistic choice. And choosing to use colloquialisms—more particularly, contractions—is a perfectly valid stylistic choice in fiction, especially in dialogue. Changing “doesn’t sound very exciting” to “it does not sound very exciting” is probably fine if you’re editing dialogue for Data from Star Trek, but it just isn’t how normal people talk.

The next error, commonly confused words, is a bit of a head-scratcher. Here Grammarly fingers F. Scott Fitzgerald for writing “to-night” rather than “tonight”. But this has nothing to do with confused words, because they’re the same word. To-night was the more common spelling until the 1930s, when the unhyphenated tonight surpassed it. This is not an error at all, let alone an error involving commonly confused words.

The sixth error, sentence fragments, is again debatable, and Grammarly even acknowledges that using fragments “is one way to emphasize an idea.” Once again, Grammarly says that it’s a style choice that for some reason you should never make. The Chicago Manual of Style, on the other hand, rightly acknowledges that the proscription against sentence fragments has “no historical or grammatical foundation.”

Error number 7 is another puzzler. They say that determiners “help writers to be specific about what they are talking about.” Then they say that Boris Pasternak should have written “sent down to the earth” rather than “sent down to earth” in Doctor Zhivago. Where on the earth did they get that idea? Not only is “down to earth” far more common in writing, but there’s nothing unclear about it. Adding the “the” doesn’t solve any problem because there is no problem here. Incidentally, they say the error has to do with determiners, but they’re really talking about articles—a, an, and the. Articles are simply one type of determiner, a category that also includes possessive determiners, demonstratives, and quantifiers.

I’ll skip error number 8 for the moment and go to number 9, the passive voice. Again they note the passive voice is a stylistic choice and not a grammatical error, and then they edit it out anyway. In place of Mr. Darcy’s “My feelings will not be repressed” we now have “I will not repress my feelings.” Grammarly claims that the passive can cause “a lack of clarity in your writing”, but what is unclear about this line? Is anyone confused about it in the slightest? Instead of added clarity, we get a ham-fisted edit that shifts the focus from where it should be—the feelings—onto Mr. Darcy himself. This is exactly the sort of sentence that calls for the passive voice.

The eighth error is probably the most infuriating because it gets so many things wrong. Here they take Shakespeare himself to task over his supposed preposition misuse. They say that in The Tempest, Shakespeare should have written “such stuff on which dreams are made on” rather than “such stuff as dreams are made on”. The first problem with Grammarly’s correction is that it doubles the preposition “on”, creating a grammatical problem rather than fixing it.

The second problem with this correction is that which can’t be used as a relative pronoun referring to such—only as can do that. Their fix is not just awkward but doubly ungrammatical.

The third is that it simply ruins the meter of the line. Remember that Shakespeare often wrote in a meter called iambic pentameter, which means that each foot contains two syllables with stress on the second syllable and that there are five feet per line. Here’s the sentence from The Tempest:

We are such stuff
As dreams are made on, and our little life
Is rounded with a sleep.

(Note that these aren’t full lines because I’m omitting the text from surrounding sentences that make up part of the first and third lines.) Pay attention to the rhythm of those lines.

we ARE such STUFF
as DREAMS are MADE on, AND our LITtle LIFE
is ROUNDed WITH a SLEEP.

Now compare Grammarly’s fix:

we ARE such STUFF
on WHICH dreams ARE made ON and OUR littLE life

The second line has too many syllables, and the stresses have all shifted. Shakespeare’s line puts most of the stresses on nouns and verbs, while Grammarly’s fix puts them mostly on function words—pronouns, prepositions, determiners—and, maybe worst of all, on the second syllable of “little”. They have taken lines from one of the greatest writers in all of English history and turned them into ungrammatical doggerel. It takes some nerve to edit the Bard; it apparently takes sheer blinkered idiocy to edit him so badly.
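You can even verify the syllable-count complaint mechanically. Here’s a toy sketch—a naive vowel-group counter, not a real scansion tool, and it will miss plenty of English edge cases—that’s good enough to show that Grammarly’s version of the line runs a syllable long:

```python
import re

def count_syllables(word):
    """Naive syllable estimate: count vowel groups, and treat a final
    silent "e" (but not a syllabic "-le", as in "little") as non-syllabic."""
    w = word.lower()
    n = len(re.findall(r"[aeiouy]+", w))
    if n > 1 and w.endswith("e") and not w.endswith("le"):
        n -= 1
    return n

def line_syllables(line):
    """Total estimated syllables in a line, ignoring punctuation."""
    return sum(count_syllables(w) for w in re.findall(r"[a-z]+", line.lower()))

shakespeare = "As dreams are made on, and our little life"
grammarly = "On which dreams are made on and our little life"
print(line_syllables(shakespeare))  # 10 — a proper pentameter line
print(line_syllables(grammarly))    # 11 — one syllable too many
```

The heuristic happens to count both lines correctly, but don’t trust it on words like “poem” or “fire”; it only exists to make the metrical point concrete.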

So, just to recap, that’s nine supposed grammatical errors that Grammarly says will ruin your prose, most of which are not errors and have nothing to do with grammar. Their suggested fixes, on the other hand, sometimes introduce grammatical errors and always worsen the writing. The takeaway from all of this is not, as Grammarly says, that love conquers all, but rather that Grammarly doesn’t know the first thing about grammar, let alone good writing.

Addendum: I decided to stop giving Grammarly such a bad time and help them out by editing their infographic pro bono.


Why Is It “Woe Is Me”?

I recently received an email asking about the expression woe is me, namely what the plural would be and why it’s not woe am I. Though the phrase may strike modern speakers as bizarre if not downright ungrammatical, there’s actually a fairly straightforward explanation: it’s an archaic dative expression. Strange as it may seem, the correct form really is woe is me, not woe am I or woe is I, and the first-person plural would simply be woe is us. I’ll explain why.

Today English only has three cases—nominative (or subjective), objective, and genitive (or possessive)—and these cases only apply to personal pronouns and who. Old English, on the other hand, had four cases (and vestiges of a fifth), and they applied to all nouns, pronouns, and adjectives. Among these four were two different cases for objects: accusative and dative. (The forms that we now think of simply as object pronouns actually descend from the dative pronouns, though they now cover the functions of both the accusative and dative.) These correspond roughly to direct and indirect objects, respectively, though they could be used in other ways too.

For instance, some prepositions took accusative objects, and some took dative objects (and some took either depending on the meaning). Nouns and pronouns in the accusative and dative cases could also be used in ways that seem strange to modern speakers. The dative, for example, could be used in places where we would normally use to and a pronoun. In some constructions we still have the choice between a pronoun or to and a pronoun—think of how you can say either I gave her the ball or I gave the ball to her—but in Old English you could do this to a much greater degree.

In the phrase woe is me, woe is the subject and me is a dative object, something that isn’t allowed in English today. It really means woe is to me. Today the phrase woe is me is pretty fixed, but some past variations on the phrase make the meaning a little clearer. Sometimes it was used with a verb, and sometimes woe was simply followed by a noun or prepositional phrase. In the King James Bible, we find “If I be wicked, woe unto me” (Job 10:15). One example from Old English reads, “Wa biþ þonne þæm mannum” (woe be then [to] those men).

So “woe is I” is not simply a fancy or archaic way of saying “I am woe” and is thus not parallel to constructions like “it is I”, where the nominative form is usually prescribed and the objective form is proscribed. In “woe is me”, “me” is not a subject complement (also known as a predicative complement) but a type of dative construction.

Thus the singular is is always correct, because it agrees with the singular mass noun woe. And though we don’t have distinct dative pronouns anymore, you can still use any pronoun in the object case, so woe is us would also be correct.

Addendum: Arika Okrent, writing at Mental Floss, has also just posted a piece on this construction. She goes into a little more detail on related constructions in English, German, and Yiddish.

And here are a couple of articles by Jan Freeman from 2007, specifically addressing Patricia O’Conner’s Woe Is I and a column by William Safire on the phrase:

Woe Is Us, Part 1
Woe Is Us, Continued


On Visual Thesaurus: “Clear and/or Unclear”

And/or is a surprisingly contentious little conjunction. Some lawyers love it, but most editors hate it—and many judges hate it too. Find out what the problem is in my newest post on Visual Thesaurus, “Clear and/or Unclear”.


New Post on Visual Thesaurus: Less Usage Problems

I have a new post on Visual Thesaurus, and this one’s open to non-subscribers:

The distinction between less and fewer is one of the most popular rules in the peevers’ arsenal. It’s a staple of lists of grammar rules that everyone supposedly gets wrong, and sticklers have pressured stores into changing their signs from “10 items or less” to “10 items or fewer.” Students have it drilled into their heads that fewer is for things you can count while less is for things you can’t. But there’s a problem: the rule as it’s commonly taught is wrong, and it’s dulling our sense of what’s actually right.

Go here to read the rest.


Do Usage Debates Make You Nauseous?

Several days ago, the Twitter account for the Chicago Manual of Style tweeted, “If you’re feeling sick, use nauseated rather than nauseous. Despite common usage, whatever is nauseous induces nausea.” The relevant entry in Chicago reads,

Whatever is nauseous induces a feeling of nausea—it makes us feel sick to our stomachs. To feel sick is to be nauseated. The use of nauseous to mean nauseated may be too common to be called error anymore, but strictly speaking it is poor usage. Because of the ambiguity of nauseous, the wisest course may be to stick to the participial adjectives nauseated and nauseating.

Though it seems like a straightforward usage tip, it’s based on some dubious motives and one rather strange assumption about language. It’s true that nauseous once meant causing nausea and that it has more recently acquired the sense of having nausea, but causing nausea wasn’t even the word’s original meaning in English. The word was first recorded in the early 17th century in the sense of inclined to nausea or squeamish. So you were nauseous not if you felt sick at the moment but if you had a sensitive stomach. This sense became obsolete in the late 17th century, supplanted by the causing nausea sense. The latter sense is the one that purists cling to, but it too is going obsolete.

I searched for nauseous in the Corpus of Contemporary American English and looked at the first 100 hits. Of those 100 hits, only one was used in the sense of causing nausea: “the nauseous tints and tinges of corruption.” The rest were all clearly used in the sense of having nausea—“I was nauseous” and “it might make you feel a little nauseous” and so on. Context is key: when nauseous is used with people, it means that they feel sick, but when it’s used with things, it means they’re sickening. And anyway, if nauseous is ambiguous, then every word with multiple meanings is ambiguous, including the word word, which has eleven main definitions as a noun in Merriam-Webster’s Collegiate. So where’s this ambiguity that Chicago warns of?
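The context rule described above—people feel nauseous, things are nauseous—is simple enough to sketch in code. This is a hypothetical toy classifier, not how a corpus like COCA actually tags senses, and the heuristics (a determiner before the adjective signals attributive use; a personal pronoun earlier in the clause signals a human subject) will fail on plenty of real sentences:

```python
import re

# Personal pronouns that suggest a human subject, and hence "having nausea".
PERSON_WORDS = {"i", "you", "he", "she", "we", "they", "me", "him", "her", "us", "them"}

def classify_nauseous(sentence):
    """Crude sketch of the post's context rule: attributive uses modifying
    a thing ("the nauseous tints") get "causing nausea"; predicative uses
    with a person subject ("I was nauseous") get "having nausea"."""
    words = re.findall(r"[a-z']+", sentence.lower())
    if "nauseous" not in words:
        return None
    i = words.index("nauseous")
    # A determiner right before plus a following word suggests attributive
    # use modifying a noun: the word describes a thing.
    if i > 0 and words[i - 1] in {"the", "a", "an"} and i + 1 < len(words):
        return "causing nausea"
    # Otherwise, a person-like subject earlier in the clause suggests
    # the "feeling sick" sense.
    if any(w in PERSON_WORDS for w in words[:i]):
        return "having nausea"
    return "causing nausea"

print(classify_nauseous("I was nauseous"))                            # having nausea
print(classify_nauseous("the nauseous tints and tinges of corruption"))  # causing nausea
```

On examples like the ones quoted from the corpus, even this crude rule sorts the two senses correctly—which is the point: context disambiguates so reliably that readers almost never have to think about it.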

The answer is that there really isn’t any. In this case it’s nothing more than a red herring. Perhaps it’s possible to concoct a sentence that, lacking sufficient context, is truly ambiguous. But the corpus search shows that it just isn’t a problem, and thus fear of ambiguity can’t be the real reason for avoiding nauseous. Warnings of ambiguity are often used not to call attention to a real problem but to signal that a word has at least two senses or uses and that the author does not like one of them. Bryan Garner (the author of the above entry from Chicago), in his Modern American Usage, frequently warns of such “skunked” words and usually recommends avoiding them altogether. This may seem like sensible advice, but it seems to me to be motivated by a sense of jealousy—if the word can’t mean what the advice-giver wants it to mean, then no one can use it.

But the truly strange assumption is that words have meaning that is somehow independent of their usage. If 99 percent of the population uses nauseous in the sense of having nausea, then who’s to say that they’re wrong? Who has the authority to declare this sense “poor usage”? And yet Garner says, rather unequivocally, “Whatever is nauseous induces a feeling of nausea.” How does he know this is what nauseous means? It’s not as if there is some platonic form of words, some objective true meaning from which a word must never stray. After all, language changes, and an earlier form is not necessarily better or truer than a newer one. As Merriam-Webster editor Kory Stamper recently pointed out on Twitter, stew once meant “whorehouse”, and this sense dates to the 1300s. The food sense arose four hundred years later, in the 1700s. Is this poor usage because it’s a relative upstart supplanting an older established sense? Of course not.

People stopped using nauseous to mean “inclined to nausea” several hundred years ago, and so it no longer means that. Similarly, most people no longer use nauseous to mean “causing nausea”, and so that meaning is waning. In another hundred years, it may be gone altogether. For now, it hangs on, but this doesn’t mean that the newer and overwhelmingly more common sense is poor usage. The new sense is only poor usage inasmuch as someone says it is. In other words, it all comes down to someone’s opinion. As I’ve said before, pronouncements on usage that are based simply on someone’s opinion are ultimately unreliable, and any standard that doesn’t take into account near-universal usage by educated speakers in edited writing is doomed to irrelevance.

So go ahead and use nauseous. The “having nausea” sense is now thoroughly established, and it seems silly to avoid a perfectly good word just because a few peevers dislike it. Even if you stick to the more traditional “causing nausea” sense, you’re unlikely to confuse anyone, because context will make the meaning clear. Just be careful about people who make unsupported claims about language.


Mother’s Day

Today is officially Mother’s Day, and as with other holidays with possessive or plural endings, there’s a lot of confusion about what the correct form of the name is. The creator of Mother’s Day in the United States, Anna Jarvis, specifically stated that it should be a singular possessive to focus on individual mothers rather than mothers in general. But as sociolinguist Matt Gordon noted on Twitter, “that logic is quite peccable”; though it’s a nice sentiment, it’s grammatical nonsense.

English has a singular possessive and a plural possessive; it does not have a technically-plural-but-focusing-on-the-singular possessive. Though Jarvis may have wanted everyone to focus on their respective mothers, the fact is that it still celebrates all mothers. If I told you that tomorrow was Jonathon’s Day, you’d assume that it’s my day, not that it’s the day for all Jonathons but that they happen to be celebrating separately. That’s simply not how grammatical number works in English. If you have more than one thing, it’s plural, even if you’re considering those things individually.

This isn’t the only holiday that employs some grammatically suspect reasoning in its official spelling—Veterans Day officially has no apostrophe because the day doesn’t technically belong to veterans. But this is silly—apostrophes are used for lots of things beyond simple ownership.

It could be worse, though. The US Board on Geographic Names discourages possessives altogether, though it allows the possessive s without an apostrophe. The peak named for Pike is Pikes Peak, which is worse than grammatical nonsense—it’s an officially enshrined error. The worst part is that there isn’t even a reason given for this policy, though presumably it’s because they don’t want to indicate private ownership of geographical features. (Again, the apostrophe doesn’t necessarily show ownership.) But in this case you can’t even argue that Pikes is a plural attributive noun, because there’s only one Pike for whom the peak is named.

The sad truth is that the people in charge of deciding where or whether to put apostrophes in things don’t always have the best grasp of grammar, and they don’t always think to consult someone who does. But even if the grammar of Mother’s Day makes me roll my eyes, I can still appreciate the sentiment. In the end, arguing about the placement of an apostrophe is a quibble. What matters most is what the day really means. And this day is for you, Mom.


Over Has Always Meant More Than. Get Over It.

Last month, at the yearly conference of the American Copy Editors Society, the editors of the AP Stylebook announced that over in the sense of more than was now acceptable. For decades, newspaper copy editors had been changing constructions like over three hundred people to more than three hundred people; now, with a word from AP’s top editors, that rule was being abandoned.

According to Merriam-Webster editor Peter Sokolowski, who was in attendance, the announcement was met with gasps. Editors quickly took to Twitter and to blogs to express their approval or dismay. Some saw it as part of the dumbing-down of the language or as a tacit admission that newspapers no longer have the resources to maintain their standards. Others saw it as the banishment of a baseless superstition that has wasted copy editors’ time without improving the text.

The argument had been that over must refer to spatial relationships and that numerical relationships must use more than. But nobody objects to other figurative uses of over, such as over the weekend or get over it or in over your head or what’s come over you? The rule forbidding the use of over to mean more than was first codified in the 1800s, but over can be found in this sense going back a thousand years or more, in some of the earliest documents written in English.

Not only that, but parallel uses can be found in other Germanic languages, including German, Dutch, and Swedish. (Despite all its borrowings from French, Latin, and elsewhere, English is considered a Germanic language.) There’s nothing wrong with the German Kinder über 14 Jahre (children over 14 years) (to borrow an example from the Collins German-English Dictionary) or the Swedish Över femhundra kom (more than five hundred came). This means that this use of over actually predates English and must have been inherited from the common ancestor of all the Germanic languages, Proto-Germanic, some two thousand years ago.

Mignon Fogarty, aka Grammar Girl, wrote that “no rationale exists for the ‘over can’t mean more than’ rule.” And in a post on the Merriam-Webster Unabridged blog, Sokolowski gave his own debunking, concluding that “we just don’t need artificial rules that do not promote the goal of clarity.” But none of this was good enough for some people. AP’s announcement caused a rift in the editing staff at Mashable, who debated the rule on the lifestyle blog.

Alex Hazlett argued that the rule “was an arbitrary style decision that had nothing to do with grammar, defensible only by that rationale of last resort: tradition.” Megan Hess, though, took an emotional and hyperbolic tack, claiming that following rules like this prevents the world from slipping into “a Lord of the Flies-esque dystopia.” From there her argument quickly becomes circular: “The distinction is one that distinguishes clean, precise language and attention to detail — and serves as a hallmark of a proper journalism training.” In other words, editors should follow the rule because they’ve been trained to follow the rule, and the rule is simply a mark of clean copy. And how do you know the copy is clean? Because it follows rules like this. As Sokolowski says, this is nothing more than a shibboleth—the distinction serves no purpose other than to distinguish those in the know from everyone else.

It’s also a perfect example of a mumpsimus. The story goes that an illiterate priest in the Middle Ages had learned to recite the Latin Eucharist wrong: instead of sumpsimus (Latin for “we have taken”), he said mumpsimus, which is not a Latin word at all. When someone finally told him that he’d been saying it wrong and that it should be sumpsimus, he responded that he would not trade his old mumpsimus for this person’s new sumpsimus. He didn’t just refuse to change—he refused to recognize that he was wrong and had always been wrong.

But so what if everyone’s been using over this way for longer than the English language has existed? Just because everyone does it doesn’t mean it’s right, right? Well, technically, yes, but let’s flip the question around: what makes it wrong to use over to mean more than? The fact that the over-haters have had such an emotional reaction is telling. It’s surprisingly easy to talk yourself into hating a particular word or phrase and to start judging everyone who allegedly misuses it. And once you’ve developed a visceral reaction to a perceived misuse, it’s hard to be persuaded that your feelings aren’t justified.

We editors take a lot of pride in our attention to language—which usually means our attention to the usage and grammar rules that we’ve been taught—so it can seem like a personal affront to be told that we were wrong and have always been wrong. Not only that, but it can shake our faith in other rules. If we were wrong about this, what else might we have been wrong about? But perhaps rather than priding ourselves on following the rules, we should pride ourselves on mastering them, which means learning how to tell the good rules from the bad.

Learning that you were wrong simply means that now you’re right, and that can only be a good thing.


Now on Visual Thesaurus: “Electrocution: A Shocking Misuse?”

I have a new post up on Visual Thesaurus about the use, misuse, and history of the word electrocute. Some usage commentators today insist that it be used only to refer to death by electric shock; that is, you can’t say you’ve been electrocuted if you lived to tell the tale. But the history, unsurprisingly, is more complicated: there have been disputes about the word since its birth.

As always, the article is for subscribers only, but a subscription costs a paltry $2.95 a month or $19.95 a year (and would make a great gift for the word lover in your life). Check it out.


Yes, Irregardless Is a Word

My last post, “12 Mistakes Nearly Everyone Who Writes about Grammar Mistakes Makes”, drew a lot of comments, some supportive and some critical. But no point drew as much ire as my claim that irregardless is a word. Some stated flatly, “Irregardless is not a word.” One ignorantly demanded, “Show me a dictionary that actually contains that word.” (I could show him several.) Still others argued that it was a double negative, that it was logically and morphologically ill-formed and thus had no meaning. One commenter said that “with the negating preface [prefix] ‘ir-’ and the negating suffix ‘-less’, it is a double negative” and that “it is not a synonym with ‘regardless’.” Another was even cleverer, saying, “The prefix ir-, meaning not, changes the meaning of the word regardless, so not only is it not a standard word, but it’s also misused in nearly all cases.” But these arguments still miss the point: irregardless is indeed a word, and it means the same thing as regardless.

In my last post I argued that there’s a clear difference between a word like irregardless and a nonword like flirgle. By any objective criterion, irregardless is a word. It has an established form and meaning, it’s used in speech and occasionally in writing, and it’s even found in reputable dictionaries, including Merriam-Webster’s Collegiate Dictionary and The Oxford English Dictionary (though it is, quite appropriately, labeled nonstandard). We can identify its part of speech (it’s an adverb) and describe how it’s used. By contrast, though, consider flirgle. You don’t know what its part of speech is or how to use it, and if I were to use it in a sentence, you wouldn’t know what it meant. This is because it’s just something I made up by stringing some sounds together. But when someone uses irregardless, you know exactly what it means, even if you want to pretend otherwise.

This is because words get their wordhood not from etymology or logic or some cultural institution granting them official status, but by convention. It doesn’t matter that nice originally meant “ignorant” or that contact was originally only a noun or that television is formed from a blend of Greek and Latin roots; what matters is how people use these words now. This makes some people uncomfortable because it sounds like anarchy, but it’s more like the ultimate democracy or free market. We all want to understand one another and be understood, so it’s in our mutual interest to communicate in ways that are understandable. Language is a self-regulating system guided by the invisible hand of its users’ desire to communicate—not that this stops people from feeling the need for overt regulation.

One commenter, the same who said, “Irregardless is not a word,” noted rather aptly, “There is absolutely no value to ‘irregardless’ except to recognize people who didn’t study.” Exactly. There is nothing wrong with its ability to communicate; it’s only the word’s metacommunication—that is, what it communicates about its user—that is problematic. To put it a different way, the problem with irregardless is entirely social: if you use it, you’ll be thought of as uneducated, even though everyone can understand you just fine.

On Google Plus, my friend Rivka said, “Accepting it as a word is the first part of the slippery slope.” This seems like a valid fear, but I believe it is misplaced. First of all, we need to be clear about what it means to accept irregardless as a word. I accept that it’s a word, but this does not mean that I find the word acceptable. I can accept that people do all kinds of things that I don’t like. But the real problem isn’t what we mean by accept; it’s what we mean by word. When people say that something isn’t a word, they aren’t really making a testable claim about the objective linguistic status of the word; they’re making a sociolinguistic evaluation of the word. They may say that it’s not a word, but they really mean that it’s a word that’s not allowed in Standard English. This is because we think of Standard English as the only legitimate form of English. We think that the standard has words and grammar, while nonstandard dialects have nonwords and broken grammar, or no grammar at all. Yes, it’s important to recognize and teach the difference between Standard English and nonstandard forms, but it’s also important to be clear about the difference between facts about the language and our feelings about the language.

But the irregardless-haters can also take heart: the word has been around for at least a century now, and although many other new words have been coined and become part of Standard English in that time, irregardless shows no signs of moving towards acceptability. Most people who write for publication are well aware of the stigma attached to it, and even if they aren’t, few copyeditors are willing to let it into print. It’s telling that of the Oxford English Dictionary’s eight citations of the word, two merely cite the word in other dictionaries, three more are mentions or citations in linguistics or literary journals, and one more appears to be using the word ironically. We talk about the word irregardless—mostly just to complain about it—far more than we actually use it.

So yes, irregardless is a word, even though it’s nonstandard. You don’t have to like it, and you certainly don’t have to use it, but you also don’t have to worry about it becoming acceptable anytime soon.

This post also appears on Huffington Post.


12 Mistakes Nearly Everyone Who Writes About Grammar Mistakes Makes

There are a lot of bad grammar posts in the world. These days, anyone with a blog and a bunch of pet peeves can crank out a click-bait listicle of supposed grammar errors. There’s just one problem—these articles are often full of mistakes of one sort or another themselves. Once you’ve read a few, you start noticing some patterns. Inspired by a recent post titled “Grammar Police: Twelve Mistakes Nearly Everyone Makes”, I decided to make a list of my own.

1. Confusing grammar with spelling, punctuation, and usage. Many people who write about grammar seem to think that grammar means “any sort of rule of language, especially writing”. But strictly speaking, grammar refers to the structural rules of language, namely morphology (basically the way words are formed from roots and affixes), phonology (the system of sounds in a language), and syntax (the way phrases and clauses are formed from words). Most complaints about grammar are really about punctuation, spelling (such as problems with you’re/your and other homophone confusion), or usage (which is often about semantics). This post, for instance, spends two of its twelve points on commas and a third on quotation marks.

2. Treating style choices as rules. This article says that you should always use an Oxford (or serial) comma (the comma before and or or in a list) and that quotation marks should always follow commas and periods, but the latter is true only in most American styles (linguists often put the commas and periods outside quotes, and so do many non-American styles), and the former is only true of some American styles. I may prefer serial commas, but I’m not going to insist that everyone who doesn’t use them is making a mistake. It’s simply a matter of style, and style varies from one publisher to the next.

3. Ignoring register. There’s a time and a place for following the rules, but the writers of these lists typically treat English as though it had only one register: formal writing. They ignore the fact that following the rules in the wrong setting often sounds stuffy and stilted. Formal written English is not the only legitimate form of the language, and the rules of formal written English don’t apply in all situations. Sure, it’s useful to know when to use who and whom, but it’s probably more useful to know that saying To whom did you give the book? in casual conversation will make you sound like a pompous twit.

4. Saying that a disliked word isn’t a word. You may hate irregardless (I do), but that doesn’t mean it’s not a word. If it has its own meaning and you can use it in a sentence, guess what—it’s a word. Flirgle, on the other hand, is not a word—it’s just a bunch of sounds that I strung together in word-like fashion. Irregardless and its ilk may not be appropriate for use in formal registers, and you certainly don’t have to like them, but as Stan Carey says, “‘Not a word’ is not an argument.”

5. Turning proposals into ironclad laws. This one happens more often than you’d think. A great many rules of grammar and usage started life as proposals that became codified as inviolable laws over the years. The popular that/which rule, which I’ve discussed at length before, began as a proposal—not “everyone gets this wrong” but “wouldn’t it be nice if we made a distinction here?” But nowadays people have forgotten that a century or so ago, this rule simply didn’t exist, and they say things like “This is one of the most common mistakes out there, and understandably so.” (Actually, no, you don’t understand why everyone gets this “wrong”, because you don’t realize that this rule is a relatively recent invention by usage commentators that some copyeditors and others have decided to enforce.) It’s easy to criticize people for not following rules that you’ve made up.

6. Failing to discuss exceptions to rules. Invented usage rules often ignore the complexities of actual usage. Lists of rules such as these go a step further and often ignore the complexities of those rules. For example, even if you follow the that/which rule, you need to know that you can’t use that after a preposition or after the demonstrative pronoun that—you have to use a restrictive which. Likewise, the less/fewer rule is usually reduced to statements like “use fewer for things you can count”, which leads to ugly and unidiomatic constructions like “one fewer thing to worry about”. Affect and effect aren’t as simple as some people make them out to be, either; affect is usually a verb and effect a noun, but affect can also be a noun (with stress on the first syllable) referring to the outward manifestation of emotions, while effect can be a verb meaning to cause or to make happen. Sometimes dumbing down rules just makes them dumb.

7. Overestimating the frequency of errors. The writer of this list says that misuse of nauseous is “Undoubtedly the most common mistake I encounter.” This claim seems worth doubting to me; I can’t remember the last time I heard someone say “nauseous”. Even if you consider it a misuse, it’s got to rate pretty far down the list in terms of frequency. This is why linguists like to rely on data for testable claims—because people tend to fall prey to all kinds of cognitive biases such as the frequency illusion.

8. Believing that etymology is destiny. Words change meaning all the time—it’s just a natural and inevitable part of language. But some people get fixated on the original meanings of some words and believe that those are the only correct meanings. For example, they’ll say that you can only use decimate to mean “to destroy one in ten”. This may seem like a reasonable argument, but it quickly becomes untenable when you realize that almost every single word in the language has changed meaning at some point, and that’s just in the few thousand years in which language has been written or can be reconstructed. And sometimes a new meaning is more useful anyway (which is precisely why it displaced an old meaning). As Jan Freeman said, “We don’t especially need a term that means ‘kill one in 10.’”

9. Simply bungling the rules. If you’re going to chastise people for not following the rules, you should know those rules yourself and be able to explain them clearly. You may dislike singular they, for instance, but you should know that it’s not a case of subject-predicate disagreement, as the author of this list claims—it’s an issue of pronoun-antecedent agreement, which is not the same thing. This list says that “‘less’ is reserved for hypothetical quantities”, but this isn’t true either; it’s reserved for noncount nouns, singular count nouns, and plural count nouns that aren’t generally thought of as discrete entities. Use of less has nothing to do with being hypothetical. And this one says that punctuation always goes inside quotation marks. In most American styles, it’s only commas and periods that always go inside. Colons, semicolons, and dashes always go outside, and question marks and exclamation marks only go inside sometimes.

10. Saying that good grammar leads to good communication. Contrary to popular belief, bad grammar (even using the broad definition that includes usage, spelling, and punctuation) is not usually an impediment to communication. A sentence like Ain’t nobody got time for that is quite intelligible, even though it violates several rules of Standard English. The grammar and usage of nonstandard varieties of English are often radically different from Standard English, but different does not mean worse or less able to communicate. The biggest differences between Standard English and all its nonstandard varieties are that the former has been codified and that it is used in all registers, from casual conversation to formal writing. Many of the rules that these lists propagate are really more about signaling to the grammatical elite that you’re one of them—not that this is a bad thing, of course, but let’s not mistake it for something it’s not. In fact, claims about improving communication are often just a cover for the real purpose of these lists, which is . . .

11. Using grammar to put people down. This post sympathizes with someone who worries about being crucified by the grammar police and then says a few paragraphs later, “All hail the grammar police!” In other words, we like being able to crucify those who make mistakes. Then there are the put-downs about people’s education (“You’d think everyone learned this rule in fourth grade”) and more outright insults (“5 Grammar Mistakes that Make You Sound Like a Chimp”). After all, what’s the point in signaling that you’re one of the grammatical elite if you can’t take a few potshots at the ignorant masses?

12. Forgetting that correct usage ultimately comes from users. The disdain for the usage of common people is symptomatic of a larger problem: forgetting that correct usage ultimately comes from the people, not from editors, English teachers, or usage commentators. You’re certainly entitled to have your opinion about usage, but at some point you have to recognize that trying to fight the masses on a particular point of usage (especially if it’s a made-up rule) is like trying to fight the rising tide. Those who have invested in learning the rules naturally feel defensive of them and of the language in general, but you have no more right to the language than anyone else. You can be restrictive if you want and say that Standard English is based on the formal usage of educated writers, but any standard that is based on a set of rules that are simply invented and passed down is ultimately untenable.

And a bonus mistake:

13. Making mistakes themselves. It happens to the best of us. The act of making grammar or spelling mistakes in the course of pointing out someone else’s mistakes even has a name, Muphry’s law. This post probably has its fair share of typos. (If you spot one, feel free to point it out—politely!—in the comments.)

This post also appears on Huffington Post.