Arrant Pedantry


Language, Logic, and Correctness

In “Why Descriptivists Are Usage Liberals”, I said that there are some logical problems with declaring something to be right or wrong based on evidence. A while back I explored this problem in a piece titled “What Makes It Right?” over on Visual Thesaurus.

The terms prescriptive and descriptive were borrowed from philosophy, where they are used to talk about ethics, and the tension between these two approaches is reflected in language debates today. The questions we have today about correct usage are essentially the same questions philosophers have been debating since the days of Socrates and Plato: what is right, and how do we know?

As I said on Visual Thesaurus, all attempts to answer these questions run into a fundamental logical problem: just because something is doesn’t mean it ought to be. Most people are uncomfortable with the idea of moral relativism and believe at some level that there must be some kind of objective truth. Unfortunately, it’s not entirely clear just where we find this truth or how objective it really is, but we at least operate under the convenient assumption that it exists.

But things get even murkier when we try to apply this same assumption to language. While we may feel safe saying that murder is wrong and would still be wrong even if a significant portion of the population committed murder, we can’t safely make similar arguments about language. Consider the word bird. In Old English, the form of English spoken from about 500 AD to about 1100 AD, the word was brid. Bird began as a dialectal variant that spread and eventually supplanted brid as the standard form by about 1600. Have we all been saying this word wrong for the last four hundred years or so? Is saying bird just as wrong as pronouncing nuclear as nucular?

No, of course not. Even if it had been considered an error once upon a time, it’s not an error anymore. Its widespread use in Standard English has made it standard, while brid would now be considered an error (if someone were to actually use it). There is no objectively correct form of the word that exists independent of its use. That is, there is no platonic form of the language, no linguistic Good to which a grammarian-king can look for guidance in guarding the city.

This is why linguistics is at its core an empirical endeavor. Linguists concern themselves with investigating linguistic facts, not with making value judgements about what should be considered correct or incorrect. As I’ve said before, there are no first principles from which we can determine what’s right and wrong. Take, for example, the argument that you should use the nominative form of pronouns after a copula verb. Thus you should say It is I rather than It is me. But this argument assumes as prior the premise that copula verbs work this way and then deduces that anything that doesn’t work this way is wrong. Where would such a putative rule come from, and how do we know it’s valid?

Linguists often try to highlight the problems with such assumptions by pointing out, for example, that French requires an object pronoun after the copula (in French you say c’est moi [it’s me], not c’est je [it’s I]) or that English speakers, including renowned writers, have long used object forms in this position. That is, there is no reason to suppose that this rule has to exist, because there are clear counterexamples. But then, as I said before, some linguists leave the realm of strict logic and argue that if everyone says it’s me, then it must be correct.

Some people then counter by calling this argument fallacious, and strictly speaking, it is. Mededitor has called this the Jane Austen fallacy (if Jane Austen or some other notable past writer has done it, then it must be okay), and one commenter named Kevin S. has made similar arguments in the comments on Kory Stamper’s blog, Harmless Drudgery.

There, Kevin S. attacked Ms. Stamper for noting that using lay in place of lie dates at least to the days of Chaucer, that it is very common, and that it “hasn’t managed to destroy civilization yet.” These are all objective facts, yet Kevin S. must have assumed that Ms. Stamper was arguing that if it’s old and common, it must be correct. In fact, she acknowledged that it is nonstandard and didn’t try to argue that it wasn’t or shouldn’t be. But Kevin S. pointed out a few fallacies in the argument that he assumed that Ms. Stamper was making: an appeal to authority (if Chaucer did it, it must be okay), the “OED fallacy” (if it has been used that way in the past, it must be correct), and the naturalistic fallacy, which is deriving an ought from an is (lay for lie is common; therefore it ought to be acceptable).

And as much as I hate to say it, technically, Kevin S. is right. Even though he was responding to an argument that hadn’t been made, linguists and lexicographers do frequently make such arguments, and they are in fact fallacies. (I’m sure I’ve made such arguments myself.) Technically, any argument that something should be considered correct or incorrect isn’t a logical argument but a persuasive one. Again, this goes back to the basic difference between descriptivism and prescriptivism. We can make statements about the way English appears to work, but making statements about the way English should work or the way we think people should feel about it is another matter.

It’s not really clear what Kevin S.’s point was, though, because he seemed to be most bothered by Ms. Stamper’s supposed support of some sort of flabby linguistic relativism. But his own implied argument collapses in a heap of fallacies itself. Just as we can’t necessarily call something correct just because it occurred in history or because it’s widespread, we can’t necessarily call something incorrect just because someone invented a rule saying so.

I could invent a rule saying that you shouldn’t ever use the word sofa because we already have the perfectly good word couch, but you would probably roll your eyes and say that’s stupid because there’s nothing wrong with the word sofa. Yet we give heed to a whole bunch of similarly arbitrary rules invented two or three hundred years ago. Why? Technically, they’re no more valid or logically sound than my rule.

So if there really is such a thing as correctness in language, and if any argument about what should be considered correct or incorrect is technically a logical fallacy, then how can we arrive at any sort of understanding of, let alone agreement on, what’s correct?

This fundamental inability to argue logically about language is a serious problem, and it’s one that nobody has managed to solve or, in my opinion, ever will completely solve. This is why the war of the scriptivists rages on with no end in sight. We see the logical fallacies in our opponents’ arguments and the flawed assumptions underlying them, but we don’t acknowledge—or sometimes even see—the problems with our own. Even if we did, what could we do about them?

My best attempt at an answer is that both sides simply have to learn from each other. Language is a democracy, true, but, just like the American government, it is not a pure democracy. Some people—including editors, writers, English teachers, and usage commentators—have a disproportionate amount of influence. Their opinions carry more weight because people care what they think.

This may be inherently elitist, but it is not necessarily a bad thing. We naturally trust the opinions of those who know the most about a subject. If your car won’t start, you take it to a mechanic. If your tooth hurts, you go to the dentist. If your writing has problems, you ask an editor.

Granted, using lay for lie is not bad in the same sense that a dead starter motor or an abscessed tooth is bad: it’s a problem only in the sense that some judge it to be wrong. Using lay for lie is perfectly comprehensible, and it doesn’t violate some basic rule of English grammar such as word order. Furthermore, it won’t destroy the language. Just as we have pairs like lay and lie or sit and set, we used to have two words for hang, but nobody claims that we’ve lost a valuable distinction here by having one word for both transitive and intransitive uses.

Prescriptivists want you to know that people will judge you for your words (and—let’s be honest—usually they’re the ones doing the judging), and descriptivists want you to soften those judgements or even negate them by injecting them with a healthy dose of facts. That is, there are two potential fixes for the problem of using words or constructions that will cause people to judge you: stop using that word or construction, or get people to stop judging you and others for that use.

In reality, we all use both approaches, and, more importantly, we need both approaches. Even most dyed-in-the-wool prescriptivists will tell you that the rule banning split infinitives is bogus, and even most liberal descriptivists will acknowledge that if you want to be taken seriously, you need to use Standard English and avoid major errors. Problems occur when you take a completely one-sided approach, insisting either that something is an error even if almost everyone does it or that something isn’t an error even though almost everyone rejects it. In other words, good usage advice has to consider not only the facts of usage but speakers’ opinions about usage.

For instance, you can recognize that irregardless is a word, and you can even argue that there’s nothing technically wrong with it because nobody cares that the verbs bone and debone mean the same thing, but it would be irresponsible not to mention that the word is widely considered an error in educated speech and writing. Remember that words and constructions are not inherently correct or incorrect and that mere use does not necessarily make something correct; correctness is a judgement made by speakers of the language. This means that, paradoxically, something can be in widespread use even among educated speakers and can still be considered an error.

This also means that on some disputed items, there may never be anything approaching consensus. While the facts of usage may be indisputable, opinions may still be divided. Thus it’s not always easy or even possible to label something as simply correct or incorrect. Even if language is a democracy, there is no simple majority rule, no up-or-down vote to determine whether something is correct. Something may be only marginally acceptable or correct only in certain situations or according to certain people.

But as in a democracy, it is important for people to be informed before metaphorically casting their vote. Bryan Garner argues in his Modern American Usage that what people want in language advice is authority, and he’s certainly willing to give it to you. But I think what people really need is information. For example, you can state authoritatively that regardless of past or present usage, singular they is a grammatical error and always will be, but this is really an argument, not a statement of fact. And like all arguments, it should be supported with evidence. An argument based solely or primarily on one author’s opinion—or even on many people’s opinions—will always be a weaker argument than one that considers both facts and opinion.

This doesn’t mean that you have to accept every usage that’s supported by evidence, nor does it mean that all evidence is created equal. We’re all human, we all still have opinions, and sometimes those opinions are in defiance of facts. For example, between you and I may be common even in educated speech, but I will probably never accept it, let alone like it. But I should not pretend that my opinion is fact, that my arguments are logically foolproof, or that I have any special authority to declare it wrong. I think the linguist Thomas Pyles said it best:

Too many of us . . . would seem to believe in an ideal English language, God-given instead of shaped and molded by man, somewhere off in a sort of linguistic stratosphere—a language which nobody actually speaks or writes but toward whose ineffable standards all should aspire. Some of us, however, have in our worst moments suspected that writers of handbooks of so-called “standard English usage” really know no more about what the English language ought to be than those who use it effectively and sometimes beautifully. In truth, I long ago arrived at such a conclusion: frankly, I do not believe that anyone knows what the language ought to be. What most of the authors of handbooks do know is what they want English to be, which does not interest me in the least except as an indication of the love of some professors for absolute and final authority.1

In usage, as in so many other things, you have to learn to live with uncertainty.

Notes

1. “Linguistics and Pedagogy: The Need for Conciliation,” in Selected Essays on English Usage, ed. John Algeo (Gainesville: University Presses of Florida, 1979), 169–70.


Why Descriptivists Are Usage Liberals

Outside of linguistics, the people who care most about language tend to be prescriptivists—editors, writers, English teachers, and so on—while linguists and lexicographers are descriptivists. “Descriptive, not prescriptive!” is practically the linguists’ rallying cry. But we linguists have done a terrible job of explaining just what that means and why it matters. As I tried to explain in “What Descriptivism Is and Isn’t”, descriptivism is essentially just an interest in facts. That is, we make observations about what the language is rather than state opinions about how we’d like it to be.

Descriptivism is often cast as the opposite of prescriptivism, but they aren’t opposites at all. But no matter how many times we insist that “descriptivism isn’t ‘anything goes’”, people continue to believe that we’re all grammatical anarchists and linguistic relativists, declaring everything correct and saying that there’s no such thing as a grammatical error.

Part of the problem is that whenever you conceive of two approaches as opposing points of view, people will assume that they’re opposite in every regard. Prescriptivists generally believe that communication is important, that having a standard form of the language facilitates communication, and that we need to uphold the rules to maintain the standard. And what people often see is that linguists continually tear down the rules and say that they don’t really matter. The natural conclusion for many people is that linguists don’t care about maintaining the standard or supporting good communication—they want a linguistic free-for-all instead. Then descriptivists appear to be hypocrites for using the very standard they allegedly despise.

It’s true that many descriptivists oppose rules that they disagree with, but as I’ve said before, this isn’t really descriptivism—it’s anti-prescriptivism, for lack of a better term. (Not because it’s the opposite of prescriptivism, but because it often prescribes the opposite of what traditional linguistic prescriptivism does.) Just ask yourself how an anti-prescriptive sentiment like “There’s nothing wrong with singular they” is a description of linguistic fact.

So if that’s not descriptivism, then why do so many linguists have such liberal views on usage? What does being against traditional rules have to do with studying language? And how can linguists oppose rules and still be in favor of good communication and Standard English?

The answer, in a nutshell, is that we don’t think that the traditional rules have much to do with either good communication or Standard English. The reason why we think that is a little more complicated.

Linguists have had a hard time defining just what Standard English is, but there are several ideas that recur in attempts to define it. First, although Standard English can certainly be spoken, it is often conceived of as a written variety, especially in the minds of non-linguists. Second, it is generally more formal, making it appropriate for a wide range of serious topics. Third, it is educated, or rather, it is used by educated speakers. Fourth, it is supraregional, meaning that it is not tied to a specific region, as most dialects are, but that it can be used across an entire language area. And fifth, it is careful or edited. Notions of uniformity and prestige are often thrown into the mix as well.

Careful is a vague term, but it means that users of Standard English put some care into what they say or write. This is especially true of most published writing; the entire profession of editing is dedicated to putting care into the written word. So it’s tempting to say that following the rules is an important part of Standard English and that tearing down those rules tears down at least that part of Standard English.

But the more important point is that Standard English is ultimately rooted in the usage of actual speakers and writers. It’s not just that there’s no legislative body declaring what’s standard, but that there are no first principles from which we can deduce what’s standard. All languages are different, and they change over time, so how can we know what’s right or wrong except by looking at the evidence? This is what descriptivists try to do when discussing usage: look at the evidence from historical and current usage and draw meaningful conclusions about what’s right or wrong. (There are some logical problems with this, but I’ll address those another time.)

Let’s take singular they, for example. The evidence shows that it’s been in use for centuries not just by common folk or educated speakers but by well-respected writers from Geoffrey Chaucer to Jane Austen. The evidence also shows that it’s used in fairly predictable ways, generally to refer to indefinite pronouns or to nouns that don’t specify gender. Its use has not caused the grammar of English to collapse, and it seems like a rather felicitous solution to the gender-neutral pronoun problem. So at least from a dispassionate linguistic point of view, there is no problem with it.

From another point of view, though, there is something wrong with it: some people don’t like it. This is a social rather than a linguistic fact, but it’s a fact nonetheless. But this social fact arose because at some point someone declared—contrary to the linguistic facts—that singular they is a grammatical error that should be avoided. Here’s where descriptivists depart from description and get into anti-prescription. If people have been taught to dislike this usage, it stands to reason that they could be taught to get over this dislike.

That is, linguists are engaging in anti-prescriptivism to counter the prescriptivism that isn’t rooted in linguistic fact. So when they debunk or tear down traditional rules, it’s not that they don’t value Standard English or good communication; it’s that they think that those particular rules have nothing to do with either.

To be fair, I think that many linguists think they’re still merely describing when they’re countering prescriptive attitudes. Saying that singular they has been used for centuries by respected writers, that it appears to follow fairly well-defined rules, and that the proscription against it is not based in linguistic fact is descriptive; saying that people need to get over their dislike and accept it is not.

And this is precisely why I think descriptivism and prescriptivism not only can but should coexist. It’s not wrong to have opinions on what’s right or wrong, but I think it’s better if those opinions have some basis in fact. Guidance on issues of usage can really only be relevant and valid if it takes all the evidence into account—who uses a certain word or construction, in what circumstances, and so on. These are all facts that can be investigated, and linguistics provides a solid methodological framework for doing so. Anything that ignores the facts reduces to one sort of ipse dixit or another, either a statement from an authority declaring something to be right or wrong or one’s own preferences or pet peeves.

Linguists value good communication, and we recognize the importance of Standard English. But our opinions on both are informed by our study of language and by our emphasis on facts and evidence. This isn’t “anything goes”, or at least no more so than language has always been. People have always worried about language change, but language has always turned out fine. Inventing new rules to try to regulate language will not save it from destruction, and tossing out the rules that have no basis in fact will not hasten the language’s demise. But recognizing that some rules don’t matter may alleviate some of those worries, and I think that’s a good thing for both camps.


Yes, Irregardless Is a Word

My last post, “12 Mistakes Nearly Everyone Who Writes about Grammar Mistakes Makes”, drew a lot of comments, some supportive and some critical. But no point drew as much ire as my claim that irregardless is a word. Some stated flatly, “Irregardless is not a word.” One ignorantly demanded, “Show me a dictionary that actually contains that word.” (I could show him several.) Still others argued that it was a double negative, that it was logically and morphologically ill-formed and thus had no meaning. One commenter said that “with the negating preface [prefix] ‘ir-’ and the negating suffix ‘-less’, it is a double negative” and that “it is not a synonym with ‘regardless’.” Another was even cleverer, saying, “The prefix ir-, meaning not, changes the meaning of the word regardless, so not only is it not a standard word, but it’s also misused in nearly all cases.” But these arguments still miss the point: irregardless is indeed a word, and it means the same thing as regardless.

In my last post I argued that there’s a clear difference between a word like irregardless and a nonword like flirgle. By any objective criterion, irregardless is a word. It has an established form and meaning, it’s used in speech and occasionally in writing, and it’s even found in reputable dictionaries, including Merriam-Webster’s Collegiate Dictionary and The Oxford English Dictionary (though it is, quite appropriately, labeled nonstandard). We can identify its part of speech (it’s an adverb) and describe how it’s used. By contrast, though, consider flirgle. You don’t know what its part of speech is or how to use it, and if I were to use it in a sentence, you wouldn’t know what it meant. This is because it’s just something I made up by stringing some sounds together. But when someone uses irregardless, you know exactly what it means, even if you want to pretend otherwise.

This is because words get their wordhood not from etymology or logic or some cultural institution granting them official status, but by convention. It doesn’t matter that nice originally meant “ignorant” or that contact was originally only a noun or that television is formed from a blend of Greek and Latin roots; what matters is how people use these words now. This makes some people uncomfortable because it sounds like anarchy, but it’s more like the ultimate democracy or free market. We all want to understand one another and be understood, so it’s in our mutual interest to communicate in ways that are understandable. Language is a self-regulating system guided by the invisible hand of its users’ desire to communicate—not that this stops people from feeling the need for overt regulation.

One commenter, the same who said, “Irregardless is not a word,” noted rather aptly, “There is absolutely no value to ‘irregardless’ except to recognize people who didn’t study.” Exactly. There is nothing wrong with its ability to communicate; it’s only the word’s metacommunication—that is, what it communicates about its user—that is problematic. To put it a different way, the problem with irregardless is entirely social: if you use it, you’ll be thought of as uneducated, even though everyone can understand you just fine.

On Google Plus, my friend Rivka said, “Accepting it as a word is the first part of the slippery slope.” This seems like a valid fear, but I believe it is misplaced. First of all, we need to be clear about what it means to accept irregardless as a word. I accept that it’s a word, but this does not mean that I find the word acceptable. I can accept that people do all kinds of things that I don’t like. But the real problem isn’t what we mean by accept; it’s what we mean by word. When people say that something isn’t a word, they aren’t really making a testable claim about the objective linguistic status of the word; they’re making a sociolinguistic evaluation of the word. They may say that it’s not a word, but they really mean that it’s a word that’s not allowed in Standard English. This is because we think of Standard English as the only legitimate form of English. We think that the standard has words and grammar, while nonstandard dialects have nonwords and broken grammar, or no grammar at all. Yes, it’s important to recognize and teach the difference between Standard English and nonstandard forms, but it’s also important to be clear about the difference between facts about the language and our feelings about the language.

But the irregardless-haters can also take heart: the word has been around for at least a century now, and although many other new words have been coined and become part of Standard English in that time, irregardless shows no signs of moving towards acceptability. Most people who write for publication are well aware of the stigma attached to it, and even if they aren’t, few copyeditors are willing to let it into print. It’s telling that of the Oxford English Dictionary’s eight citations of the word, two merely cite the word in other dictionaries, three more are mentions or citations in linguistics or literary journals, and one more appears to be using the word ironically. We talk about the word irregardless—mostly just to complain about it—far more than we actually use it.

So yes, irregardless is a word, even though it’s nonstandard. You don’t have to like it, and you certainly don’t have to use it, but you also don’t have to worry about it becoming acceptable anytime soon.

This post also appears on Huffington Post.


12 Mistakes Nearly Everyone Who Writes About Grammar Mistakes Makes

There are a lot of bad grammar posts in the world. These days, anyone with a blog and a bunch of pet peeves can crank out a click-bait listicle of supposed grammar errors. There’s just one problem—these articles are often full of mistakes of one sort or another themselves. Once you’ve read a few, you start noticing some patterns. Inspired by a recent post titled “Grammar Police: Twelve Mistakes Nearly Everyone Makes”, I decided to make a list of my own.

1. Confusing grammar with spelling, punctuation, and usage. Many people who write about grammar seem to think that grammar means “any sort of rule of language, especially writing”. But strictly speaking, grammar refers to the structural rules of language, namely morphology (basically the way words are formed from roots and affixes), phonology (the system of sounds in a language), and syntax (the way phrases and clauses are formed from words). Most complaints about grammar are really about punctuation, spelling (such as problems with you’re/your and other homophone confusion), or usage (which is often about semantics). This post, for instance, spends two of its twelve points on commas and a third on quotation marks.

2. Treating style choices as rules. This article says that you should always use an Oxford (or serial) comma (the comma before and or or in a list) and that quotation marks should always follow commas and periods, but the latter is true only in most American styles (linguists often put the commas and periods outside quotes, and so do many non-American styles), and the former is only true of some American styles. I may prefer serial commas, but I’m not going to insist that everyone who doesn’t use them is making a mistake. It’s simply a matter of style, and style varies from one publisher to the next.

3. Ignoring register. There’s a time and a place for following the rules, but the writers of these lists typically treat English as though it had only one register: formal writing. They ignore the fact that following the rules in the wrong setting often sounds stuffy and stilted. Formal written English is not the only legitimate form of the language, and the rules of formal written English don’t apply in all situations. Sure, it’s useful to know when to use who and whom, but it’s probably more useful to know that saying To whom did you give the book? in casual conversation will make you sound like a pompous twit.

4. Saying that a disliked word isn’t a word. You may hate irregardless (I do), but that doesn’t mean it’s not a word. If it has its own meaning and you can use it in a sentence, guess what—it’s a word. Flirgle, on the other hand, is not a word—it’s just a bunch of sounds that I strung together in word-like fashion. Irregardless and its ilk may not be appropriate for use in formal registers, and you certainly don’t have to like them, but as Stan Carey says, “‘Not a word’ is not an argument.”

5. Turning proposals into ironclad laws. This one happens more often than you think. A great many rules of grammar and usage started life as proposals that became codified as inviolable laws over the years. The popular that/which rule, which I’ve discussed at length before, began as a proposal—not “everyone gets this wrong” but “wouldn’t it be nice if we made a distinction here?” But nowadays people have forgotten that a century or so ago, this rule simply didn’t exist, and they say things like “This is one of the most common mistakes out there, and understandably so.” (Actually, no, you don’t understand why everyone gets this “wrong”, because you don’t realize that this rule is a relatively recent invention by usage commentators that some copy editors and others have decided to enforce.) It’s easy to criticize people for not following rules that you’ve made up.

6. Failing to discuss exceptions to rules. Invented usage rules often ignore the complexities of actual usage. Lists of rules such as these go a step further and often ignore the complexities of those rules. For example, even if you follow the that/which rule, you need to know that you can’t use that after a preposition or after the demonstrative pronoun that—you have to use a restrictive which. Likewise, the less/fewer rule is usually reduced to statements like “use fewer for things you can count”, which leads to ugly and unidiomatic constructions like “one fewer thing to worry about”. Affect and effect aren’t as simple as some people make them out to be, either; affect is usually a verb and effect a noun, but affect can also be a noun (with stress on the first syllable) referring to the outward manifestation of emotions, while effect can be a verb meaning to cause or to make happen. Sometimes dumbing down rules just makes them dumb.

7. Overestimating the frequency of errors. The writer of this list says that misuse of nauseous is “Undoubtedly the most common mistake I encounter.” This claim seems worth doubting to me; I can’t remember the last time I heard someone say “nauseous”. Even if you consider it a misuse, it’s got to rate pretty far down the list in terms of frequency. This is why linguists like to rely on data for testable claims—because people tend to fall prey to all kinds of cognitive biases such as the frequency illusion.

8. Believing that etymology is destiny. Words change meaning all the time—it’s just a natural and inevitable part of language. But some people get fixated on the original meanings of some words and believe that those are the only correct meanings. For example, they’ll say that you can only use decimate to mean “to destroy one in ten”. This may seem like a reasonable argument, but it quickly becomes untenable when you realize that almost every single word in the language has changed meaning at some point, and that’s just in the few thousand years in which language has been written or can be reconstructed. And sometimes a new meaning is more useful anyway (which is precisely why it displaced an old meaning). As Jan Freeman said, “We don’t especially need a term that means ‘kill one in 10.’”

9. Simply bungling the rules. If you’re going to chastise people for not following the rules, you should know those rules yourself and be able to explain them clearly. You may dislike singular they, for instance, but you should know that it’s not a case of subject-predicate disagreement, as the author of this list claims—it’s an issue of pronoun-antecedent agreement, which is not the same thing. This list says that “‘less’ is reserved for hypothetical quantities”, but this isn’t true either; it’s reserved for noncount nouns, singular count nouns, and plural count nouns that aren’t generally thought of as discrete entities. Use of less has nothing to do with being hypothetical. And this one says that punctuation always goes inside quotation marks. In most American styles, it’s only commas and periods that always go inside. Colons, semicolons, and dashes always go outside, and question marks and exclamation marks only go inside sometimes.

10. Saying that good grammar leads to good communication. Contrary to popular belief, bad grammar (even using the broad definition that includes usage, spelling, and punctuation) is not usually an impediment to communication. A sentence like Ain’t nobody got time for that is quite intelligible, even though it violates several rules of Standard English. The grammar and usage of nonstandard varieties of English are often radically different from Standard English, but different does not mean worse or less able to communicate. The biggest differences between Standard English and all its nonstandard varieties are that the former has been codified and that it is used in all registers, from casual conversation to formal writing. Many of the rules that these lists propagate are really more about signaling to the grammatical elite that you’re one of them—not that this is a bad thing, of course, but let’s not mistake it for something it’s not. In fact, claims about improving communication are often just a cover for the real purpose of these lists, which is . . .

11. Using grammar to put people down. This post sympathizes with someone who worries about being crucified by the grammar police and then says a few paragraphs later, “All hail the grammar police!” In other words, we like being able to crucify those who make mistakes. Then there are the put-downs about people’s education (“You’d think everyone learned this rule in fourth grade”) and more outright insults (“5 Grammar Mistakes that Make You Sound Like a Chimp”). After all, what’s the point in signaling that you’re one of the grammatical elite if you can’t take a few potshots at the ignorant masses?

12. Forgetting that correct usage ultimately comes from users. The disdain for the usage of common people is symptomatic of a larger problem: forgetting that correct usage ultimately comes from the people, not from editors, English teachers, or usage commentators. You’re certainly entitled to have your opinion about usage, but at some point you have to recognize that trying to fight the masses on a particular point of usage (especially if it’s a made-up rule) is like trying to fight the rising tide. Those who have invested in learning the rules naturally feel defensive of them and of the language in general, but you have no more right to the language than anyone else. You can be restrictive if you want and say that Standard English is based on the formal usage of educated writers, but any standard that is based on a set of rules that are simply invented and passed down is ultimately untenable.

And a bonus mistake:

13. Making mistakes themselves. It happens to the best of us. The act of making grammar or spelling mistakes in the course of pointing out someone else’s mistakes even has a name, Muphry’s law. This post probably has its fair share of typos. (If you spot one, feel free to point it out—politely!—in the comments.)

This post also appears on Huffington Post.


My Thesis

I’ve been putting this post off for a while for a couple of reasons: first, I was a little burned out and was enjoying not thinking about my thesis for a while, and second, I wasn’t sure how to tackle this post. My thesis is about eighty pages long all told, and I wasn’t sure how to reduce it to a manageable length. But enough procrastinating.

The basic idea of my thesis was to see which usage changes editors are enforcing in print and thus infer what kind of role they’re playing in standardizing (specifically codifying) usage in Standard Written English. Standard English is apparently pretty difficult to define precisely, but most discussions of it say that it’s the language of educated speakers and writers, that it’s more formal, and that it achieves greater uniformity by limiting or regulating the variation found in regional dialects. Very few writers, however, consider the role that copy editors play in defining and enforcing Standard English, and what I could find was mostly speculative or anecdotal. That’s the gap my research aimed to fill, and my hunch was that editors were not merely policing errors but were actively introducing changes to Standard English that set it apart from other forms of the language.

Some of you may remember that I solicited help with my research a couple of years ago. I had collected about two dozen manuscripts edited by student interns and then reviewed by professionals, and I wanted to increase and improve my sample size. Between the intern and volunteer edits, I had about 220,000 words of copy-edited text. Tabulating the grammar and usage changes took a very long time, and the results weren’t as impressive as I’d hoped they’d be. There were still some clear patterns, though, and I believe they confirmed my basic idea.
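If you’re curious what that tabulation involves in principle, here’s a minimal sketch in Python. It isn’t the procedure I actually used, and the sample sentences are invented, but it shows the general idea: line up an unedited passage with its copy-edited counterpart and count the word-level substitutions.

    import difflib
    from collections import Counter

    def word_changes(original_text, edited_text):
        """Tally word-level substitutions between two versions of a text."""
        orig_words = original_text.split()
        edit_words = edited_text.split()
        changes = Counter()
        matcher = difflib.SequenceMatcher(a=orig_words, b=edit_words)
        for tag, i1, i2, j1, j2 in matcher.get_opcodes():
            if tag == "replace":  # spans where the editor swapped words out
                changes[(" ".join(orig_words[i1:i2]),
                         " ".join(edit_words[j1:j2]))] += 1
        return changes

    # Invented example, not text from my sample
    before = "She leaned towards the interpretation which seemed strongest."
    after = "She leaned toward the interpretation that seemed strongest."
    for (old, new), count in word_changes(before, after).items():
        print(f"{old} -> {new}: {count}")
    # towards -> toward: 1
    # which -> that: 1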

The most popular usage changes were standardizing the genitive form of names ending in -s (Jones’>Jones’s), which>that, towards>toward, moving only, and increasing parallelism. These changes were not only numerically the most popular, but they were edited at fairly high rates—up to 80 percent. That is, if towards appeared ten times, it was changed to toward eight times. The interesting thing about most of these is that they’re relatively recent inventions of usage writers. I’ve already written about which hunting on this blog, and I recently wrote about towards for Visual Thesaurus.
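The rates themselves are simple arithmetic: changes made divided by opportunities to make them. A trivial sketch with made-up tallies (not my actual data) looks like this:

    # Hypothetical tallies: (times the targeted form appeared, times it was changed)
    tallies = {
        "towards -> toward": (10, 8),
        "restrictive which -> that": (25, 20),
        "Jones' -> Jones's": (5, 4),
    }

    for feature, (opportunities, changed) in tallies.items():
        print(f"{feature}: {changed}/{opportunities} = {changed / opportunities:.0%}")
    # towards -> toward: 8/10 = 80%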

In both cases, the rule was invented not to halt language change, but to reduce variation. For example, in unedited writing, English speakers use towards and toward with roughly equal frequency; in edited writing, toward outnumbers towards 10 to 1. With editors enforcing the rule in writing, the rule quickly becomes circular—you should use toward because it’s the norm in Standard (American) English. Garner used a similarly circular defense of the that/which rule in this New York Times Room for Debate piece with Robert Lane Greene:

But my basic point stands: In American English from circa 1930 on, “that” has been overwhelmingly restrictive and “which” overwhelmingly nonrestrictive. Strunk, White and other guidebook writers have good reasons for their recommendation to keep them distinct — and the actual practice of edited American English bears this out.

He’s certainly correct in saying that since 1930 or so, editors have been changing restrictive which to that. But this isn’t evidence that there’s a good reason for the recommendation; it’s only evidence that editors believe there’s a good reason.

What is interesting is that usage writers frequently invoke Standard English in defense of the rules, saying that you should change towards to toward or which to that because the proscribed forms aren’t acceptable in Standard English. But if Standard English is the formal, nonregional language of educated speakers and writers, then how can we say that towards or restrictive which are nonstandard? What I realized is this: part of the problem with defining Standard English is that we’re talking about two similar but distinct things—the usage of educated speakers, and the edited usage of those speakers. But because of the very nature of copy editing, we conflate the two. Editing is supposed to be invisible, so we don’t know whether what we’re seeing is the author’s or the editor’s.

Arguments about proper usage become confused because the two sides are talking past each other using the same term. Usage writers, editors, and others see linguists as the enemies of Standard (Edited) English because they see them tearing down the rules that define it and set it apart from educated but unedited usage, such as the rules for that/which and toward/towards. Linguists, on the other hand, see these invented rules as being unnecessarily imposed on people who already use Standard English, and they question the motives of those who create and enforce the rules. In essence, Standard English arises from the usage of educated speakers and writers, while Standard Edited English adds many more regulative rules from the prescriptive tradition.

My findings have some serious implications for the use of corpora to study usage. Corpus linguistics has done much to clarify questions of what’s standard, but the results can still be misleading. With corpora, we can separate many usage myths and superstitions from actual edited usage, but we can’t separate edited usage from simple educated usage. We look at corpora of edited writing and think that we’re researching Standard English, but we’re unwittingly researching Standard Edited English.
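To make that distinction concrete, here’s a small sketch that counts toward and towards in the Brown corpus, a corpus of published (and therefore copy-edited) American English from 1961. It assumes you have NLTK and its corpus data available, and the point isn’t the particular numbers you get but the fact that whatever ratio comes out reflects Standard Edited English rather than raw educated usage.

    # Requires the third-party nltk package; the corpus download is a one-time step.
    import nltk
    nltk.download("brown", quiet=True)
    from nltk.corpus import brown

    counts = {"toward": 0, "towards": 0}
    for word in brown.words():
        w = word.lower()
        if w in counts:
            counts[w] += 1

    print(counts)  # whatever ratio appears here is a fact about edited prose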

None of this is to say that all editing is pointless, or that all usage rules are unnecessary inventions, or that there’s no such thing as error because educated speakers don’t make mistakes. But I think it’s important to differentiate between true mistakes and forms that have simply been proscribed by grammarians and editors. I don’t believe that towards and restrictive which can rightly be called errors, and I think it’s even a stretch to call them stylistically bad. I’m open to the possibility that it’s okay or even desirable to engineer some language changes, but I’m unconvinced that either of the rules proscribing these is necessary, especially when the arguments for them are so circular. At the very least, rules like this serve to signal to readers that they are reading Standard Edited English. They are a mark of attention to detail, even if the details in question are irrelevant. The fact that someone paid attention to them is perhaps what is most important.

And now, if you haven’t had enough, you can go ahead and read the whole thesis here.


Reflections on National Grammar Day

I know I’m a week late to the party, but I’ve been thinking a lot about National Grammar Day and want to blog about it anyway. Please forgive me for my untimeliness.

First off, I should say for those who don’t know me that I work as a copy editor. I clearly understand the value of using Standard American English when it is called for, and I know its rules and conventions quite well. I’m also a student of linguistics, and I find language fascinating. I understand the desire to celebrate language and to promote its good use, but unfortunately it appears that National Grammar Day does neither.

If you go to National Grammar Day’s web site and click on “About SPOGG” at the top of the page, you find this:

The Society for the Promotion of Good Grammar is for pen-toters appalled by wanton displays of Bad English. . . . SPOGG is for people who crave good, clean English — sentences cast well and punctuated correctly. It’s about clarity.

I can get behind those last two sentences (noting, of course, that this description seems to exclude spoken English), but the first obviously flies in the face of the society’s name—is it trying to promote “good” (read “standard”) grammar, or simply ridicule what it deems to be displays of bad English? Well, if you read the SPOGG Blog, it appears to be the latter. None of the posts on the front page seem to deal with clarity; in each case it seems quite clear what the author intended, so obviously SPOGG is not about clarity after all.

In fact, what I gather from this post in particular is that SPOGG is more about the social value of using Standard English than it is about anything else. The message here is quite clear: using nonstandard English is like having spinach in your teeth. It’s like wearing a speedo on the bus. SPOGG isn’t about good, clean English or about clarity. It’s only about mocking those who violate a set of taboos. By following the rules, you signal to others that you belong to a certain group, one whose members care about linguistic manners in the same way that some people care about not putting their elbows on the table while they eat.

And that’s perfectly fine with me. If you delight in fussy little rules about spelling and punctuation, that’s your choice. But I think it’s important to distinguish between the rules that are truly important and the guidelines and conventions that are more flexible and optional. John McIntyre made this point quite well in his post today on his blog, You Don’t Say.

Unfortunately, I find that SPOGG’s founder, Martha Brockenbrough, quite frequently fails to make this distinction. She also shows an appalling lack of knowledge on issues like how language changes, what linguists do, and, to top it all off, what grammar actually is. Of course, she falls back on the “Geez, can’t you take a joke?” defense, which doesn’t really seem to fly, as Arnold Zwicky and others have already noted.

As I said at the start, I can appreciate the desire to celebrate grammar. I just wish National Grammar Day actually did that.
