Arrant Pedantry

Why Teach Grammar?

Today is National Grammar Day, and I’ve been thinking a lot lately about what grammar is and why we study it. Last week in the Atlantic, Michelle Navarre Cleary wrote that we should do away with diagramming sentences and other explicit grammar instruction. Her argument, in a nutshell, is that grammar instruction not only doesn’t help students write better, but it actually teaches them to hate writing.

It’s really no surprise—as an editor and a student of language, I’ve run into a lot of people who never learned the difference between a preposition and a participle and are insecure about their writing or their speech. I once had a friend who was apparently afraid to talk to me because she thought I was silently correcting everything she said. When I found out about it, I reassured her that I wasn’t; not only had I never noticed anything wrong with the way she talked, but I don’t worry about correcting people unless they’re paying me for it. But I worried that this was how people saw me: a know-it-all jerk who silently judged everyone else for their errors. I love language, and it saddened me to think that there are people who find it not fascinating but frustrating.

But given the state of grammar instruction in the United States today, it’s not hard to see why a lot of people feel this way. I learned hardly any sentence diagramming until I got to college, and my public school education in grammar effectively stopped in eighth or ninth grade when I learned what a prepositional phrase was. In high school, our grammar work consisted of taking sentences like “He went to the store” and changing them to “Bob went to the store” (because you can’t use he without an antecedent; never mind that such a sentence would not occur in isolation and would surely make sense in context).

Meanwhile, many students are marked down on their papers for supposed grammar mistakes (which are usually matters of spelling, punctuation, or style): don’t use contractions, don’t start a sentence with conjunctions, don’t use any form of the verb be, don’t write in the first person, don’t refer to yourself in the third person, don’t use the passive voice, and on and on. Of course most students are going to come out of writing class feeling insecure. They’re punished for failing to master rules that don’t make sense.

And it doesn’t help that there’s often a disconnect between what the rules say good writing is and what it actually is. Good writing breaks these rules all the time, and following all the rules does little if anything to make bad writing good. We know the usual justifications: students have to master the basics before they can become experts, and once they become experts, they’ll know when it’s okay to break the rules.

But these justifications presuppose that teaching students not to start a sentence with a conjunction or not to use the passive voice has something to do with good writing, when it simply doesn’t. I’ve said before that we don’t consider whether we’re giving students training wheels or just putting sticks in their spokes. Interestingly, Cleary uses a similar argument in her Atlantic piece: “Just as we teach children how to ride bikes by putting them on a bicycle, we need to teach students how to write grammatically by letting them write.”

I’m still not convinced, though, that learning grammar has much at all to do with learning to write. Having a PhD in linguistics doesn’t mean you know how to write well, and being an expert writer doesn’t mean you know anything about syntax and morphology beyond your own native intuition. And focusing on grammar instruction may distract from the more fundamental writing issues of rhetoric and composition. So why worry about grammar at all if it has nothing to do with good writing? Language Log’s Mark Liberman said it well:

We don’t put chemistry into the school curriculum because it will make students better cooks, or even because it might make them better doctors, much less because we need a relatively small number of professional chemists. We believe (I hope) that a basic understanding of atoms and molecules is knowledge that every citizen of the modern world should have.

It may seem like a weak defense in a world that increasingly focuses on marketable skills, but it’s maybe the best justification we have. Language is amazing; no other animal has the capacity for expression that we do. Language is so much more than a grab-bag of peeves and strictures to inflict on freshman writing students; it’s a fundamental part of who we are as a species. Shouldn’t we expect an educated person to know something about it?

So yes, I think we should teach grammar, not because it will help people write better, but simply because it’s interesting and worth knowing about. But we need to recognize that it doesn’t belong in the same class as writing or literature; though it certainly has connections to both, linguistics is a separate field and should be treated as such. And we need to teach grammar not as something to hate or even as something to learn as a means to an end, but as a fascinating and complex system to be discovered and explored for its own sake. In short, we need to teach grammar as something to love.

12 Mistakes Nearly Everyone Who Writes About Grammar Mistakes Makes

There are a lot of bad grammar posts in the world. These days, anyone with a blog and a bunch of pet peeves can crank out a click-bait listicle of supposed grammar errors. There’s just one problem—these articles are often full of mistakes of one sort or another themselves. Once you’ve read a few, you start noticing some patterns. Inspired by a recent post titled “Grammar Police: Twelve Mistakes Nearly Everyone Makes”, I decided to make a list of my own.

1. Confusing grammar with spelling, punctuation, and usage. Many people who write about grammar seem to think that grammar means “any sort of rule of language, especially writing”. But strictly speaking, grammar refers to the structural rules of language, namely morphology (basically the way words are formed from roots and affixes), phonology (the system of sounds in a language), and syntax (the way phrases and clauses are formed from words). Most complaints about grammar are really about punctuation, spelling (such as problems with you’re/your and other homophone confusion) or usage (which is often about semantics). This post, for instance, spends two of its twelve points on commas and a third on quotation marks.

2. Treating style choices as rules. This article says that you should always use an Oxford (or serial) comma (the comma before and or or in a list) and that quotation marks should always follow commas and periods, but the latter is true only in most American styles (linguists often put the commas and periods outside quotes, and so do many non-American styles), and the former is only true of some American styles. I may prefer serial commas, but I’m not going to insist that everyone who doesn’t use them is making a mistake. It’s simply a matter of style, and style varies from one publisher to the next.

3. Ignoring register. There’s a time and a place for following the rules, but the writers of these lists typically treat English as though it had only one register: formal writing. They ignore the fact that following the rules in the wrong setting often sounds stuffy and stilted. Formal written English is not the only legitimate form of the language, and the rules of formal written English don’t apply in all situations. Sure, it’s useful to know when to use who and whom, but it’s probably more useful to know that saying To whom did you give the book? in casual conversation will make you sound like a pompous twit.

4. Saying that a disliked word isn’t a word. You may hate irregardless (I do), but that doesn’t mean it’s not a word. If it has its own meaning and you can use it in a sentence, guess what—it’s a word. Flirgle, on the other hand, is not a word—it’s just a bunch of sounds that I strung together in word-like fashion. Irregardless and its ilk may not be appropriate for use in formal registers, and you certainly don’t have to like them, but as Stan Carey says, “‘Not a word’ is not an argument.”

5. Turning proposals into ironclad laws. This one happens more often than you think. A great many rules of grammar and usage started life as proposals that became codified as inviolable laws over the years. The popular that/which rule, which I’ve discussed at length before, began as a proposal—not “everyone gets this wrong” but “wouldn’t it be nice if we made a distinction here?” But nowadays people have forgotten that a century or so ago, this rule simply didn’t exist, and they say things like “This is one of the most common mistakes out there, and understandably so.” (Actually, no, you don’t understand why everyone gets this “wrong”, because you don’t realize that this rule is a relatively recent invention by usage commentators that some copy editors and others have decided to enforce.) It’s easy to criticize people for not following rules that you’ve made up.

6. Failing to discuss exceptions to rules. Invented usage rules often ignore the complexities of actual usage. Lists of rules such as these go a step further and often ignore the complexities of those rules. For example, even if you follow the that/which rule, you need to know that you can’t use that after a preposition or after the demonstrative pronoun that—you have to use a restrictive which. Likewise, the less/fewer rule is usually reduced to statements like “use fewer for things you can count”, which leads to ugly and unidiomatic constructions like “one fewer thing to worry about”. Affect and effect aren’t as simple as some people make them out to be, either; affect is usually a verb and effect a noun, but affect can also be a noun (with stress on the first syllable) referring to the outward manifestation of emotions, while effect can be a verb meaning to cause or to make happen. Sometimes dumbing down rules just makes them dumb.

7. Overestimating the frequency of errors. The writer of this list says that misuse of nauseous is “Undoubtedly the most common mistake I encounter.” This claim seems worth doubting to me; I can’t remember the last time I heard someone say “nauseous”. Even if you consider it a misuse, it’s got to rate pretty far down the list in terms of frequency. This is why linguists like to rely on data for testable claims—because people tend to fall prey to all kinds of cognitive biases such as the frequency illusion.

8. Believing that etymology is destiny. Words change meaning all the time—it’s just a natural and inevitable part of language. But some people get fixated on the original meanings of some words and believe that those are the only correct meanings. For example, they’ll say that you can only use decimate to mean “to destroy one in ten”. This may seem like a reasonable argument, but it quickly becomes untenable when you realize that almost every single word in the language has changed meaning at some point, and that’s just in the few thousand years in which language has been written or can be reconstructed. And sometimes a new meaning is more useful anyway (which is precisely why it displaced an old meaning). As Jan Freeman said, “We don’t especially need a term that means ‘kill one in 10.'”

9. Simply bungling the rules. If you’re going to chastise people for not following the rules, you should know those rules yourself and be able to explain them clearly. You may dislike singular they, for instance, but you should know that it’s not a case of subject-predicate disagreement, as the author of this list claims—it’s an issue of pronoun-antecedent agreement, which is not the same thing. This list says that “‘less’ is reserved for hypothetical quantities”, but this isn’t true either; it’s reserved for noncount nouns, singular count nouns, and plural count nouns that aren’t generally thought of as discrete entities. Use of less has nothing to do with being hypothetical. And this one says that punctuation always goes inside quotation marks. In most American styles, it’s only commas and periods that always go inside. Colons, semicolons, and dashes always go outside, and question marks and exclamation marks only go inside sometimes.

10. Saying that good grammar leads to good communication. Contrary to popular belief, bad grammar (even using the broad definition that includes usage, spelling, and punctuation) is not usually an impediment to communication. A sentence like Ain’t nobody got time for that is quite intelligible, even though it violates several rules of Standard English. The grammar and usage of nonstandard varieties of English are often radically different from Standard English, but different does not mean worse or less able to communicate. The biggest differences between Standard English and all its nonstandard varieties are that the former has been codified and that it is used in all registers, from casual conversation to formal writing. Many of the rules that these lists propagate are really more about signaling to the grammatical elite that you’re one of them—not that this is a bad thing, of course, but let’s not mistake it for something it’s not. In fact, claims about improving communication are often just a cover for the real purpose of these lists, which is . . .

11. Using grammar to put people down. This post sympathizes with someone who worries about being crucified by the grammar police and then says a few paragraphs later, “All hail the grammar police!” In other words, we like being able to crucify those who make mistakes. Then there are the put-downs about people’s education (“You’d think everyone learned this rule in fourth grade”) and more outright insults (“5 Grammar Mistakes that Make You Sound Like a Chimp”). After all, what’s the point in signaling that you’re one of the grammatical elite if you can’t take a few potshots at the ignorant masses?

12. Forgetting that correct usage ultimately comes from users. The disdain for the usage of common people is symptomatic of a larger problem: forgetting that correct usage ultimately comes from the people, not from editors, English teachers, or usage commentators. You’re certainly entitled to have your opinion about usage, but at some point you have to recognize that trying to fight the masses on a particular point of usage (especially if it’s a made-up rule) is like trying to fight the rising tide. Those who have invested in learning the rules naturally feel defensive of them and of the language in general, but you have no more right to the language than anyone else. You can be restrictive if you want and say that Standard English is based on the formal usage of educated writers, but any standard that is based on a set of rules that are simply invented and passed down is ultimately untenable.

And a bonus mistake:

13. Making mistakes themselves. It happens to the best of us. The act of making grammar or spelling mistakes in the course of pointing out someone else’s mistakes even has a name, Muphry’s law. This post probably has its fair share of typos. (If you spot one, feel free to point it out—politely!—in the comments.)

This post also appears on Huffington Post.

Guest Post at Logophilius

Today I have a guest post about rules and style choices at Andy Hollandbeck's blog Logophilius. Go take a look, and while you’re there, check out the rest of his site.

Rules, Evidence, and Grammar

In case you haven’t heard, it’s National Grammar Day, and that seemed as good a time as any to reflect a little on the role of evidence in discussing grammar rules. (Goofy at Bradshaw of the Future apparently had the same idea.) A couple of months ago, Geoffrey Pullum made the argument in this post on Lingua Franca that it’s impossible to talk about what’s right or wrong in language without considering the evidence. Is singular they grammatical and standard? How do you know?

For most people, I think, the answer is pretty simple: you look it up in a source that you trust. If the source says it’s grammatical or correct, it is. If it doesn’t, it isn’t. Singular they is wrong because many authoritative sources say it is. End of story. And if you try to argue that the sources aren’t valid or reliable, you’re labeled an anything-goes type who believes we should just toss all the rules out the window and embrace linguistic anarchy.

The question is, where did these sources get their authority to say what’s right and wrong?

That is, when someone says that you should never use they as a singular pronoun or start a sentence with hopefully or use less with count nouns, why do you suppose that the rules they put forth are valid? The rules obviously haven’t been inscribed on stone tablets by the finger of the Lord, but they have to come from somewhere. Every language is different, and languages are constantly changing, so I think we have to recognize that there is no universal, objective truth when it comes to grammar and usage.

Unfortunately, David Foster Wallace apparently fell into the trap of thinking that there is. In his famous Harper’s article “Tense Present: Democracy, English, and the Wars over Usage,” he quotes the introduction to The American College Dictionary, which says, “A dictionary can be an ‘authority’ only in the sense in which a book of chemistry or of physics or of botany can be an ‘authority’: by the accuracy and the completeness of its record of the observed facts of the field examined, in accord with the latest principles and techniques of the particular science.”

He retorts,

This is so stupid it practically drools. An “authoritative” physics text presents the results of physicists’ observations and physicists’ theories about those observations. If a physics textbook operated on Descriptivist principles, the fact that some Americans believe that electricity flows better downhill (based on the observed fact that power lines tend to run high above the homes they serve) would require the Electricity Flows Better Downhill Theory to be included as a “valid” theory in the textbook—just as, for Dr. Fries, if some Americans use infer for imply, the use becomes an ipso facto “valid” part of the language.

The irony of his first sentence is almost overwhelming. Physics is a set of universal laws that can be observed and tested, and electricity works regardless of what anyone believes. Language, on the other hand, is quite different. In fact, Wallace tacitly acknowledges the difference—without explaining his apparent contradiction—immediately after: “It isn’t scientific phenomena they’re tabulating but rather a set of human behaviors, and a lot of human behaviors are—to be blunt—moronic. Try, for instance, to imagine an ‘authoritative’ ethics textbook whose principles were based on what most people actually do.”[1]

Now here he hits on an interesting question. Any argument about right or wrong in language ultimately comes down to one of two options: it’s wrong because it’s absolutely, objectively wrong, or it’s wrong because arbitrary societal convention says it’s wrong. The former is untenable, but the latter doesn’t give us any straightforward answers. If there is no objective truth in usage, then how do we know what’s right and wrong?

Wallace tries to make the argument about ethics; sloppy language leads to real problems like people accidentally eating poison mushrooms. But look at his gargantuan list of peeves and shibboleths on the first page of the article. How many of them lead to real ethical problems? Does singular they pose any kind of ethical problem? What about sentential hopefully or less with count nouns? I don’t think so.

So if there’s no ethical problem with disputed usage, then we’re still left with the question, what makes it wrong? Here we get back to Pullum’s attempt to answer the question: let’s look at the evidence. And, because we can admit, like Wallace, that some people’s behavior is moronic, let’s limit ourselves to looking at the evidence from those speakers and writers whose language can be said to be most standard. What we find even then is that a lot of the usage and grammar rules that have been put forth, from Bishop Robert Lowth to Strunk and White to Bryan Garner, don’t jibe with actual usage.

Edward Finegan seizes on this discrepancy in an article from a few years back. In discussing sentential hopefully, he quotes Garner as saying that it is “all but ubiquitous—even in legal print. Even so, the word received so much negative attention in the 1970s and 1980s that many writers have blacklisted it, so using it at all today is a precarious venture. Indeed, careful writers and speakers avoid the word even in its traditional sense, for they’re likely to be misunderstood if they use it in the old sense.”[2] Finegan says, “I could not help but wonder how a reflective and careful analyst could concede that hopefully is all but ubiquitous in legal print and claim in the same breath that careful writers and speakers avoid using it.”[3]

The problem when you start questioning the received wisdom on grammar and usage is that you make a lot of people very angry. In a recent conversation on Twitter, Mignon Fogarty, aka Grammar Girl, said, “You would not believe (or maybe you would) how much grief I’m getting for saying ‘data’ can sometimes be singular.” I responded, “Sadly, I can. For some people, grammar is more about cherished beliefs than facts, and they don’t like having them challenged.” They don’t want to hear arguments about authority and evidence and deriving rules from what educated speakers actually use. They want to believe that there are some deeper truths that justify their preferences and peeves, and that’s probably not going to change anytime soon. But for now, I’ll keep trying.

  1. [1] David Foster Wallace, “Tense Present: Democracy, English, and the Wars over Usage,” Harper’s Monthly, April 2001, 47.
  2. [2] Bryan A. Garner, A Dictionary of Modern Legal Usage, 2nd ed. (New York: Oxford University Press, 1995).
  3. [3] Edward Finegan, “Linguistic Prescription: Familiar Practices and New Perspectives,” Annual Review of Applied Linguistics 23 (2003): 216.

Who, That, and the Nature of Bad Rules

A couple of weeks ago the venerable John E. McIntyre blogged about a familiar prescriptive bugbear, the question of that versus who(m). It all started on the blog of the Society for the Promotion of Good Grammar, where a college English professor, one Professor Jacoby, wrote in to share his justification for the rule, which is that you should avoid using that with human referents because it depersonalizes them. He calls this justification “quite profound,” which is probably a good sign that it’s not. Mr. McIntyre, ever the reasonable fellow, tried to inject some facts into the conversation, but apparently to no avail.

What I find most interesting about the whole discussion, however, is not the argument over whether that can be used with human referents, but what the whole argument says about prescriptivism and the way we talk about language and rules. (Indeed, the subject has already been covered very well by Gabe Doyle at Motivated Grammar, who made some interesting discoveries about relative pronoun usage that may indicate some cognitive motivation.) Typically, the person putting forth the rule assumes a priori that the rule is valid, and thereafter it seems that no amount of evidence or argument can change their mind. The entire discussion at the SPOGG blog proceeds without any real attempts to address Mr. McIntyre’s points, and it ends with the SPOGG correspondent who originally kicked off the discussion sullenly taking his football and going home.

James Milroy, an emeritus professor of sociolinguistics at the University of Michigan, once wrote that all rationalizations for prescriptions are post hoc; that is, the rules are taken to be true, and the justifications come afterward and really only serve to give the rule the illusion of validity:

Indeed all prescriptive arguments about correctness that depend on intra-linguistic factors are post-hoc rationalizations. . . . But an intra-linguistic rationalization is not the reason why some usages are believed to be wrong. The reason is that it is simply common sense: everybody knows it, it is part of the culture to know it, and you are an outsider if you think otherwise: you are not a participant in the common culture, and so your views can be dismissed. To this extent, linguists who state that I seen it is not ungrammatical are placing themselves outside the common culture.[1]

This may sound like a rather harsh description of prescriptivism, but I think there’s a lot of truth to it—especially the part about linguists unwittingly setting themselves outside of the culture. Linguists try to play the part of the boy who pointed out that the emperor has no clothes, but instead of breaking the illusion they are at best treated as suspect for not playing along. But the point linguists are trying to make isn’t that there’s no such thing as right or wrong in language (though there are some on the fringe who would make such claims)—they’re simply trying to point out that, quite frequently, the justifications are phony and attention to facts and evidence is mostly nonexistent. There are no real axioms or first principles from which prescriptive rules follow—at least, there don’t seem to be any that are consistently applied and followed to their logical conclusions. Instead the canon of prescriptions is a hodgepodge of style and usage opinions that have been passed down and are generally assumed to have the force of law. There are all kinds of unexamined assumptions packaged into prescriptions and their justifications, such as the following from Professor Jacoby:

  • Our society has a tendency to depersonalize people.
  • Depersonalizing people is bad.
  • Using that as a relative pronoun with human referents depersonalizes them.

There are probably more, but that covers the bases. Note that even if we agree that our society depersonalizes people and that this is a bad thing, it’s still quite a leap from this to the claim that that depersonalizes people. But, as Milroy argued, it’s not really about the justification. It’s about having a justification. You can go on until you’re blue in the face about the history of English relative pronoun usage (for instance, that demonstrative pronouns like that were the only option in Old English, and that this has changed several times over the last millennium and a half, and that it’s only recently that people have begun to claim that that with people is wrong) or about usage in other, related languages (such as German, which uses demonstrative pronouns as relative pronouns), but it won’t make any difference; at best, the person arguing for the rule will superficially soften their stance and make some bad analogies to fashion or ethics, saying that while it might not be a rule, it’s still a good guideline, especially for novices. After all, novices need rules that are more black and white—they need to use training wheels for a while before they can ride unaided. Too bad we also never stop to ask whether we’re actually providing novices with training wheels or just putting sticks in their spokes.

Meanwhile, prescriptivists frequently dismiss all evidence for one reason or another: It’s well established in the history of usage? Well, that just shows that people have always made mistakes. It’s even used by greats like Chaucer, Shakespeare, and other literary giants? Hey, even the greats make mistakes. Either that or they mastered the rules and thus know when it’s okay to break them. People today overwhelmingly break the rule? Well, that just shows how dire the situation is. You literally can’t win, because, as Geoffrey Pullum puts it, “nothing is relevant.”

So if most prescriptions are based on unexamined assumptions and post hoc rationalizations, where does that leave things? Do we throw it all out because it’s a charade? That seems rather extreme. There will always be rules, because that’s simply the nature of people. The question is, how do we establish which rules are valid, and how do we teach this to students and practice it as writers and editors? Honestly, I don’t know, but I know that it involves real research and a willingness to critically evaluate not only the rules but also the assumptions that underlie them. We have to stop having a knee-jerk reaction against linguistic methods and allow them to inform our understanding. And linguists need to learn that rules are not inherently bad. Indeed, as John Algeo put it, “The problem is not that some of us have prescribed (we have all done so and continue to do so in one way or another); the trouble is that some of us have prescribed such nonsense.”[2]

  1. [1] James Milroy, “Language Ideologies and the Consequences of Standardization,” Journal of Sociolinguistics 5, no. 4 (November 2001): 536.
  2. [2] John Algeo, “Linguistic Marys, Linguistic Marthas: The Scope of Language Study,” College English 31, no. 3 (December 1969): 276.

Less and Fewer

I know this topic has been addressed in detail elsewhere (see goofy’s post here for example), but a friend recently asked me about it, so I thought I’d take a crack at it. It’s fairly straightforward: there are the complex, implicit rules that people have been following for over a thousand years, and then there are the simple, explicit, artificial rules that some people have been trying to inflict on everyone else for the last couple of centuries.

The explicit rule is this: use fewer for count nouns (things that can be numbered), and use less for mass nouns (things that are typically measured). So you’d say fewer eggs but less milk, fewer books but less information. Units of time, money, distance, and so on are usually treated as mass nouns (so you’d say less than ten years old, not fewer than ten years old). One handy (but overly simplistic) way to tell mass nouns and count nouns apart (save for the exception I just noted) is this: if you can make it plural and use a numeral in front of it (five eggs), then it’s a count noun and it takes fewer.

The only problem with this rule is that it was invented by Robert Baker in 1770, and it contradicts historical and present-day usage. In actual practice, fewer has always been restricted to count nouns, but less is often used with count nouns, too, especially in certain constructions like twenty-five words or less, no less than one hundred people, and one less problem to worry about. It used to be that people used less when it sounded natural and nobody worried about it, but then some guy in the eighteenth century got the bright idea that we should always use one word for count nouns and one word for mass nouns, and people have been freaking out about it ever since.

Baker’s rule is appealing because it’s simple and (in my opinion) because it allows people to judge others who don’t know grammar. It makes a certain kind of sense to use one word for one thing and another word for another thing, but the fact is that language is seldom so neat and tidy. Real language is full of complexities and exceptions to rules, and the amazing thing is that we learn all of these rules naturally just by listening to and talking with other people. Breaking Baker’s rule is not a sign of lazy thinking or sloppy grammar or anything else negative—it’s just a sign that you’re a native speaker.

The fact that not everybody follows the simple, explicit rule, nearly 240 years after it was created, shows you just how hard it is to get people to change their linguistic habits. Is there any advantage to following the made-up rule? Probably not, aside from avoiding stigma from people who like to look down their noses at those who they deem to have poor grammar. So if you want to please the fussy grammarian types, be sure to follow Baker’s made-up rule. If you don’t care about those types, use whatever comes naturally to you.

Rules Are Rules

Recently I was involved in an online discussion about the pronunciation of the word the before vowels. Someone wanted to know if it was pronounced /ði/ (“thee”) before vowels only in singing, or if it was a general rule of speech as well. His dad had said it was a rule, but he had never heard it before and wondered if maybe it was more of a convention than a rule. Throughout the conversation, several more people expressed similar opinions—they’d never heard this rule before and they doubted whether it was really a rule at all.

There are a few problems here. First of all, not everybody means exactly the same thing when they talk about rules. It’s like when laymen dismiss evolution because it’s “just a theory.” They forget that gravity is also just a theory. And when laymen talk about linguistic rules, they usually mean prescriptive rules. Prescriptive rules usually state that a particular thing should be done, which typically implies that it often isn’t done.

But when linguists talk about rules, they mean descriptive ones. Think of it this way: if you were going to teach a computer how to speak English fluently, what would it need to know? Well, one tiny little detail that it would need to know is that the word the is pronounced with a schwa (/ðə/) except when it is stressed or followed by a vowel. Nobody needs to be taught this rule, except for non-native speakers, because we all learn it by hearing it when we’re children. And because nobody needs to be taught it, it’s never taught in English class, which throws some people for a bit of a loop when they hear it called a rule.
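
To push the computer analogy a bit further, here is a minimal sketch in Python of what spelling out that one tiny detail for a machine might look like. It is purely illustrative (the function name and word list are made up for this example), and it cheats by checking spelling rather than sound, so it gets words like hour and university wrong; a real model would need a pronouncing dictionary, since the rule is about the following sound, not the following letter. Still, it gives a sense of how much detail even one small descriptive rule packs in.

# A toy illustration of the descriptive rule described above: "the" is
# pronounced /ði/ ("thee") when stressed or before a vowel and with a
# schwa (/ðə/) otherwise. Hypothetical sketch only; checking the first
# letter is a simplification, since the actual rule depends on sounds.

def pronounce_the(next_word: str, stressed: bool = False) -> str:
    """Return a rough pronunciation of 'the' before next_word."""
    starts_with_vowel_letter = next_word[:1].lower() in {"a", "e", "i", "o", "u"}
    if stressed or starts_with_vowel_letter:
        return "/ði/"  # "thee"
    return "/ðə/"      # schwa

for word in ["store", "apple", "egg", "book"]:
    print("the " + word + ": " + pronounce_the(word))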

But even on the prescriptivist side of things, not all rules are created equal. There are a lot of rules that are generally covered in English classes, and they’re usually taught as simple black-and-white declarations: x is right and y is wrong. When people ask me questions about language, they usually seem to expect answers along these lines. But many issues of grammar and usage are complicated and have no clear right or wrong answer. Same with style—open up two different style guides, and you’ll often find two (or more) ways to punctuate, hyphenate, and capitalize. A lot of times these things boil down to issues of formality, context, and personal taste.

Unfortunately, most of us hear language rules expressed as inviolable laws all the way through public school and probably into college. It’s hard to overcome a dozen years or more of education on a subject and start to learn that maybe things aren’t as simple as you’ve been told, that maybe those trusted authorities and gatekeepers of the language, the English teachers, were not always well-informed. But as writing becomes more and more important in modern life, it likewise becomes more important to teach people meaningful, well-founded rules that aren’t two centuries old. It’s time for English class to get educated.

Arrant Pedantry

When you study and work with language for a living, a lot of people naturally assume that you’re some sort of scowling, finger-wagging pedant who is secretly keeping a list of all the grammatical and usage errors they make. It’s difficult to make people understand that you only correct errors when you’re on the clock, and even then you sometimes do it grudgingly because that’s what the style guide says, not necessarily because you believe you’re actually improving the text. It’s even harder to make people understand that what you’re really interested in is understanding how language works, not telling people that they’re using the language wrong or that they’re somehow lacking or inferior because they split an infinitive and dangled a participle.

The problem is that too many people have had bad experiences with just such language pedants, the Miss Thistlebottoms of the world. Now, I have to say that I do believe that there should be standards in the language and that they should be taught to students and followed by writers and editors (when appropriate).

The problem is that the standards in English are too often defined or enforced by people who apparently pull rules out of thin air. These grammatical fussbudgets aren’t interested in a standard based on the usage of educated speakers and writers; rather, they seem to prefer rules that set them apart from the unwashed masses, that give them a reason to judge and condemn. The Elements of Style is their bible, Strunk and White are their prophets, and they sneer down their noses at those not of their faith. The objective, empirical truth of English usage is of no interest to them; they have faith in their false gospel of grammar.

Why do these grammar nazis bother me so? For a lot of reasons, actually. First of all, because a lot of people assume that I’m one of them, and that is simply not true. I was never much of a grammar nazi even when I was new to the field of editing; I favored the spirit of the law over the letter of the law. I still enjoy editing, and I have some very good friends who are excellent editors, but too many people in that profession are either incompetent or tyrannical (or likely both).

Second, I have a strong respect for the truth. Most grammaristos will believe whatever falsehoods they happened to hear in their English classes. If an English teacher tells them that it’s always wrong to split an infinitive, to strand a preposition, or to use they with a singular antecedent, they will unquestioningly accept it as gospel truth, no matter how nonsensical it may be. Any rational person could do a little research and find all three of those “rules” broken by virtually all the finest English writers of the last several centuries. You’d think this would be enough to convince them that such rules are faulty, but the grammar pedants will usually respond with a retort like “Just because Shakespeare made mistakes doesn’t make it alright.” You simply can’t argue with people like that.

And as if those rules weren’t ridiculous enough, there are teachers in the world who tell their students that it’s outright wrong to use the final serial comma or to use the subordinator that when it could be omitted. These sorts of rules only serve to teach students that English is a difficult, frustrating subject that doesn’t make sense. These students then spend the rest of their lives fearing anyone in a position of grammatical authority and believing that many of their natural tendencies in the language are probably wrong.

When people are blindly stupid about grammar and usage, it makes me angry, but when people have been cowed into believing that no matter what they do, they’re always going to get it wrong, it just makes me sad. There’s something seriously wrong with the way English grammar is taught today. At some point the system was taken over by people who favored literary analysis over any sort of teaching of the principles of the language, so what little grammar is being taught is fundamentally flawed because no one has taken the time to learn it properly before they attempt to teach it to others. It’s a subject that’s been highly abused, and too often it’s used for abusive purposes.

Unfortunately, I have no idea what the solution is. I may not be a grammar nazi, but neither am I a grammar anarchist. All I know is that I don’t like the way things are, and I think it’s time for a change.

Standards of Usage

Grammar is a poorly understood and much-maligned word. It’s usually used to mean the set of rules governing all aspects of language—a tedious and convoluted list of strictures and prohibitions telling us what we should and shouldn’t say or write. It’s a subject that most people do not like and one that they do not find very useful in real life. It’s also one that most people are very insecure about. For example, whenever people learn that I’m an editor, I typically get one of the following reactions: (1) they have no idea what an editor does, or (2) they get nervous and say, “Well, I ain’t got no good grammar.” It’s really amazing how many people have uttered that exact phrase.

Hated though it may be, grammar is an important subject, especially in a world that relies more and more on written communication. So, first off, let’s define some terms. Grammar is the study of word forms (morphology) and sentence structure (syntax). These are fields that native speakers typically have no problems with. We all know how to form plurals and past tenses and how to string words together to form a sentence. Usage is the far more relevant field, the one that tells us not to use ain’t or double negatives. Style is the set of rules governing more aesthetic issues like punctuation and capitalization.

The problem is that usage rules are not handed down from on high (unless you consider teachers and editors to be prophets, that is). In many ways, usage rules are like fashion rules: they are a generally accepted set of guidelines intended to keep you from looking stupid. Of course, usage guidelines are far more enduring than fashion rules. Unfortunately, many of those long-lived rules are more apocryphal than canonical, and many of the self-proclaimed prophets are false.

So it is with hesitation and much hemming and hawing that I answer questions like the recent one from my sister: “Should it be ‘than I’ instead of ‘than me’?” To me, these are never simple yes-or-no questions. It all depends on the context, the level of formality, the preferences of the speaker, and so on. When speaking to friends, it would sound odd—nay, wrong—to use the more formal “than I.” Grammar is a fairly straightforward field, but usage is a quagmire of history, context, pontification, and pedantry.

But how can an editor and a graduate in English language speak such blasphemy? Shouldn’t I be the one defending good wholesome rules like “than I”? Again, this is not a simple yes-or-no question. I may be a defender of the language, but I defend what I have come to believe is right, not what outdated textbooks, fastidious English teachers, or long-dead pedagogues decreed as correct.

The English I defend is good and simple. It is not full of Latinate rules that never applied to English. It is not full of petty distinctions that fly in the face of the usage of educated speakers. It is elegant, simple, clear, and free from awkwardness. It is English as it is and should be, not English as it never was.

And that, my friends, is my standard of usage.