Arrant Pedantry


Why Teach Grammar?

Today is National Grammar Day, and I’ve been thinking a lot lately about what grammar is and why we study it. Last week in the Atlantic, Michelle Navarre Cleary wrote that we should do away with diagramming sentences and other explicit grammar instruction. Her argument, in a nutshell, is that grammar instruction not only doesn’t help students write better, but it actually teaches them to hate writing.

It’s really no surprise—as an editor and a student of language, I’ve run into a lot of people who never learned the difference between a preposition and a participle and are insecure about their writing or their speech. I once had a friend who was apparently afraid to talk to me because she thought I was silently correcting everything she said. When I found out about it, I reassured her that I wasn’t; not only had I never noticed anything wrong with the way she talked, but I don’t worry about correcting people unless they’re paying me for it. But I worried that this was how people saw me: a know-it-all jerk who silently judged everyone else for their errors. I love language, and it saddened me to think that there are people who find it not fascinating but frustrating.

But given the state of grammar instruction in the United States today, it’s not hard to see why a lot of people feel this way. I learned hardly any sentence diagramming until I got to college, and my public school education in grammar effectively stopped in eighth or ninth grade when I learned what a prepositional phrase was. In high school, our grammar work consisted of taking sentences like “He went to the store” and changing them to “Bob went to the store” (because you can’t use he without an antecedent; never mind that such a sentence would not occur in isolation and would surely make sense in context).

Meanwhile, many students are marked down on their papers for supposed grammar mistakes (which are usually matters of spelling, punctuation, or style): don’t use contractions, don’t start a sentence with a conjunction, don’t use any form of the verb be, don’t write in the first person, don’t refer to yourself in the third person, don’t use the passive voice, and on and on. Of course most students are going to come out of writing class feeling insecure. They’re punished for failing to master rules that don’t make sense.

And it doesn’t help that there’s often a disconnect between what the rules say good writing is and what it actually is. Good writing breaks these rules all the time, and following all the rules does little if anything to make bad writing good. We know the usual justifications: students have to master the basics before they can become experts, and once they become experts, they’ll know when it’s okay to break the rules.

But these justifications presuppose that teaching students not to start a sentence with a conjunction or not to use the passive voice has something to do with good writing, when it simply doesn’t. I’ve said before that we don’t consider whether we’re giving students training wheels or just putting sticks in their spokes. Interestingly, Cleary uses a similar argument in her Atlantic piece: “Just as we teach children how to ride bikes by putting them on a bicycle, we need to teach students how to write grammatically by letting them write.”

I’m still not convinced, though, that learning grammar has much at all to do with learning to write. Having a PhD in linguistics doesn’t mean you know how to write well, and being an expert writer doesn’t mean you know anything about syntax and morphology beyond your own native intuition. And focusing on grammar instruction may distract from the more fundamental writing issues of rhetoric and composition. So why worry about grammar at all if it has nothing to do with good writing? Language Log’s Mark Liberman said it well:

We don’t put chemistry into the school curriculum because it will make students better cooks, or even because it might make them better doctors, much less because we need a relatively small number of professional chemists. We believe (I hope) that a basic understanding of atoms and molecules is knowledge that every citizen of the modern world should have.

It may seem like a weak defense in a world that increasingly focuses on marketable skills, but it’s maybe the best justification we have. Language is amazing; no other animal has the capacity for expression that we do. Language is so much more than a grab-bag of peeves and strictures to inflict on freshman writing students; it’s a fundamental part of who we are as a species. Shouldn’t we expect an educated person to know something about it?

So yes, I think we should teach grammar, not because it will help people write better, but simply because it’s interesting and worth knowing about. But we need to recognize that it doesn’t belong in the same class as writing or literature; though it certainly has connections to both, linguistics is a separate field and should be treated as such. And we need to teach grammar not as something to hate or even as something to learn as a means to an end, but as a fascinating and complex system to be discovered and explored for its own sake. In short, we need to teach grammar as something to love.


Who Edits the Editors?

Today is National Grammar Day, in case you hadn’t heard, and to celebrate I want to take a look at some of those who hold themselves up to be defenders of the English language: copy editors. A few weeks ago, the webcomic XKCD published this comic mocking the participants in a Wikipedia edit war over the title of Star Trek into Darkness. The question was whether “into” in the title should be capitalized. Normally, prepositions in titles are lowercase, but if there’s an implied colon after “Star Trek”, then “Into Darkness” is technically a subtitle, and the first word of a subtitle gets capitalized. As the comic noted, forty thousand words of argument back and forth had been written, but no consensus was in sight. (The discussion seems to be gone now.)

The Wikipedia discussion is an apt illustration of one of the perils of editing: long, drawn-out, and seemingly irresolvable discussions about absolutely trivial things. Without prior knowledge about whether “into darkness” is a subtitle or part of the title, there’s no clear answer. It’s a good example of Parkinson’s law of triviality at work. Everyone wants to put in their two cents’ worth, but the debate will never end unless someone with authority simply makes an arbitrary but final decision one way or the other.

I wouldn’t have thought much else of that discussion if not for the fact that it was picked up by Nathan Heller in a column called “Copy-Editing the Culture” over at Slate. Someone in the comments cited one of my posts—“It’s just a joke. But no, seriously”—so I followed the link back and read the column. And what I found dismayed me.

The article begins (after a couple of paragraphs of self-indulgence) by claiming that “it is . . . entirely unclear what the title is trying to communicate.” This complaint is puzzling, since it seems fairly obvious what the title is supposed to mean, but the problems with the column become clearer as the reasoning becomes murkier: “Are there missing words—an implied verb, for example? The grammatical convention is to mark such elisions with a comma: Star Trek Going Into Darkness could become, conceivably, Star Trek, Into Darkness”. An implied verb? Marking such elisions with a comma? What on earth is he on about? I don’t see any reason why the title needs a verb, and I’ve never heard of marking elided verbs with a comma. Marking an elided “and” in headlines, perhaps, but that’s it.

[Update: It occurred to me what he probably meant, and I feel stupid for not seeing it. It’s covered under 6.49 in the 16th edition of Chicago. A comma may be used to signal the elision of a word or words easily understood from context, though what Chicago doesn’t say is that it must be a repeated word or words, and that’s crucial. One example it gives is In Illinois there are seventeen such schools; in Ohio, twenty; in Indiana, thirteen. The comma here indicates the elision of there are. The Star Trek, Into Darkness example doesn’t work because it’s a title with no other context. There aren’t any repeated words that are understood from context and are thus candidates for elision. I could say, “Star Wars is going into light; Star Trek, into darkness,” but Star Trek, Into Darkness simply doesn’t make sense under any circumstances, which is probably why I didn’t get what Heller meant.]

The article continues to trek into darkness with ever more convoluted reasoning: “Or perhaps the film’s creators intend Star Trek to be understood as a verb—to Star Trek—turning the title into an imperative: ‘Star Trek into darkness!’” Yes, clearly that’s how it’s to be understood—as an imperative! I suppose Journey to the Center of the Earth is intended to be read the same way. But Heller keeps on digging: “Perhaps those two words [Star Trek] are meant to function individually [whatever that means]. . . . If trek is a verb—“We trek into darkness”—what, precisely, is going on with the apparent subject of the sentence, star? Why is it not plural, to match the verb form: Stars Trek Into Darkness? Or if trek is a noun—“His trek into darkness”—where is the article or pronoun that would give the title sense: A Star Trek Into Darkness? And what, for that matter, is a star trek?”

This is perhaps the stupidest passage about grammar that I’ve ever read. Star Trek is a noun-noun compound, not a noun and a verb, as is clear from their lack of grammatical agreement. A star trek is a trek among the stars. Titles don’t need articles—remember Journey to the Center of the Earth? (Yes, I know that it is sometimes translated as A Journey to the Center of the Earth, but the article is optional and doesn’t exist in the original French.)

I know that some of you are thinking, “It’s a joke! Lighten up!” Obviously this argument has already occurred in the comments, which is why my post was linked to. I’ll grant that it’s probably intended to be a joke, but if so it’s the lamest, most inept language-related joke I’ve ever read. It’s like a bookkeeper feigning confusion about the equation 2 + 2 = 4, asking, “Shouldn’t it be 2 + 2 = 22?” Not only does Heller’s piece revel in grammatical ineptitude, but it reinforces the stereotype of editors as small-minded and officious pedants.

I’ve worked as a copy editor and layout artist for over ten years, and I’ve worked with a lot of different people in that time. I’ve known some really great editors and some really bad ones, and I think that even the best of us tend to get hung up on trivialities like whether to capitalize into far more than we should. When I first saw the Slate column, I hoped that it would address some of those foibles, but instead it took a turn for the insipid and never looked back. I looked at a few more entries in the column, and they all seem to work about the same way, seizing on meaningless trivialities and trying to make them seem significant.

So I have a plea for you this Grammar Day: stop using grammar as the butt of lame jokes or as a tool for picking apart people or things that you don’t like. And if that is how you’re going to use it, at least try to make sure you know what you’re talking about first. You’re making the rest of us look bad.


Rules, Evidence, and Grammar

In case you haven’t heard, it’s National Grammar Day, and that seemed as good a time as any to reflect a little on the role of evidence in discussing grammar rules. (Goofy at Bradshaw of the Future apparently had the same idea.) A couple of months ago, Geoffrey Pullum made the argument in this post on Lingua Franca that it’s impossible to talk about what’s right or wrong in language without considering the evidence. Is singular they grammatical and standard? How do you know?

For most people, I think, the answer is pretty simple: you look it up in a source that you trust. If the source says it’s grammatical or correct, it is. If it doesn’t, it isn’t. Singular they is wrong because many authoritative sources say it is. End of story. And if you try to argue that the sources aren’t valid or reliable, you’re labeled an anything-goes type who believes we should just toss all the rules out the window and embrace linguistic anarchy.

The question is, where did these sources get their authority to say what’s right and wrong?

That is, when someone says that you should never use they as a singular pronoun or start a sentence with hopefully or use less with count nouns, why do you suppose that the rules they put forth are valid? The rules obviously haven’t been inscribed on stone tablets by the finger of the Lord, but they have to come from somewhere. Every language is different, and languages are constantly changing, so I think we have to recognize that there is no universal, objective truth when it comes to grammar and usage.

David Foster Wallace apparently fell into the trap of thinking that there was, unfortunately. In his famous Harper’s article “Tense Present: Democracy, English, and the Wars over Usage,” he quotes the introduction to The American College Dictionary, which says, “A dictionary can be an ‘authority’ only in the sense in which a book of chemistry or of physics or of botany can be an ‘authority’: by the accuracy and the completeness of its record of the observed facts of the field examined, in accord with the latest principles and techniques of the particular science.”

He retorts,

This is so stupid it practically drools. An “authoritative” physics text presents the results of physicists’ observations and physicists’ theories about those observations. If a physics textbook operated on Descriptivist principles, the fact that some Americans believe that electricity flows better downhill (based on the observed fact that power lines tend to run high above the homes they serve) would require the Electricity Flows Better Downhill Theory to be included as a “valid” theory in the textbook—just as, for Dr. Fries, if some Americans use infer for imply, the use becomes an ipso facto “valid” part of the language.

The irony of his first sentence is almost overwhelming. Physics is a set of universal laws that can be observed and tested, and electricity works regardless of what anyone believes. Language, on the other hand, is quite different. In fact, Wallace tacitly acknowledges the difference—without explaining his apparent contradiction—immediately after: “It isn’t scientific phenomena they’re tabulating but rather a set of human behaviors, and a lot of human behaviors are—to be blunt—moronic. Try, for instance, to imagine an ‘authoritative’ ethics textbook whose principles were based on what most people actually do.”[1]

Now here he hits on an interesting question. Any argument about right or wrong in language ultimately comes down to one of two options: it’s wrong because it’s absolutely, objectively wrong, or it’s wrong because arbitrary societal convention says it’s wrong. The former is untenable, but the latter doesn’t give us any straightforward answers. If there is no objective truth in usage, then how do we know what’s right and wrong?

Wallace tries to make the argument about ethics: sloppy language leads to real problems, like people accidentally eating poison mushrooms. But look at his gargantuan list of peeves and shibboleths on the first page of the article. How many of them lead to real ethical problems? Does singular they pose any kind of ethical problem? What about sentential hopefully or less with count nouns? I don’t think so.

So if there’s no ethical problem with disputed usage, then we’re still left with the question, what makes it wrong? Here we get back to Pullum’s attempt to answer the question: let’s look at the evidence. And, because we can admit, like Wallace, that some people’s behavior is moronic, let’s limit ourselves to looking at the evidence from those speakers and writers whose language can be said to be most standard. What we find even then is that a lot of the usage and grammar rules that have been put forth, from Bishop Robert Lowth to Strunk and White to Bryan Garner, don’t jibe with actual usage.

Edward Finegan seized on this discrepancy in an article a few years ago. In discussing sentential hopefully, he quotes Garner as saying that it is “all but ubiquitous—even in legal print. Even so, the word received so much negative attention in the 1970s and 1980s that many writers have blacklisted it, so using it at all today is a precarious venture. Indeed, careful writers and speakers avoid the word even in its traditional sense, for they’re likely to be misunderstood if they use it in the old sense.”[2] Finegan says, “I could not help but wonder how a reflective and careful analyst could concede that hopefully is all but ubiquitous in legal print and claim in the same breath that careful writers and speakers avoid using it.”[3]

The problem when you start questioning the received wisdom on grammar and usage is that you make a lot of people very angry. In a recent conversation on Twitter, Mignon Fogarty, aka Grammar Girl, said, “You would not believe (or maybe you would) how much grief I’m getting for saying ‘data’ can sometimes be singular.” I responded, “Sadly, I can. For some people, grammar is more about cherished beliefs than facts, and they don’t like having them challenged.” They don’t want to hear arguments about authority and evidence and deriving rules from what educated speakers actually use. They want to believe that there are deeper truths that justify their preferences and peeves, and that’s probably not going to change anytime soon. But for now, I’ll keep trying.

[1] David Foster Wallace, “Tense Present: Democracy, English, and the Wars over Usage,” Harper’s Monthly, April 2001, 47.
[2] Bryan A. Garner, A Dictionary of Modern Legal Usage, 2nd ed. (New York: Oxford University Press, 1995).
[3] Edward Finegan, “Linguistic Prescription: Familiar Practices and New Perspectives,” Annual Review of Applied Linguistics 23 (2003): 216.


Reflections on National Grammar Day

I know I’m a week late to the party, but I’ve been thinking a lot about National Grammar Day and want to blog about it anyway. Please forgive me for my untimeliness.

First off, I should say for those who don’t know me that I work as a copy editor. I clearly understand the value of using Standard American English when it is called for, and I know its rules and conventions quite well. I’m also a student of linguistics, and I find language fascinating. I understand the desire to celebrate language and to promote its good use, but unfortunately it appears that National Grammar Day does neither.

If you go to National Grammar Day’s web site and click on “About SPOGG” at the top of the page, you find this:

The Society for the Promotion of Good Grammar is for pen-toters appalled by wanton displays of Bad English. . . . SPOGG is for people who crave good, clean English — sentences cast well and punctuated correctly. It’s about clarity.

I can get behind those last two sentences (noting, of course, that this description seems to exclude spoken English), but the first obviously flies in the face of the society’s name—is it trying to promote “good” (read “standard”) grammar, or simply ridicule what it deems to be displays of bad English? Well, if you read the SPOGG Blog, it appears to be the latter. None of the posts on the front page seem to deal with clarity; in each case it seems quite clear what the author intended, so obviously SPOGG is not about clarity after all.

In fact, what I gather from this post in particular is that SPOGG is more about the social value of using Standard English than it is about anything else. The message here is quite clear: using nonstandard English is like having spinach in your teeth. It’s like wearing a speedo on the bus. SPOGG isn’t about good, clean English or about clarity. It’s only about mocking those who violate a set of taboos. By following the rules, you signal to others that you belong to a certain group, one whose members care about linguistic manners in the same way that some people care about not putting their elbows on the table while they eat.

And that’s perfectly fine with me. If you delight in fussy little rules about spelling and punctuation, that’s your choice. But I think it’s important to distinguish between the rules that are truly important and the guidelines and conventions that are more flexible and optional. John McIntyre made this point quite well in his post today on his blog, You Don’t Say.

Unfortunately, I find that SPOGG’s founder, Martha Brockenbrough, quite frequently fails to make this distinction. She also shows an appalling lack of knowledge on issues like how language changes, what linguists do, and, to top it all off, what grammar actually is. Of course, she falls back on the “Geez, can’t you take a joke?” defense, which doesn’t really seem to fly, as Arnold Zwicky and others have already noted.

As I said at the start, I can appreciate the desire to celebrate grammar. I just wish National Grammar Day actually did that.