Arrant Pedantry

Scriptivists Revisited

Before I begin: I know—it’s been a terribly, horribly, unforgivably long time since my last post. Part of it is that I’m often busy with grad school and work and family, and part of it is that I’ve been thinking an awful lot lately about prescriptivism and descriptivism and linguists and editors and haven’t really known where to begin.

I know that I’ve said some harsh things about prescriptivists before, but I don’t actually hate prescriptivism in general. As I’ve said before, prescriptivism and descriptivism are not really diametrically opposed, as some people believe they are. Stan Carey explores some of the common ground between the two in a recent post, and I think there’s a lot more to be said about the issue.

I think it’s possible to be a descriptivist and prescriptivist simultaneously. In fact, I think it’s difficult if not impossible to fully disentangle the two approaches. The fact is that many or most prescriptive rules are based on observed facts about the language, even though those facts may be incomplete or misunderstood in some way. Very seldom does anyone make up a rule out of whole cloth that bears no resemblance to reality. Rules often arise because someone has observed a change or variation in the language and is seeking to slow or reverse that change (as in insisting that “comprised of” is always an error) or to regularize the variation (as in insisting that “which” be used for nonrestrictive relative clauses and “that” for restrictive ones).

One of my favorite language blogs, Motivated Grammar, declares “Prescriptivism must die!” but to be honest, I’ve never quite been comfortable with that slogan. Now, I love a good debunking of language myths as much as the next guy—and Gabe Doyle does a commendable job of it—but not all prescriptivism is a bad thing. The impulse to identify and fix potential problems with the language is a natural one, and it can be used for both good and ill. Just take a look at the blogs of John E. McIntyre, Bill Walsh, and Jan Freeman for examples of well-informed, sensible language advice. Unfortunately, as linguists and many others know, senseless language advice is all too common.

Linguists often complain about and debunk such bad language advice—and rightly so, in my opinion—but I think in doing so they often make the mistake of dismissing prescriptivism altogether. Too often linguists view prescriptivism as an annoyance to be ignored or as a rival approach that must be quashed, but either way they miss the fact that prescriptivism is a metalinguistic phenomenon worth exploring and understanding. And why is it worth exploring? Because it’s an essential part of how ordinary speakers—and even linguists—use language in their daily lives, whether they realize it or not.

Contrary to what a lot of linguists say, language isn’t really a natural phenomenon—it’s a learned behavior. And as with any other human behavior, we generally strive to make our language match observed standards. Or as Emily Morgan so excellently says in a guest post on Motivated Grammar, “Language is something that we as a community of speakers collectively create and reinvent each time we speak.” She says that this means that language is “inextricably rooted in a descriptive generalization about what that community does,” but it also means that it is rooted in prescriptive notions of language. Because when speakers create and reinvent language, they do so by shaping their language to fit listeners’ expectations.

That is, for the most part, there’s no difference in speakers’ minds between what they should do with language and what they do do with language. They use language the way they do because they feel as though they should, and this in turn reinforces the model that influences everyone else’s behavior. I’ve often reflected on the fact that style guides like The Chicago Manual of Style will refer to dictionaries for spelling issues—thus prescribing how to spell—but these dictionaries simply describe the language found in edited writing. Description and prescription feed each other in an endless loop. This may not be mathematical logic, but it is a sort of logic nonetheless. Philosophers love to say that you can’t derive an ought from an is, and yet people do it all the time. If you want to fit in with a certain group, then you should behave in such a way as to be accepted by that group, and that group’s behavior is simply an aggregate of the behaviors of everyone else trying to fit in.

And at this point, linguists are probably thinking, “And people should be left alone to behave the way they wish to behave.” But leaving people alone means letting them decide which behaviors to favor and which to disfavor—that is, which rules to create and enforce. Linguists often criticize those who create and propagate rules, as if such rules are bad simply as a result of their artificiality, but, once again, the truth is that all language is artificial; it doesn’t exist until we make it exist. And if we create it, why should we always be coolly dispassionate about it? Objectivity might be great in the scientific study of language, but why should language users approach language the same way? Why should we favor “natural” or “spontaneous” changes and yet disfavor more conscious changes?

This is something that Deborah Cameron addresses in her book Verbal Hygiene (which I highly, highly recommend)—the notion that “spontaneous” or “natural” changes are okay, while deliberate ones are meddlesome and should be resisted. As Cameron counters, “If you are going to make value judgements at all, then surely there are more important values than spontaneity. How about truth, beauty, logic, utility?” (1995, 20). Of course, linguists generally argue that an awful lot of prescriptions do nothing to create more truth, beauty, logic, or utility, and this is indeed a problem, in my opinion.

But when linguists debunk such spurious prescriptions, they miss something important: people want language advice from experts, and they’re certainly not getting it from linguists. The industry of bad language advice exists partly because the people who arguably know the most about how language really works—the linguists—aren’t at all interested in giving advice on language. Often they take the hands-off attitude exemplified in Robert Hall’s book Leave Your Language Alone, crying, “Linguistics is descriptive, not prescriptive!” But in doing so, linguists are nonetheless injecting themselves into the debate rather than simply observing how people use language. If an objective, hands-off approach is so valuable, then why don’t linguists really take their hands off and leave prescriptivists alone?

I think the answer is that there’s a lot of social value in following language rules, whether or not they are actually sensible. And linguists, being the experts in the field, don’t like ceding any social or intellectual authority to a bunch of people that they view as crackpots and petty tyrants. They chafe at the idea that such ill-informed, superstitious advice—what Language Log calls “prescriptivist poppycock”—can or should have any value at all. It puts informed language users in the position of having to decide whether to follow a stupid rule so as to avoid drawing the ire of some people or to break the rule and thereby look stupid to those people. Arnold Zwicky explores this conundrum in a post titled “Crazies Win.”

Note something interesting at the end of that post: Zwicky concludes by giving his own advice—his own prescription—regarding the issue of split infinitives. Is this a bad thing? No, not at all, because prescriptivism is not the enemy. As John Algeo said in an article in College English, “The problem is not that some of us have prescribed (we have all done so and continue to do so in one way or another); the trouble is that some of us have prescribed such nonsense” (“Linguistic Marys, Linguistic Marthas: The Scope of Language Study,” College English 31, no. 3 [December 1969]: 276). As I’ve said before, the nonsense is abundant. Just look at this awful Reader’s Digest column or this article on a Monster.com site for teachers for a couple recent examples.

Which brings me back to a point I’ve made before: linguists need to be more involved in not just educating the public about language, but in giving people the sensible advice they want. Trying to kill prescriptivism is not the answer to the language wars, and truly leaving language alone is probably a good way to end up with a dead language. Exploring it and trying to figure out how best to use it—this is what keeps language alive and thriving and interesting. And that’s good for prescriptivists and descriptivists alike.

Linguists and Straw Men

Sorry I haven’t posted in so long (I know I say that a lot)—I’ve been busy with school and things. Anyway, a couple months back I got a comment on an old post of mine, and I wanted to address it. I know it’s a bit lame to respond to two-month-old comments, but it was on a two-year-old post, so I figure it’s okay.

The comment is here, under a post of mine entitled “Scriptivists.” I believe the comment is supposed to be a rebuttal of that post, but I’m a little confused by the attempt. The commenter apparently accuses me of burning straw men, but ironically, he sets up a massive straw man of his own.

His first point seems to make fun of linguists for using technical terminology, but I’m not sure what that really proves. After all, technical terminology allows you to be very specific about abstract or complicated issues, so how is that really a criticism? I suppose it keeps a lot of laypeople from understanding what you’re saying, but if that’s the worst criticism you’ve got, then I guess I’ve got to shrug my shoulders and say, “Guilty as charged.”

The second point just makes me scratch my head. Using usage evidence from the greatest writers is a bad thing now? Honestly, how do you determine what usage features are good and worthy of emulation if not by looking to the most respected writers in the language?

The last point is just stupid. How often do you see Geoffrey Pullum or Languagehat or any of the other linguistics bloggers whipping out the fact that they have graduate degrees?

And I must disagree with Mr. Kevin S. that the “Mrs. Grundys” of the world don’t actually exist. I’ve heard too many stupid usage superstitions being perpetuated today and seen too much Strunk & White worship to believe that that sort of prescriptivist is extinct. Take, for example, Sonia Sotomayor, who says that split infinitives make her “blister.” Or take one of my sister-in-law’s professors, who insisted that her students could not use the following features in their writing:

  • The first person
  • The passive voice
  • Phrases like “this paper will show . . .” or “the data suggest . . .” because, according to her, papers are not capable of showing and data is not capable of suggesting.

How, exactly, are you supposed to write an academic paper without resorting to one of those devices—none of which, by the way, are actually wrong—at one time or another? These proscriptions were absolutely nonsensical, supported by neither logic nor usage nor common sense.

There’s still an awful lot of absolute bloody nonsense coming from the prescriptivists of the world. (Of course, this is not to say that all or even most prescriptivists are like this; take, for example, the inimitable John McIntyre, who is one of the most sensible and well-informed prescriptivists I’ve ever encountered.) And sorry to say, I don’t see the same sort of stubborn and ill-informed arguments coming from the descriptivists’ camp. And I’m pretty sure I’ve never seen a descriptivist who resembled the straw man that Kevin S. constructed.

Rules Are Rules

Recently I was involved in an online discussion about the pronunciation of the word the before vowels. Someone wanted to know if it was pronounced /ði/ (“thee”) before vowels only in singing, or if it was a general rule of speech as well. His dad had said it was a rule, but he had never heard it before and wondered if maybe it was more of a convention than a rule. Throughout the conversation, several more people expressed similar opinions—they’d never heard this rule before and they doubted whether it was really a rule at all.

There are a few problems here. First of all, not everybody means exactly the same thing when they talk about rules. It’s like when laymen dismiss evolution because it’s “just a theory.” They forget that gravity is also just a theory. And when laymen talk about linguistic rules, they usually mean prescriptive rules. Prescriptive rules usually state that a particular thing should be done, which typically implies that it often isn’t done.

But when linguists talk about rules, they mean descriptive ones. Think of it this way: if you were going to teach a computer how to speak English fluently, what would it need to know? Well, one tiny little detail that it would need to know is that the word the is pronounced with a schwa (/ðə/) except when it is stressed or followed by a vowel. Nobody needs to be taught this rule, except for non-native speakers, because we all learn it by hearing it when we’re children. And thus it follows that it’s never taught in English class, so it throws some people for a bit of a loop when they hear it called a rule.
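
To make the descriptive flavor of that rule concrete, here’s a minimal sketch in Python (my own illustration, not anything from the original post). The function name and the spelling-based vowel check are assumptions for the sake of the example; a real system would have to look at the following vowel sound rather than the letter, so words like “hour” or “unicorn” would be misjudged here.

    # A rough, hypothetical encoding of the descriptive rule discussed above:
    # "the" is /ðə/ (schwa) unless it is stressed or the next word begins
    # with a vowel, in which case it is /ði/ ("thee").
    # NOTE: checking the first letter is a crude stand-in for checking the
    # first *sound*, so "hour" and "unicorn" come out wrong in this sketch.
    def pronounce_the(next_word: str, stressed: bool = False) -> str:
        """Return an approximate pronunciation of 'the' before next_word."""
        starts_with_vowel_letter = bool(next_word) and next_word[0].lower() in "aeiou"
        if stressed or starts_with_vowel_letter:
            return "/ði/"  # "thee"
        return "/ðə/"      # schwa

    print(pronounce_the("apple"))                # /ði/, "thee apple"
    print(pronounce_the("book"))                 # /ðə/, "thuh book"
    print(pronounce_the("book", stressed=True))  # /ði/, "THEE book"

Native speakers apply something like this automatically and unconsciously; spelling it out is only necessary when you try to make the knowledge explicit, which is exactly what a descriptive rule does.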

But even on the prescriptivist side of things, not all rules are created equal. There are a lot of rules that are generally covered in English classes, and they’re usually taught as simple black-and-white declarations: x is right and y is wrong. When people ask me questions about language, they usually seem to expect answers along these lines. But many issues of grammar and usage are complicated and have no clear right or wrong answer. Same with style—open up two different style guides, and you’ll often find two (or more) ways to punctuate, hyphenate, and capitalize. A lot of times these things boil down to issues of formality, context, and personal taste.

Unfortunately, most of us hear language rules expressed as inviolable laws all the way through public school and probably into college. It’s hard to overcome a dozen years or more of education on a subject and start to learn that maybe things aren’t as simple as you’ve been told, that maybe those trusted authorities and gatekeepers of the language, the English teachers, were not always well-informed. But as writing becomes more and more important in modern life, it likewise becomes more important to teach people meaningful, well-founded rules that aren’t two centuries old. It’s time for English class to get educated.

One Fewer Usage Error

In my mind, the pair less and fewer illustrates quite well virtually all of the problems of prescriptivism: the codification of the opinion of some eighteenth-century writer, the disregard for well over a millennium of usage, the insistence on the utility of a superfluous distinction, and the oversimplification of the original rule leading to hypercorrection.

I found a very lovely example of hypercorrection the other day in The New York Times: “The figures are adjusted for one fewer selling day this September than a year ago.” Not even stuffy constructions like “10 items or fewer” make me cringe the way that made me cringe.

No usage or style guide that I know of recommends this usage. In my experience, most guides that enforce the less/fewer distinction grant exceptions when dealing with things like money, distance, or time or when less or fewer follows the word one. And why, exactly, is one an exception? I’m really not sure, but my best guess is that “one fewer” sounds so strange that even the most strictly logical prescriptivists admit that less must be the correct choice.

Merriam-Webster’s Dictionary of English Usage has an excellent entry on less/fewer, but surprisingly, regarding the “one fewer” issue it says only, “And of course [less] follows one.” Perhaps the use of “one fewer” is so rare that the editors didn’t think to say more about it. Obviously someone should’ve said something to the copy editor at The New York Times.

Arrant Pedantry

When you study and work with language for a living, a lot of people naturally assume that you’re some sort of scowling, finger-wagging pedant who is secretly keeping a list of all the grammatical and usage errors they make. It’s difficult to make people understand that you only correct errors when you’re on the clock, and even then you sometimes do it grudgingly because that’s what the style guide says, not necessarily because you believe you’re actually improving the text. It’s even harder to make people understand that what you’re really interested in is understanding how language works, not telling people that they’re using the language wrong or that they’re somehow lacking or inferior because they split an infinitive and dangled a participle.

The problem is that too many people have had bad experiences with just such language pedants, the Miss Thistlebottoms of the world. Now, I have to say that I do believe that there should be standards in the language and that they should be taught to students and followed by writers and editors (when appropriate).

The problem is that the standards in English are too often defined or enforced by people who apparently pull rules out of thin air. These grammatical fussbudgets aren’t interested in a standard based on the usage of educated speakers and writers; rather, they seem to prefer rules that set them apart from the unwashed masses, that give them a reason to judge and condemn. The Elements of Style is their bible, Strunk and White are their prophets, and they sneer down their noses at those not of their faith. The objective, empirical truth of English usage is of no interest to them; they have faith in their false gospel of grammar.

Why do these grammar nazis bother me so? For a lot of reasons, actually. First of all, because a lot of people assume that I’m one of them, and that is simply not true. I was never much of a grammar nazi even when I was new to the field of editing; I favored the spirit of the law over the letter of the law. I still enjoy editing, and I have some very good friends who are excellent editors, but too many people in that profession are either incompetent or tyrannical (or likely both).

Second, I have a strong respect for the truth. Most grammaristos will believe whatever falsehoods they happened to hear in their English classes. If an English teacher tells them that it’s always wrong to split an infinitive, to strand a preposition, or to use they with a singular antecedent, they will unquestioningly accept it as gospel truth, no matter how nonsensical it may be. Any rational person could do a little research and find all three of those “rules” broken by virtually all the finest English writers of the last several centuries. You’d think this would be enough to convince them that such rules are faulty, but the grammar pedants will usually respond with a retort like “Just because Shakespeare made mistakes doesn’t make it alright.” You simply can’t argue with people like that.

And as if those rules weren’t ridiculous enough, there are teachers in the world who tell their students that it’s outright wrong to use the final serial comma or to use the subordinator that when it could be omitted. These sorts of rules only serve to teach students that English is a difficult, frustrating subject that doesn’t make sense. These students then spend the rest of their lives fearing anyone in a position of grammatical authority and believing that many of their natural tendencies in the language are probably wrong.

When people are blindly stupid about grammar and usage, it makes me angry, but when people have been cowed into believing that no matter what they do, they’re always going to get it wrong, it just makes me sad. There’s something seriously wrong with the way English grammar is taught today. At some point the system was taken over by people who favored literary analysis over any sort of teaching of the principles of the language, so what little grammar is being taught is fundamentally flawed because no one has taken the time to learn it properly before they attempt to teach it to others. It’s a subject that’s been highly abused, and too often it’s used for abusive purposes.

Unfortunately, I have no idea what the solution is. I may not be a grammar nazi, but neither am I a grammar anarchist. All I know is that I don’t like the way things are, and I think it’s time for a change.
