Arrant Pedantry

It’s All Grammar—So What?

It’s a frequent complaint among linguists that laypeople use the term grammar in such a loose and unsystematic way that it’s more or less useless. They say that it’s overly broad, encompassing many different types of rules, and that it allows people to confuse things as different as syntax and spelling. They insist that spelling, punctuation, and ideas such as style or formality are not grammar at all, that grammar is really just the rules of syntax and morphology that define the language.

Arnold Zwicky, for instance, has complained that grammar as it’s typically used refers to nothing more than a “grab-bag of linguistic peeve-triggers”. I think this is an overly negative view; yes, there are a lot of people who peeve about grammar, but I think that most people, when they talk about grammar, are thinking about how to say things well or correctly.

Some people take linguists’ insistence on the narrower, more technical meaning of grammar as a sign of hypocrisy. After all, they say, with something of a smirk, shouldn’t we just accept the usage of the majority? If almost everyone uses grammar in a broad and vague way, shouldn’t we consider that usage standard? Linguists counter that this really is an important distinction, though I think it’s fair to say that they have a personal interest here; they teach grammar in the technical sense and are dismayed when people misunderstand what they do.

I’ve complained about this myself, but I’m starting to wonder whether it’s really something to worry about. (Of course, I’m probably doubly a hypocrite, what with all the shirts I sell with the word grammar on them.) After all, we see similar splits between technical and popular terminology in a lot of other fields, and they seem to get by just fine.

Take the terms fruit and vegetable, for instance. In popular use, fruits are generally sweeter, while vegetables are more savory or bitter. And while most people have probably heard the argument that tomatoes are actually fruits, not vegetables, they might not realize that squash, eggplants, peppers, peas, green beans, nuts, and grains are fruits too, at least by the botanical definition. And vegetable doesn’t even have a botanical definition—it’s just any part of a plant (other than fruits or seeds) that’s edible. It’s not a natural class at all.

Grammar shows a similar split between its technical and popular senses. In a bit of editorializing, the Oxford English Dictionary adds this note after its first definition of grammar:

As above defined, grammar is a body of statements of fact—a ‘science’; but a large portion of it may be viewed as consisting of rules for practice, and so as forming an ‘art’. The old-fashioned definition of grammar as ‘the art of speaking and writing a language correctly’ is from the modern point of view in one respect too narrow, because it applies only to a portion of this branch of study; in another respect, it is too wide, and was so even from the older point of view, because many questions of ‘correctness’ in language were recognized as outside the province of grammar: e.g. the use of a word in a wrong sense, or a bad pronunciation or spelling, would not have been called a grammatical mistake. At the same time, it was and is customary, on grounds of convenience, for books professedly treating of grammar to include more or less information on points not strictly belonging to the subject.

There are a few points here to consider. The definition of grammar has not been limited solely to syntax and morphology for many years. Once it started branching out into notions of correctness, it made sense to treat grammar, usage, spelling, and pronunciation together. From there it’s a short leap to calling the whole collection grammar, since there isn’t really another handy label. And since few people are taught much in the way of syntax and morphology unless they’re majoring in linguistics, it’s really no surprise that the loose sense of grammar predominates. I’ll admit, however, that it’s still a little exasperating to see lists of “grammar rules everyone gets wrong” that turn out to be just spelling rules or, at best, misused words.

The root of the problem is that laypeople use words in ways that are useful and meaningful to them, and these ways don’t always jibe with scientific facts. It’s the same thing with grammar; laypeople use it to refer to language rules in general, especially the ones they’re most conscious of, which tend to be the ones that are the most highly regulated—usage, spelling, and style. Again, issues of syntax, morphology, semantics, usage, spelling, and style don’t constitute a natural class, but it’s handy to have a word that refers to the aspects of language that most people are conscious of and concerned with.

I think there still is a problem, though, and it’s that most people generally have a pretty poor understanding of things like syntax, morphology, and semantics. Grammar isn’t taught much in schools anymore, so many people graduate from high school and even college without much of an understanding of grammar beyond spelling and mechanics. I got out of high school without knowing anything more advanced than prepositional phrases. My first grammar class in college was a bit of a shock, because I’d never even learned about things like the passive voice or dependent clauses before that point, so I have some sympathy for those people who think that grammar is mostly just spelling and punctuation with a few minor points of usage or syntax thrown in.

So what’s the solution? Well, maybe I’m just biased, but I think it’s to teach more grammar. I know this is easier said than done, but I think it’s important for people to have an understanding of how language works. A lot of people are naturally interested in or curious about language, and I think we do those students a disservice if all we teach them is never to use infer for imply and to avoid the passive voice. Grammar isn’t just a set of rules telling you what not to do; it’s also a fascinatingly complex and mostly subconscious system that governs the singular human gift of language. Maybe we just need to accept the broader sense of grammar and start teaching people all of what it is.

Addendum: I just came across a blog post criticizing the word funner as bad grammar, and my first reaction was “That’s not grammar!” It’s always easier to preach than to practice, but my reaction has me reconsidering my laissez-faire attitude. While it seems handy to have a catch-all term for language errors, regardless of what type they are, it also seems handy—probably more so—to distinguish between violations of the regulative rules and constitutive rules of language. But this leaves us right where we started.

It’s Not Wrong, but You Still Shouldn’t Do It

A couple of weeks ago, in my post “The Value of Prescriptivism,” I mentioned some strange reasoning that I wanted to talk about later—the idea that there are many usages that are not technically wrong, but you should still avoid them because other people think they’re wrong. I used the example of a Grammar Girl post on hopefully wherein she lays out the arguments in favor of disjunct hopefully and debunks some of the arguments against it—and then advises, “I still have to say, don’t do it.” She then adds, however, “I am hopeful that starting a sentence with hopefully will become more acceptable in the future.”

On the face of it, this seems like a pretty reasonable approach. Sometimes the considerations of the reader have to take precedence over the facts of usage. If the majority of your readers will object to your word choice, then it may be wise to pick a different word. But there’s a different way to look at this, which is that the misinformed opinions of a very small but very vocal subset of readers take precedence over the facts and the opinions of others. Arnold Zwicky wrote about this phenomenon a few years ago in a Language Log post titled “Crazies win”.

Addressing split infinitives and the equivocal advice to avoid them unless it’s better not to, Zwicky says that “in practice, [split infinitive as last resort] is scarcely an improvement over [no split infinitives] and in fact works to preserve the belief that split infinitives are tainted in some way.” He then adds that the “only intellectually justifiable advice” is to “say flatly that there’s nothing wrong with split infinitives and you should use them whenever they suit you”. I agree wholeheartedly, and I’ll explain why.

The problem with the it’s-not-wrong-but-don’t-do-it philosophy is that, while it feels like a moderate, open-minded, and more descriptivist approach in theory, it is virtually indistinguishable from the it’s-wrong-so-don’t-do-it philosophy in practice. You can cite all the linguistic evidence you want, but it’s still trumped by the fact that you’d rather avoid annoying that small subset of readers. It pays lip service to the idea of descriptivism informing your prescriptions, but the prescription is effectively the same. All you’ve changed is the justification for avoiding the usage.

Even a more neutral, descriptive piece like this New York Times “On Language” article on singular they ends with a wistful, “It’s a shame that grammarians ever took umbrage at the singular they,” adding, “Like it or not, the universal they isn’t universally accepted — yet. Its fate is now in the hands of the jury, the people who speak the language.” Even though the authors seem to be avoiding giving out advice, it’s still implicit in the conclusion. It’s great to inform readers about the history of usage debates, but what readers will most likely come away with is the conclusion that singular they is wrong—or at least tainted—so they shouldn’t use it.

The worst thing about this waffly kind of advice, I think, is that it lets usage commentators duck responsibility for influencing usage. They tell you all the reasons why it should be all right to use hopefully or split infinitives or singular they, but then they sigh and put them away in the linguistic hope chest, telling you that you can’t use them yet, but maybe someday. Well, when? If all the usage commentators are saying, “It’s not acceptable yet,” at what point are they going to decide that it suddenly is acceptable? If you always defer to the peevers and crazies, it will never be acceptable (unless they all happen to die off without transmitting their ideas to the next generation).

And furthermore, I’m not sure it’s a worthwhile endeavor to try to avoid offending or annoying anyone in your writing. It reminds me of Aesop’s fable of the man, the boy, and the donkey: people will always find something to criticize, so it’s impossible to behave (or write) in such a way as to always avoid criticism. As the old man at the end says, “Please all, and you will please none.” You can’t please everyone, so you have to make a choice: will you please the small but vocal peevers, or the more numerous reasonable people? If you believe there’s nothing technically wrong with hopefully or singular they, maybe you should stand by those beliefs instead of caving to the critics. And perhaps through your reasonable but firm advice and your own exemplary writing, you’ll help a few of those crazies come around.

Attributives, Possessives, and Veterans Day

As you’re probably aware, today is Veterans Day, but there’s a lot of confusion about whether it’s actually Veteran’s, Veterans’, or Veterans Day. The Department of Veterans Affairs obviously gets asked about this a lot, because it’s the top question in their FAQs:

Q. Which is the correct spelling of Veterans Day?

  a. Veterans Day
  b. Veteran’s Day
  c. Veterans’ Day

A. Veterans Day (choice a, above). Veterans Day does not include an apostrophe but does include an “s” at the end of “veterans” because it is not a day that “belongs” to veterans, it is a day for honoring all veterans.

Interesting reasoning, but I think it’s flawed for two main reasons. First, there’s the fact that the apostrophe-s ending in English does not merely denote possession or ownership, despite the fact that it is commonly called the possessive case or ending. As Arnold Zwicky is fond of saying, labels are not definitions. Historically, the possessive ending, or genitive case, as it is more formally known, has covered a much wider range of relationships than simply possession, such as composition, description, purpose, and origin. In Old English the genitive was even used to form adverbs, producing forms like our modern-day towards, nowadays, since, and once (the -ce ending is a respelling of an original -s from the genitive case marker). So obviously the possessive or genitive ending is not just used to show ownership, despite the insistence that if something doesn’t belong to someone, you can’t use the apostrophe-s ending.

Second, they would have us believe that “veterans” is an attributive noun, making “Veterans Day” a simple noun-noun compound, but such compounds usually don’t work when the first noun is plural. In fact, some linguists have argued that noun-noun compounds where the first element is plural are generally disallowed in English (see, for example, this piece), though there are exceptions like fireworks display. Sometimes compounds with irregular plurals can work, like mice trap, but few if any English speakers find rats trap acceptable. The Chicago Manual of Style has this to say:

The line between a possessive or genitive form and a noun used attributively—to modify another noun—is sometimes fuzzy, especially in the plural. Although terms such as employees’ cafeteria sometimes appear without an apostrophe, Chicago dispenses with the apostrophe only in proper names (often corporate names) that do not use one or where there is clearly no possessive meaning. (7.25)

Again they fall prey to the idea that in order to use a genitive, there must be possession. But they do make an important point—the line does seem to be fuzzy, but I don’t think it’s nearly as fuzzy as they think. If it weren’t for the fact that the genitive ending and the regular plural ending sound the same, I don’t think there’d be any confusion. After all, even if people argue that it should be veterans hospital rather than veterans’ hospital, I don’t think anyone would argue that it should be children hospital rather than children’s hospital. But because they do sound the same, and because some people have gotten it into their heads that the so-called possessive ending can only be used to show that something belongs to someone, people argue that veterans must be a plural in a noun-noun compound, even though such compounds are generally not possible in English.

Of course, the question of whether or not there should be an apostrophe in Veterans Day is ultimately an incredibly trivial one. Like so many others, I’m grateful for the service given and sacrifices made by those in the armed forces, particularly my two grandfathers. As far as I’m concerned, this day does belong to them.

Scriptivists Revisited

Before I begin: I know—it’s been a terribly, horribly, unforgivably long time since my last post. Part of it is that I’m often busy with grad school and work and family, and part of it is that I’ve been thinking an awful lot lately about prescriptivism and descriptivism and linguists and editors and don’t really know where to begin.

I know that I’ve said some harsh things about prescriptivists before, but I don’t actually hate prescriptivism in general. As I’ve said before, prescriptivism and descriptivism are not really diametrically opposed, as some people believe they are. Stan Carey explores some of the common ground between the two in a recent post, and I think there’s a lot more to be said about the issue.

I think it’s possible to be a descriptivist and prescriptivist simultaneously. In fact, I think it’s difficult if not impossible to fully disentangle the two approaches. The fact is that many or most prescriptive rules are based on observed facts about the language, even though those facts may be incomplete or misunderstood in some way. Very seldom does anyone make up a rule out of whole cloth that bears no resemblance to reality. Rules often arise because someone has observed a change or variation in the language and is seeking to slow or reverse that change (as in insisting that “comprised of” is always an error) or to regularize the variation (as in insisting that “which” be used for nonrestrictive relative clauses and “that” for restrictive ones).

One of my favorite language blogs, Motivated Grammar, declares “Prescriptivism must die!” but to be honest, I’ve never quite been comfortable with that slogan. Now, I love a good debunking of language myths as much as the next guy—and Gabe Doyle does a commendable job of it—but not all prescriptivism is a bad thing. The impulse to identify and fix potential problems with the language is a natural one, and it can be used for both good and ill. Just take a look at the blogs of John E. McIntyre, Bill Walsh, and Jan Freeman for examples of well-informed, sensible language advice. Unfortunately, as linguists and many others know, senseless language advice is all too common.

Linguists often complain about and debunk such bad language advice—and rightly so, in my opinion—but I think in doing so they often make the mistake of dismissing prescriptivism altogether. Too often linguists view prescriptivism as an annoyance to be ignored or as a rival approach that must be quashed, but either way they miss the fact that prescriptivism is a metalinguistic phenomenon worth exploring and understanding. And why is it worth exploring? Because it’s an essential part of how ordinary speakers—and even linguists—use language in their daily lives, whether they realize it or not.

Contrary to what a lot of linguists say, language isn’t really a natural phenomenon—it’s a learned behavior. And as with any other human behavior, we generally strive to make our language match observed standards. Or as Emily Morgan so excellently says in a guest post on Motivated Grammar, “Language is something that we as a community of speakers collectively create and reinvent each time we speak.” She says that this means that language is “inextricably rooted in a descriptive generalization about what that community does,” but it also means that it is rooted in prescriptive notions of language. Because when speakers create and reinvent language, they do so by shaping their language to fit listeners’ expectations.

That is, for the most part, there’s no difference in speakers’ minds between what they should do with language and what they do do with language. They use language the way they do because they feel as though they should, and this in turn reinforces the model that influences everyone else’s behavior. I’ve often reflected on the fact that style guides like The Chicago Manual of Style will refer to dictionaries for spelling issues—thus prescribing how to spell—but these dictionaries simply describe the language found in edited writing. Description and prescription feed each other in an endless loop. This may not be mathematical logic, but it is a sort of logic nonetheless. Philosophers love to say that you can’t derive an ought from an is, and yet people do it all the time. If you want to fit in with a certain group, then you should behave in such a way as to be accepted by that group, and that group’s behavior is simply an aggregate of the behaviors of everyone else trying to fit in.

And at this point, linguists are probably thinking, “And people should be left alone to behave the way they wish to behave.” But leaving people alone means letting them decide which behaviors to favor and which to disfavor—that is, which rules to create and enforce. Linguists often criticize those who create and propagate rules, as if such rules are bad simply as a result of their artificiality, but, once again, the truth is that all language is artificial; it doesn’t exist until we make it exist. And if we create it, why should we always be coolly dispassionate about it? Objectivity might be great in the scientific study of language, but why should language users approach language the same way? Why should we favor “natural” or “spontaneous” changes and yet disfavor more conscious changes?

This is something that Deborah Cameron addresses in her book Verbal Hygiene (which I highly, highly recommend)—the notion that “spontaneous” or “natural” changes are okay, while deliberate ones are meddlesome and should be resisted. As Cameron counters, “If you are going to make value judgements at all, then surely there are more important values than spontaneity. How about truth, beauty, logic, utility?” (1995, 20). Of course, linguists generally argue that an awful lot of prescriptions do nothing to create more truth, beauty, logic, or utility, and this is indeed a problem, in my opinion.

But when linguists debunk such spurious prescriptions, they miss something important: people want language advice from experts, and they’re certainly not getting it from linguists. The industry of bad language advice exists partly because the people who arguably know the most about how language really works—the linguists—aren’t at all interested in giving advice on language. Often they take the hands-off attitude exemplified in Robert Hall’s book Leave Your Language Alone, crying, “Linguistics is descriptive, not prescriptive!” But in doing so, linguists are nonetheless injecting themselves into the debate rather than simply observing how people use language. If an objective, hands-off approach is so valuable, then why don’t linguists really take their hands off and leave prescriptivists alone?

I think the answer is that there’s a lot of social value in following language rules, whether or not they are actually sensible. And linguists, being the experts in the field, don’t like ceding any social or intellectual authority to a bunch of people that they view as crackpots and petty tyrants. They chafe at the idea that such ill-informed, superstitious advice—what Language Log calls “prescriptivist poppycock”—can or should have any value at all. It puts informed language users in the position of having to decide whether to follow a stupid rule so as to avoid drawing the ire of some people or to break the rule and thereby look stupid to those people. Arnold Zwicky explores this conundrum in a post titled “Crazies win”.

Note something interesting at the end of that post: Zwicky concludes by giving his own advice—his own prescription—regarding the issue of split infinitives. Is this a bad thing? No, not at all, because prescriptivism is not the enemy. As John Algeo said in an article in College English, “The problem is not that some of us have prescribed (we have all done so and continue to do so in one way or another); the trouble is that some of us have prescribed such nonsense” (“Linguistic Marys, Linguistic Marthas: The Scope of Language Study,” College English 31, no. 3 [December 1969]: 276). As I’ve said before, the nonsense is abundant. Just look at this awful Reader’s Digest column or this article on a Monster.com site for teachers for a couple of recent examples.

Which brings me back to a point I’ve made before: linguists need to be more involved, not just in educating the public about language, but also in giving people the sensible advice they want. Trying to kill prescriptivism is not the answer to the language wars, and truly leaving language alone is probably a good way to end up with a dead language. Exploring it and trying to figure out how best to use it—this is what keeps language alive and thriving and interesting. And that’s good for prescriptivists and descriptivists alike.

Reflections on National Grammar Day

I know I’m a week late to the party, but I’ve been thinking a lot about National Grammar Day and want to blog about it anyway. Please forgive me for my untimeliness.

First off, I should say for those who don’t know me that I work as a copy editor. I clearly understand the value of using Standard American English when it is called for, and I know its rules and conventions quite well. I’m also a student of linguistics, and I find language fascinating. I understand the desire to celebrate language and to promote its good use, but unfortunately it appears that National Grammar Day does neither.

If you go to National Grammar Day’s web site and click on “About SPOGG” at the top of the page, you find this:

The Society for the Promotion of Good Grammar is for pen-toters appalled by wanton displays of Bad English. . . . SPOGG is for people who crave good, clean English — sentences cast well and punctuated correctly. It’s about clarity.

I can get behind those last two sentences (noting, of course, that this description seems to exclude spoken English), but the first obviously flies in the face of the society’s name—is it trying to promote “good” (read “standard”) grammar, or simply to ridicule what it deems to be displays of bad English? Well, if you read the SPOGG Blog, it appears to be the latter. None of the posts on the front page seem to deal with clarity; in each case it seems quite clear what the author intended, so obviously SPOGG is not about clarity after all.

In fact, what I gather from this post in particular is that SPOGG is more about the social value of using Standard English than it is about anything else. The message here is quite clear: using nonstandard English is like having spinach in your teeth. It’s like wearing a speedo on the bus. SPOGG isn’t about good, clean English or about clarity. It’s only about mocking those who violate a set of taboos. By following the rules, you signal to others that you belong to a certain group, one whose members care about linguistic manners in the same way that some people care about not putting their elbows on the table while they eat.

And that’s perfectly fine with me. If you delight in fussy little rules about spelling and punctuation, that’s your choice. But I think it’s important to distinguish between the rules that are truly important and the guidelines and conventions that are more flexible and optional. John McIntyre made this point quite well in his post today on his blog, You Don’t Say.

Unfortunately, I find that SPOGG’s founder, Martha Brockenbrough, quite frequently fails to make this distinction. She also shows an appalling lack of knowledge on issues like how language changes, what linguists do, and, to top it all off, what grammar actually is. Of course, she falls back on the “Geez, can’t you take a joke?” defense, which doesn’t really seem to fly, as Arnold Zwicky and others have already noted.

As I said at the start, I can appreciate the desire to celebrate grammar. I just wish National Grammar Day actually did that.
