Arrant Pedantry


Attributives, Possessives, and Veterans Day

As you’re probably aware, today is Veterans Day, but there’s a lot of confusion about whether it’s actually Veteran’s, Veterans’, or Veterans Day. The Department of Veterans Affairs obviously gets asked about this a lot, because it’s the top question in their FAQs:

Q. Which is the correct spelling of Veterans Day?

  a. Veterans Day
  b. Veteran’s Day
  c. Veterans’ Day

A. Veterans Day (choice a, above). Veterans Day does not include an apostrophe but does include an “s” at the end of “veterans” because it is not a day that “belongs” to veterans, it is a day for honoring all veterans.

Interesting reasoning, but I think it’s flawed for two main reasons. First, the apostrophe-s ending in English does not merely denote possession or ownership, even though it is commonly called the possessive case or ending. As Arnold Zwicky is fond of saying, labels are not definitions. Historically, the possessive ending, or genitive case, as it is more formally known, has covered a much wider range of relationships than simple possession, such as composition, description, purpose, and origin. In Old English the genitive was even used to form adverbs, producing forms like our modern-day towards, nowadays, since, and once (the -ce ending is a respelling of an original -s from the genitive case marker). So the possessive or genitive ending is obviously not just used to show ownership, despite the insistence that if something doesn’t belong to someone, you can’t use the apostrophe-s ending.

Second, they would have us believe that “veterans” is an attributive noun, making “Veterans Day” a simple noun-noun compound, but such compounds usually don’t work when the first noun is plural. In fact, some linguists have argued that noun-noun compounds where the first element is plural are generally disallowed in English (see, for example, this piece), though there are exceptions like fireworks display. Sometimes compounds with irregular plurals can work, like mice trap, but few if any English speakers find rats trap acceptable. The Chicago Manual of Style has this to say:

The line between a possessive or genitive form and a noun used attributively—to modify another noun—is sometimes fuzzy, especially in the plural. Although terms such as employees’ cafeteria sometimes appear without an apostrophe, Chicago dispenses with the apostrophe only in proper names (often corporate names) that do not use one or where there is clearly no possessive meaning. (7.25)

Again they fall prey to the idea that in order to use a genitive, there must be possession. But they do make an important point—the line does seem to be fuzzy, but I don’t think it’s nearly as fuzzy as they think. If it weren’t for the fact that the genitive ending and the regular plural ending sound the same, I don’t think there’d be any confusion. After all, even if people argue that it should be veterans hospital rather than veterans’ hospital, I don’t think anyone would argue that it should be children hospital rather than children’s hospital. But because they do sound the same, and because some people have gotten it into their heads that the so-called possessive ending can only be used to show that something belongs to someone, people argue that veterans must be a plural in a noun-noun compound, even though such compounds are generally not possible in English.

Of course, the question of whether or not there should be an apostrophe in Veterans Day is ultimately an incredibly trivial one. Like so many others, I’m grateful for the service given and sacrifices made by those in the armed forces, particularly my two grandfathers. As far as I’m concerned, this day does belong to them.


Gray, Grey, and Circular Prescriptions

A few days ago John McIntyre took a whack at the Associated Press Stylebook’s penchant for flat assertions, this time regarding the spelling of gray/grey. McIntyre noted that gray certainly is more common in American English but that grey is not a misspelling.

In the comments I mused that perhaps gray is only more common because of prescriptions like this one. John Cowan noted that gray is the main head word in Webster’s 1828 dictionary, with grey cross-referenced to it, saying, “So I think we can take it that ‘gray’ has been the standard AmE spelling long before the AP stylebook, or indeed the AP, were in existence.”

But I don’t think Webster’s dictionary really proves that at all. When confronted with multiple spellings of a word, lexicographers must choose which one to include as the main entry in the dictionary. Webster’s choice of gray over grey may have been entirely arbitrary. Furthermore, considering that he was a crusader for spelling reform, I don’t think we can necessarily take the spellings in his dictionary as evidence of what was more common or standard in American English.

So I headed over to Mark Davies’ Corpus of Historical American English to do a little research. I searched for both gray and grey as adjectives and came up with this. The grey line represents the total number of tokens per million words for both forms.

[Chart: gray and grey in tokens per million words]

Up until about the 1840s, gray and grey were about neck and neck. After that, gray really takes off while grey languishes. Now, I realize that this is a rather cursory survey of their historical distribution, and the earliest data in this corpus predates Webster’s dictionary by only a couple of decades. I also can’t explain the growth in the combined total of gray/grey over the 1800s. But in spite of these problems, the trend lines seem very clear-cut: gray became overwhelmingly more common, while grey severely diminished without quite disappearing from American English.
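If you’re curious how the numbers behind such a chart are derived, the normalization is simple. Below is a minimal Python sketch of the tokens-per-million calculation; the decade counts and corpus sizes are invented for illustration, and real figures would have to come from an actual search of the Corpus of Historical American English:

```python
# Sketch: normalize raw corpus counts to tokens per million words per decade.
# NOTE: All numbers below are invented for illustration; real counts and
# corpus sizes would come from a COHA search.

# (decade, count of "gray", count of "grey", total words in that decade)
decade_data = [
    (1810, 30, 28, 1_200_000),
    (1840, 95, 90, 1_600_000),
    (1900, 450, 120, 2_200_000),
    (1980, 600, 80, 2_500_000),
]

for decade, gray_count, grey_count, corpus_size in decade_data:
    scale = 1_000_000 / corpus_size
    gray_pm = gray_count * scale
    grey_pm = grey_count * scale
    combined = gray_pm + grey_pm  # the combined "grey line" in the chart
    print(f"{decade}s: gray {gray_pm:.1f}, grey {grey_pm:.1f}, "
          f"both {combined:.1f} per million words")
```

Plotting those per-million figures by decade is all the chart above does; the normalization just keeps decades with more text in the corpus from inflating the raw counts.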

This ties in nicely with a point I’ve made before: descriptivism and prescriptivism are not entirely separable, and there is considerable interplay between the two. It may be that Webster really was describing the linguistic scene as he saw it, choosing gray because he felt that it was more common, or it may be that his choice of gray was arbitrary or influenced by his personal preferences.

Either way, his decision to describe the word in a particular way apparently led to a prescriptive feedback loop: people chose to use the spelling gray because it was in the dictionary, reinforcing its position as the main entry in the dictionary and leading to its ascendancy over grey and eventually to the AP Stylebook’s tweet about its preferred status. What may have started as a value-neutral decision by Webster about an utterly inconsequential issue of spelling variability has become an imperative to editors . . . about what is still an utterly inconsequential issue of spelling variability.

Personally, I’ve always had a soft spot for grey.


10:30 o’clock

My sister-in-law will soon graduate from high school, and we recently got her graduation announcement in the mail. It was pretty standard stuff—a script font in metallic ink on nice paper—but one small detail caught my eye. It says the commencement exercises will take place at “ten-thirty o’clock.” As far as I can remember, I’ve never before heard a rule against using “o’clock” with times other than the hour, but it struck me as wrong.

I checked Merriam-Webster first, but it was no help; all it says is “according to the clock,” though its example sentence is “the time is three o’clock.” I then pulled out my copy of Merriam-Webster’s Dictionary of English Usage, but it didn’t even have an entry for o’clock or clock. So then, because my wife was on the computer and I couldn’t access the OED online, I pulled out my compact OED and magnifying glass to see if it had anything to say.

Once I had flipped to the entry and scanned through the minuscule type, I found this one line: “The hour of the day is expressed by a cardinal numeral, followed by a phrase which was originally of the clock, now only retained in formal phraseology; shortened subsequently to . . . o’clock.” The citations begin with Chaucer and continue up to modern English.

And then, out of curiosity, I checked the Corpus of Contemporary American English, but I couldn’t find any examples of x:30 o’clock. Google, however, turned up plenty of examples, including a thread on Amazon’s Askville asking why you can’t say “11:30 o’clock.” The best explanation there seems to be that since the clock hands aren’t pointing at a specific hour, it can’t be anything-o’clock.

This answer doesn’t seem quite satisfying to me—it doesn’t explain why the hour hand has to be pointing directly at a number or why the minute hand doesn’t matter. But then I remembered that clock originally meant “bell” and that early clocks chimed on the hour (well, I suppose some modern clocks do too, but you see where I’m going). Early mechanical clocks were rather large, and most people measured time not by checking the clock face to see where the hands were, but by counting the number of chimes on the hour. So I would assume that this is why it sounds strange to use “o’clock” with fractions of hours. Thoughts, anyone?
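(A postscript for the corpus-minded: here’s a rough Python sketch of the kind of pattern I was hunting for, matching “o’clock” preceded by a non-hour time in plain text. The sample sentences are made up for illustration.)

```python
import re

# Sketch: find "o'clock" used with non-hour times (e.g., "11:30 o'clock"
# or "ten-thirty o'clock") in plain text. The sample text is invented.
text = """The ceremony begins at ten-thirty o'clock.
Dinner is at seven o'clock, but the meeting ran until 11:30 o'clock."""

# Match a digital time (11:30) or a hyphenated spelled-out time
# (ten-thirty) immediately before "o'clock" (straight or curly apostrophe).
pattern = re.compile(
    r"\b(\d{1,2}:\d{2}|[a-z]+-[a-z]+)\s+o['\u2019]clock",
    re.IGNORECASE,
)

for match in pattern.finditer(text):
    print(match.group(0))  # "ten-thirty o'clock" and "11:30 o'clock"
```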


Would you like to take a survey?

I’m working on a research project for a class this semester, and I need volunteers to take a short survey. It involves reading three short passages and answering a few questions. It should only take about 10 minutes. The results will not be published and no identifiable personal information will be collected. If anyone is interested, just follow this link. I would be much obliged.


Scriptivists Revisited

Before I begin: I know—it’s been a terribly, horribly, unforgivably long time since my last post. Part of it is that I’m often busy with grad school and work and family, and part of it is that I’ve been thinking an awful lot lately about prescriptivism and descriptivism and linguists and editors and don’t really know where to begin.

I know that I’ve said some harsh things about prescriptivists before, but I don’t actually hate prescriptivism in general. As I’ve said before, prescriptivism and descriptivism are not really diametrically opposed, as some people believe they are. Stan Carey explores some of the common ground between the two in a recent post, and I think there’s a lot more to be said about the issue.

I think it’s possible to be a descriptivist and prescriptivist simultaneously. In fact, I think it’s difficult if not impossible to fully disentangle the two approaches. The fact is that many or most prescriptive rules are based on observed facts about the language, even though those facts may be incomplete or misunderstood in some way. Very seldom does anyone make up a rule out of whole cloth that bears no resemblance to reality. Rules often arise because someone has observed a change or variation in the language and is seeking to slow or reverse that change (as in insisting that “comprised of” is always an error) or to regularize the variation (as in insisting that “which” be used for nonrestrictive relative clauses and “that” for restrictive ones).

One of my favorite language blogs, Motivated Grammar, declares “Prescriptivism must die!” but to be honest, I’ve never quite been comfortable with that slogan. Now, I love a good debunking of language myths as much as the next guy—and Gabe Doyle does a commendable job of it—but not all prescriptivism is a bad thing. The impulse to identify and fix potential problems with the language is a natural one, and it can be used for both good and ill. Just take a look at the blogs of John E. McIntyre, Bill Walsh, and Jan Freeman for examples of well-informed, sensible language advice. Unfortunately, as linguists and many others know, senseless language advice is all too common.

Linguists often complain about and debunk such bad language advice—and rightly so, in my opinion—but I think in doing so they often make the mistake of dismissing prescriptivism altogether. Too often linguists view prescriptivism as an annoyance to be ignored or as a rival approach that must be quashed, but either way they miss the fact that prescriptivism is a metalinguistic phenomenon worth exploring and understanding. And why is it worth exploring? Because it’s an essential part of how ordinary speakers—and even linguists—use language in their daily lives, whether they realize it or not.

Contrary to what a lot of linguists say, language isn’t really a natural phenomenon—it’s a learned behavior. And as with any other human behavior, we generally strive to make our language match observed standards. Or as Emily Morgan so excellently says in a guest post on Motivated Grammar, “Language is something that we as a community of speakers collectively create and reinvent each time we speak.” She says that this means that language is “inextricably rooted in a descriptive generalization about what that community does,” but it also means that it is rooted in prescriptive notions of language. Because when speakers create and reinvent language, they do so by shaping their language to fit listeners’ expectations.

That is, for the most part, there’s no difference in speakers’ minds between what they should do with language and what they do do with language. They use language the way they do because they feel as though they should, and this in turn reinforces the model that influences everyone else’s behavior. I’ve often reflected on the fact that style guides like The Chicago Manual of Style will refer to dictionaries for spelling issues—thus prescribing how to spell—but these dictionaries simply describe the language found in edited writing. Description and prescription feed each other in an endless loop. This may not be mathematical logic, but it is a sort of logic nonetheless. Philosophers love to say that you can’t derive an ought from an is, and yet people do it all the time. If you want to fit in with a certain group, then you should behave in such a way as to be accepted by that group, and that group’s behavior is simply an aggregate of the behaviors of everyone else trying to fit in.

And at this point, linguists are probably thinking, “And people should be left alone to behave the way they wish to behave.” But leaving people alone means letting them decide which behaviors to favor and which to disfavor—that is, which rules to create and enforce. Linguists often criticize those who create and propagate rules, as if such rules are bad simply as a result of their artificiality, but, once again, the truth is that all language is artificial; it doesn’t exist until we make it exist. And if we create it, why should we always be coolly dispassionate about it? Objectivity might be great in the scientific study of language, but why should language users approach language the same way? Why should we favor “natural” or “spontaneous” changes and yet disfavor more conscious changes?

This is something that Deborah Cameron addresses in her book Verbal Hygiene (which I highly, highly recommend)—the notion that “spontaneous” or “natural” changes are okay, while deliberate ones are meddlesome and should be resisted. As Cameron counters, “If you are going to make value judgements at all, then surely there are more important values than spontaneity. How about truth, beauty, logic, utility?” (1995, 20). Of course, linguists generally argue that an awful lot of prescriptions do nothing to create more truth, beauty, logic, or utility, and this is indeed a problem, in my opinion.

But when linguists debunk such spurious prescriptions, they miss something important: people want language advice from experts, and they’re certainly not getting it from linguists. The industry of bad language advice exists partly because the people who arguably know the most about how language really works—the linguists—aren’t at all interested in giving advice on language. Often they take the hands-off attitude exemplified in Robert Hall’s book Leave Your Language Alone, crying, “Linguistics is descriptive, not prescriptive!” But in doing so, linguists are nonetheless injecting themselves into the debate rather than simply observing how people use language. If an objective, hands-off approach is so valuable, then why don’t linguists really take their hands off and leave prescriptivists alone?

I think the answer is that there’s a lot of social value in following language rules, whether or not they are actually sensible. And linguists, being the experts in the field, don’t like ceding any social or intellectual authority to a bunch of people that they view as crackpots and petty tyrants. They chafe at the idea that such ill-informed, superstitious advice—what Language Log calls “prescriptivist poppycock”—can or should have any value at all. It puts informed language users in the position of having to decide whether to follow a stupid rule so as to avoid drawing the ire of some people or to break the rule and thereby look stupid to those people. Arnold Zwicky explores this conundrum in a post titled “Crazies Win.”

Note something interesting at the end of that post: Zwicky concludes by giving his own advice—his own prescription—regarding the issue of split infinitives. Is this a bad thing? No, not at all, because prescriptivism is not the enemy. As John Algeo said in an article in College English, “The problem is not that some of us have prescribed (we have all done so and continue to do so in one way or another); the trouble is that some of us have prescribed such nonsense” (“Linguistic Marys, Linguistic Marthas: The Scope of Language Study,” College English 31, no. 3 [December 1969]: 276). As I’ve said before, the nonsense is abundant. Just look at this awful Reader’s Digest column or this article on a Monster.com site for teachers for a couple of recent examples.

Which brings me back to a point I’ve made before: linguists need to be more involved not just in educating the public about language but also in giving people the sensible advice they want. Trying to kill prescriptivism is not the answer to the language wars, and truly leaving language alone is probably a good way to end up with a dead language. Exploring it and trying to figure out how best to use it—this is what keeps language alive and thriving and interesting. And that’s good for prescriptivists and descriptivists alike.