Arrant Pedantry

The Reason Why This Is Correct

There’s a long-running debate over whether the construction reason why is acceptable. Critics generally argue that why essentially means reason, so saying reason why is like saying reason twice. Saying something twice is redundant, and redundancy is bad; ergo, reason why is bad. This is really a rather bizarre argument. Reason is a noun; why is usually an interrogative adverb. They do cover some of the same semantic space, but not the same syntactic space. Does that overlap really make the construction redundant? Defenders generally concede that it’s redundant but argue that the redundancy is harmless. Yet rebutting the critics by calling the construction “not ungrammatical” or saying that “redundancy is not inherently bad” is a pretty weak defense. That defense can be strengthened, however, with something that has been missing from the discussion: an examination of the syntactic role of why in such constructions.

Nearly every discussion on reason why that I’ve ever seen—including Merriam-Webster’s Dictionary of English Usage and Garner’s Modern American Usage—leaves out this very important syntactic component. The only exceptions that I’ve seen are this post on the Grammarphobia blog and this one on Daily Writing Tips, which both mention that why is a conjunction. The writers at Grammarphobia argue that reason why is not actually redundant because of why’s syntactic role, but Mark Nichol at Daily Writing Tips seems much more confused about the issue. He says that even though reason why has been around for centuries and only came under fire in the twentieth century, he’ll continue to avoid it in his own writing “but will forgive the combination when I am editing that of others” (how magnanimous). But he doesn’t understand why reason why should be acceptable when reason is because is not, given that both why and because are conjunctions.

I won’t get into reason is because here, but suffice it to say that these are very different constructions. As I mentioned in my previous post on relative pronouns and adverbs, why functions as a relative adverb, but it appears almost exclusively after the word reason. (To be clear, all relative pronouns and adverbs can be considered conjunctions because they connect a subordinate clause—the relative clause—to a main one.) In a phrase like the reason why this is correct, why connects the relative clause this is correct to the noun it modifies, reason. Relative pronouns refer to a noun phrase, while relative adverbs refer to some kind of adverbial phrase. As with any relative clause, you can extract a main clause out of the relative clause by replacing the relative pronoun or adverb and doing a little rearranging (that’s the man who I met > I met the man), though with relative adverbs you often have to add in a function word or two: the reason why this is correct > this is correct for this reason. This is pretty obvious when you think about it. A phrase like the reason why this is correct contains another clause—this is correct. There has to be something to connect it syntactically to the rest of the phrase.

In defending the construction, Gabe Doyle at Motivated Grammar compares it to the redundancy in The person who left their wet swimsuit on my books is going to pay. This is actually a more apt comparison than Mr. Doyle realizes, because he doesn’t make the connection between the relative pronoun who and the relative adverb why. He argues that it is just as redundant as reason why (and therefore not a problem), because who means person in a sense.

But as I said above, this isn’t really redundancy. Who is a relative pronoun connecting a clause to a noun phrase. If who means the same thing as person, it’s only because that’s its job as a pronoun. Pronouns refer back to other elements in the sentence, and so they naturally share those elements’ meanings. Why works much the same way. Why means the same thing as reason only because it refers to it.

So what about reason that or just plain reason? Again, as I discussed in my last post on relative pronouns and adverbs, English has two systems of relativization: the wh words and that, and that is omissible except where it functions as the subject of the relative clause. Thus we have the option of saying the reason why this is correct, the reason that this is correct (though that sounds awkward in some instances), or just plain the reason this is correct (again, this is occasionally awkward). The Cambridge Grammar of the English Language also mentions the possibility of the reason for which, though this also sounds awkward and stilted in most cases. But I suspect that many awkward plain reasons are the result of editorial intervention, as in this case I found in the research for my thesis: There are three preliminary reasons why the question of rationality might make a difference in the context of Leibniz’s thought.

It’s important to note, though, that there are some constructions in which why is more superfluous. As Robert Lane Greene noted on the Johnson blog, sometimes why is used after reason without a following relative clause. (Mr. Greene calls it a complement clause.) He gives the example I’m leaving your father. The reason why is that he’s a drunk. The why here doesn’t really serve a syntactic function, since it’s not introducing a clause, though the Oxford English Dictionary calls this an elliptical construction. In essence, the why is serving as a placeholder for the full relative clause: I’m leaving your father. The reason why (I’m leaving him) is that he’s a drunk. It’s not strictly necessary to delete the why here, though the elliptical construction is generally colloquial and may not sound right in formal writing.

But this is by no means a blanket injunction against reason why. I think the rule forbidding reason why probably arose out of simple grammatical misanalysis of this relative construction, or perhaps by broadening a ban on the elliptical reason why into a ban on all instances of reason why. Whatever the reason for the ban, it’s misguided and should be laid to rest. Reason why is not merely “not ungrammatical” or harmlessly redundant; it’s a legitimate, fully grammatical construction. Just because there are other options doesn’t mean one is right and the rest are wrong.

Till Kingdom Come

The other day on Twitter, Bryan A. Garner posted, “May I ask a favor? Would all who read this please use the prep. ‘till’ in a tweet? Not till then will we start getting people used to it.” I didn’t help out, partly because I hate pleas of the “Repost this if you agree!” variety and partly because I knew it would be merely a symbolic gesture. Even if all of Garner’s followers and all of their followers used “till” in a tweet, it wouldn’t even be a blip on the radar of usage.

But it did get me thinking about the word till and the fact that a lot of people seem to regard it as incorrect and forms like ’til as correct. The assumption for many people seems to be that it’s a shortened form of until, so it requires an apostrophe to signal the omission. Traditionalists, however, know that although the two words are related, till actually came first, appearing in the language about four hundred years before until.

Both words came into English via Old Norse, where the preposition til had replaced the preposition to. (As I understand it, modern-day North Germanic languages like Swedish and Danish still use it this way.) Despite their similar appearances, to and till are not related; till comes from a different root meaning ‘end’ or ‘goal’ (compare modern German Ziel ‘goal’). Norse settlers brought the word til with them when they started raiding and colonizing northeastern Britain in the 800s.

There was also a compound form, until, from und + til. Und was another Old Norse preposition deriving from the noun und, which is cognate with the English word end. Till and until have been more or less synonymous throughout their history in English, despite their slightly different forms. And as a result of the haphazard process of spelling standardization in English, we ended up with two ls on till but only one on until. The apostrophized form ’til is an occasional variant that shows up far more in unedited than edited writing. Interestingly, the OED’s first citation for ’til comes from P. G. Perrin’s An Index to English in 1939: “Till, until, (’til), these three words are not distinguishable in meaning. Since ’til in speech sounds the same as till and looks slightly odd on paper, it may well be abandoned.”

Mark Davies’ Corpus of Historical American English, however, tells a slightly different story. It shows a slight increase in ’til since the mid-twentieth century, though it has been declining again slightly in the last thirty years. And keep in mind that these numbers come from a corpus of edited writing drawn from books, magazines, and newspapers. It may well be increasing much faster in unedited writing, with only the efforts of copy editors keeping it (mostly) out of print. This chart shows the relative proportions of the three forms—that is, the proportion of each compared to the total of all three.

Relative proportions of till, until, and ’til.

As Garner laments, till is becoming less and less common in writing and may all but disappear within the next century, though predicting the future of usage is always a guessing game, even with clear trends like this. Sometimes they spontaneously reverse, and it’s often not clear why. But why is till in decline? I honestly don’t know for sure, but I suspect it stems from either the idea that longer words are more formal or the perception that it’s a shortened form of until. Contractions and clipped forms are generally avoided in formal writing, so this could be driving till out of use.

Note that we don’t have this problem with to and unto, probably because to is one of the most common words in the language, occurring about 9,000 times per million words in the last decade in COHA. By comparison, unto occurs just under 70 times per million words. There’s no uncertainty or confusion about the use or spelling of to. We tend to be less sure of the meanings and spellings of less frequent words, and this uncertainty can lead to avoidance. If you don’t know which form is right, it’s easy to just not use it.

At any rate, many people are definitely unfamiliar with till and may well think that the correct form is ’til, as Gabe Doyle of Motivated Grammar did in this post four years ago, though he checked his facts and found that his original hunch was wrong.

He’s far from the only person who thought that ’til was correct. When my then-fiancée and I got our wedding announcements printed over eight years ago, the printer asked us if we really wanted “till” instead of “’til” (“from six till eight that evening”). I told him that yes, it was right, and he kind of shrugged and dropped the point, though I got the feeling he still thought I was wrong. He probably didn’t want to annoy a paying customer, though.

And though this is anecdotal and possibly falls prey to the recency illusion, it seems that ’til is on the rise in signage (frequently as ‘til, with a single opening quotation mark rather than an apostrophe), and I even spotted a til’ the other day. (I wish I’d thought to get a picture of it.)

I think the evidence is pretty clear that, barring some amazing turnaround, till is dying. It’s showing up less in print, where it’s mostly been replaced by until, and the traditionally incorrect ’til may be hastening its death as people become unsure of which form is correct or even become convinced that till is wrong and ’til is right. I’ll keep using till myself, but I’m not holding out hope for a revival. Sorry, Garner.

Scriptivists Revisited

Before I begin: I know—it’s been a terribly, horribly, unforgivably long time since my last post. Part of it is that I’m often busy with grad school and work and family, and part of it is that I’ve been thinking an awful lot lately about prescriptivism and descriptivism and linguists and editors and don’t really know where to begin.

I know that I’ve said some harsh things about prescriptivists before, but I don’t actually hate prescriptivism in general. As I’ve said before, prescriptivism and descriptivism are not really diametrically opposed, as some people believe they are. Stan Carey explores some of the common ground between the two in a recent post, and I think there’s a lot more to be said about the issue.

I think it’s possible to be a descriptivist and prescriptivist simultaneously. In fact, I think it’s difficult if not impossible to fully disentangle the two approaches. The fact is that many or most prescriptive rules are based on observed facts about the language, even though those facts may be incomplete or misunderstood in some way. Very seldom does anyone make up a rule out of whole cloth that bears no resemblance to reality. Rules often arise because someone has observed a change or variation in the language and is seeking to slow or reverse that change (as in insisting that “comprised of” is always an error) or to regularize the variation (as in insisting that “which” be used for nonrestrictive relative clauses and “that” for restrictive ones).

One of my favorite language blogs, Motivated Grammar, declares “Prescriptivism must die!” but to be honest, I’ve never quite been comfortable with that slogan. Now, I love a good debunking of language myths as much as the next guy—and Gabe Doyle does a commendable job of it—but not all prescriptivism is a bad thing. The impulse to identify and fix potential problems with the language is a natural one, and it can be used for both good and ill. Just take a look at the blogs of John E. McIntyre, Bill Walsh, and Jan Freeman for examples of well-informed, sensible language advice. Unfortunately, as linguists and many others know, senseless language advice is all too common.

Linguists often complain about and debunk such bad language advice—and rightly so, in my opinion—but I think in doing so they often make the mistake of dismissing prescriptivism altogether. Too often linguists view prescriptivism as an annoyance to be ignored or as a rival approach that must be quashed, but either way they miss the fact that prescriptivism is a metalinguistic phenomenon worth exploring and understanding. And why is it worth exploring? Because it’s an essential part of how ordinary speakers—and even linguists—use language in their daily lives, whether they realize it or not.

Contrary to what a lot of linguists say, language isn’t really a natural phenomenon—it’s a learned behavior. And as with any other human behavior, we generally strive to make our language match observed standards. Or as Emily Morgan so excellently says in a guest post on Motivated Grammar, “Language is something that we as a community of speakers collectively create and reinvent each time we speak.” She says that this means that language is “inextricably rooted in a descriptive generalization about what that community does,” but it also means that it is rooted in prescriptive notions of language. Because when speakers create and reinvent language, they do so by shaping their language to fit listeners’ expectations.

That is, for the most part, there’s no difference in speakers’ minds between what they should do with language and what they do do with language. They use language the way they do because they feel as though they should, and this in turn reinforces the model that influences everyone else’s behavior. I’ve often reflected on the fact that style guides like The Chicago Manual of Style will refer to dictionaries for spelling issues—thus prescribing how to spell—but these dictionaries simply describe the language found in edited writing. Description and prescription feed each other in an endless loop. This may not be mathematical logic, but it is a sort of logic nonetheless. Philosophers love to say that you can’t derive an ought from an is, and yet people do it all the time. If you want to fit in with a certain group, then you should behave in such a way as to be accepted by that group, and that group’s behavior is simply an aggregate of the behaviors of everyone else trying to fit in.

And at this point, linguists are probably thinking, “And people should be left alone to behave the way they wish to behave.” But leaving people alone means letting them decide which behaviors to favor and which to disfavor—that is, which rules to create and enforce. Linguists often criticize those who create and propagate rules, as if such rules are bad simply as a result of their artificiality, but, once again, the truth is that all language is artificial; it doesn’t exist until we make it exist. And if we create it, why should we always be coolly dispassionate about it? Objectivity might be great in the scientific study of language, but why should language users approach language the same way? Why should we favor “natural” or “spontaneous” changes and yet disfavor more conscious changes?

This is something that Deborah Cameron addresses in her book Verbal Hygiene (which I highly, highly recommend)—the notion that “spontaneous” or “natural” changes are okay, while deliberate ones are meddlesome and should be resisted. As Cameron counters, “If you are going to make value judgements at all, then surely there are more important values than spontaneity. How about truth, beauty, logic, utility?” (1995, 20). Of course, linguists generally argue that an awful lot of prescriptions do nothing to create more truth, beauty, logic, or utility, and this is indeed a problem, in my opinion.

But when linguists debunk such spurious prescriptions, they miss something important: people want language advice from experts, and they’re certainly not getting it from linguists. The industry of bad language advice exists partly because the people who arguably know the most about how language really works—the linguists—aren’t at all interested in giving advice on language. Often they take the hands-off attitude exemplified in Robert Hall’s book Leave Your Language Alone, crying, “Linguistics is descriptive, not prescriptive!” But in doing so, linguists are nonetheless injecting themselves into the debate rather than simply observing how people use language. If an objective, hands-off approach is so valuable, then why don’t linguists really take their hands off and leave prescriptivists alone?

I think the answer is that there’s a lot of social value in following language rules, whether or not they are actually sensible. And linguists, being the experts in the field, don’t like ceding any social or intellectual authority to a bunch of people that they view as crackpots and petty tyrants. They chafe at the idea that such ill-informed, superstitious advice—what Language Log calls “prescriptivist poppycock”—can or should have any value at all. It puts informed language users in the position of having to decide whether to follow a stupid rule so as to avoid drawing the ire of some people or to break the rule and thereby look stupid to those people. Arnold Zwicky explores this conundrum in a post titled “Crazies Win.”

Note something interesting at the end of that post: Zwicky concludes by giving his own advice—his own prescription—regarding the issue of split infinitives. Is this a bad thing? No, not at all, because prescriptivism is not the enemy. As John Algeo said in an article in College English, “The problem is not that some of us have prescribed (we have all done so and continue to do so in one way or another); the trouble is that some of us have prescribed such nonsense” (“Linguistic Marys, Linguistic Marthas: The Scope of Language Study,” College English 31, no. 3 [December 1969]: 276). As I’ve said before, the nonsense is abundant. Just look at this awful Reader’s Digest column or this article on a Monster.com site for teachers for a couple of recent examples.

Which brings me back to a point I’ve made before: linguists need to be more involved in not just educating the public about language, but in giving people the sensible advice they want. Trying to kill prescriptivism is not the answer to the language wars, and truly leaving language alone is probably a good way to end up with a dead language. Exploring it and trying to figure out how best to use it—this is what keeps language alive and thriving and interesting. And that’s good for prescriptivists and descriptivists alike.