May 9, 2018

Is Change Okay or Not?

A few weeks ago I got into a bit of an argument with my coworkers in staff meeting. One of them had asked our editorial interns to do a brief presentation on the that/which rule, and they did. But one of the interns seemed a little unclear on the rule—she said she had learned it in her class on modern American usage, but she had also learned that either that or which is technically fine with restrictive clauses. So of course I asked if I could chime in.

I pointed out that the rule—which states that you should always use that for restrictive clauses (except where that is grammatically impermissible, as when the relative pronoun follows a preposition or the demonstrative pronoun that)—is a relatively recent invention and that it didn’t really start to take hold in American usage until the mid-twentieth century. Many writers still don’t follow it, which means that editors have a lot of opportunities to apply the rule, and it’s generally not enforced outside the US.

My coworkers didn’t really like the perceived implication that the rule is bogus and that we shouldn’t worry about it, and one of them countered by saying that it didn’t matter what people did in 1810—the history is interesting, but we should be concerned about what usage is now. After all, the data clearly shows that the that/which rule is being followed in recent publications. And then she deployed an argument I’ve been seeing more and more lately: we all know that language changes, so why can’t we accept this change? (I’ve also heard variations like “Language changes, so why can’t we make it change this way?”)

These are good questions, and I don’t believe that linguists have good answers to them. (Indeed, I’m not sure that good answers—or at least logically sound answers—are even possible.) In her book Verbal Hygiene, the linguist Deborah Cameron argues that it’s silly for linguists to embrace change from below but to resist change from above. What makes a “natural” change better than an unnatural one? We talk about how language changes, but it’s really people who change language, not language that changes by itself, so is there even a meaningful difference between natural and unnatural change?

Besides, many linguists have embraced certain unnatural changes, such as the movements for gender-neutral and plain language. Why is it okay for us to denounce prescriptivism on the one hand and then turn around and prescribe gender-neutral language on the other?

I haven’t come to a firm conclusion on this myself, but I think it all comes down to whether the alleged problem is in fact a problem and whether the proposed solution is in fact a solution. Does it solve the problem, does it do nothing, or does it simply create a new or different problem?

With gender-specific language, it’s clear that there’s a problem. Even though he is purportedly gender-neutral when its antecedent is indefinite or of unspecified gender, studies have shown that readers are more likely to assume that the antecedent is male. Clearly it’s not really gender-neutral if most people think “male” when they read “he”. Singular they has centuries of use behind it, including use by many great authors, and most people use it naturally and unselfconsciously. It’s not entirely uncontroversial, of course, but acceptance is growing, even among copy editors.

There are some minor thorny issues, like trying to figure out what the gender-neutral forms of freshman or fisherman should be, but writing around these seems like a small price to pay for text that treats people equally.

So what about the that/which rule? What problem does it claim to solve, and does it actually solve it?

The claim is that the rule helps distinguish between restrictive and nonrestrictive relative clauses, which in the abstract sounds like a good thing. But the argument quickly falls apart when you look at how other relative clauses work in English. We don’t need any extra help distinguishing between restrictive and nonrestrictive clauses with who, where, or when—the comma (or, in speech, the intonation) tells you whether a clause is restrictive. The fact that nobody has even recognized ambiguity with restrictive who or where or when as a problem, let alone proposed and implemented a solution, argues against the idea that there’s something wrong with restrictive which. Furthermore, no language I’ve heard of distinguishes between restrictive and nonrestrictive clauses with different pronouns. If it were really an advantage, then we’d expect to see languages all over the world with a grammaticalized distinction between restrictive and nonrestrictive clauses.

I’ve sometimes seen the counterargument that writers don’t always know how to use commas properly, so we can’t trust them to mark whether a clause is restrictive or not; but again, nobody worries about this with other relative clauses. And anyway, if copy editors can always identify when a clause is restrictive and thus know when to change which to that, then it stands to reason that they can also identify when a clause is nonrestrictive and thus insert the commas if needed. (Though it’s not clear if even the commas are really necessary; in German, even restrictive clauses are set off with commas in writing, so you have to rely on context and common sense to tell you which kind of clause it is.)

It seems, then, that restrictive which is not a real problem at all and that insisting on that for all restrictive clauses doesn’t really accomplish anything. Even though Deborah Cameron criticizes linguists for accepting natural changes and rejecting unnatural ones, she also recognizes that many of the rules that copy editors impose, including the that/which rule, go far beyond what’s necessary for effective communication. She even quotes one scholar as saying that the that/which rule’s “sole virtue . . . is to give copy editors more billable hours.”

Some would argue that changing which to that doesn’t take much time, so there’s really no cost, but I don’t believe that’s true. My own research shows that it’s one of the most common usage or grammar changes that editors make. All those changes add up. I also know from experience that a lot of editors gripe about people not following the rule. That griping has a real effect on people, making them nervous about their abilities with their own native language. Even if you think the that/which rule is useful enough to justify the time it takes to impose it, is it worth making so many people feel self-conscious about their language?

Even if you believe that the that/which rule is an improvement, the fact is that English existed for nearly 1500 years without it, and even now it’s probably safe to say that the vast majority of English speakers have never heard of it. Although corpus data makes it appear as though it’s taken hold in American English, all we can really say from this data is that it has taken hold in edited, published American English, which really means that it’s taken hold among American copy editors. I’m sure some writers have picked the rule up from their English classes or from Word’s grammar checker, but I think it’s safe to say that American English as a whole has not changed—only the most visible portion, published writing, has.
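If you want to spot-check that claim about edited prose, a rough corpus comparison is easy to sketch. The Python snippet below is purely my own illustration (it isn’t taken from any published corpus study, and the file path is a placeholder): it counts which with and without a preceding comma, plus every that, in a plain-text file. The heuristic is crude, since it can’t tell a relative that from a complementizer and it lumps in which and of which in with the rest, so treat the numbers as back-of-the-envelope only.

    import re
    import sys
    from collections import Counter

    def rough_relative_counts(text: str) -> Counter:
        """Very rough counts of relative-pronoun patterns in a text.

        Heuristic only: 'which' without a preceding comma is treated as
        restrictive, 'which' right after a comma as nonrestrictive, and
        every 'that' is counted even though many are complementizers.
        """
        counts = Counter()
        counts["which, no preceding comma (restrictive?)"] = len(
            re.findall(r"(?<![,\s])\s+which\b", text, flags=re.IGNORECASE))
        counts["which after a comma (nonrestrictive?)"] = len(
            re.findall(r",\s+which\b", text, flags=re.IGNORECASE))
        counts["that (all uses)"] = len(
            re.findall(r"\bthat\b", text, flags=re.IGNORECASE))
        return counts

    if __name__ == "__main__":
        # Usage: python relative_counts.py some_corpus.txt
        with open(sys.argv[1], encoding="utf-8") as f:
            for label, count in rough_relative_counts(f.read()).items():
                print(f"{label}: {count}")

Run something like this over a sample of published American prose and then over unedited writing or transcribed speech, and if the rule really has taken hold only among copy editors, you’d expect the comma-less which count to drop sharply in the edited material and nowhere else.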

So it’s rather disingenuous to say that the language has changed and thus we should accept the that/which rule as a valid part of Standard English. The argument is entirely circular: editors should enforce the rule because editors have been enforcing the rule now for a few decades. The fact that they have been enforcing the rule rather successfully doesn’t tell us whether they should be enforcing the rule.

Of course, that’s the fundamental problem with all prescriptions—sooner or later, you run into the is–ought problem. That is, it’s logically impossible to derive a prescriptive statement (one that tells you what you ought to do) from a descriptive one (one that states what is). Any statement like “This feature has been in use for centuries, so it’s correct” or “Shakespeare and Jane Austen used this feature, so it’s correct” or even “This feature is used by a majority of speakers today, so it’s correct” is technically a logical fallacy.

While acknowledging that nothing can definitively tell us what usage rules we should or shouldn’t follow, I still think we can come to a general understanding of which rules are worth following and which ones aren’t by looking at several different criteria:

  1. Historical use
  2. Modern use
  3. Oral use
  4. Edited written use
  5. Unedited written use
  6. Use by literary greats
  7. Common use

No single criterion is either necessary or sufficient to prove that a rule should be followed, but by looking at the totality of the usage evidence, we can get a good sense of where the rule came from, who uses it and in which contexts they use it, whether use is increasing or decreasing, and so on. So something might not be correct just because Chaucer or Shakespeare or Austen used it, but if something has been in continuous use for centuries by both literary greats and common people in both speech and writing, then it’s hard to maintain that it’s an error.

And if a rule is only followed in modern edited use, as the that/which rule is (and even then, it’s primarily modern edited American use), then it’s likewise hard to insist that this is a valid rule that all English speakers should be following. Again, the fact that editors have been enforcing a rule doesn’t tell us whether they should. Editors are good at learning and following rules, and we’re often good at pointing out holes or inconsistencies in a text or making it clearer and more readable, but this doesn’t mean that we have any special insight into what the grammar of English relative clauses should be, let alone the authority to insist that everyone follow our proposed changes.

So we can’t—or, at least, I think we shouldn’t—simply say that language has changed in this instance and that therefore we should all follow the rule. Language change is not necessarily good or bad, but it’s important to look at who is changing the language and why. If most people are changing the language in a particular way because they find that change genuinely useful, then it seems like a good thing, or at least a harmless thing. But if the change is being imposed by a small group of disproportionately powerful people for dubious reasons, and if the fact that this group has been successful is then used as evidence that the change is justified, then I think we should be skeptical.

If you want the language to change in a particular way, then the burden of proof is on you to demonstrate why you’re right and four hundred million native speakers are wrong. Until then, I’ll continue to tell our intern that what she learned in class was right: either that or which is fine.

Jonathon Owen


5 thoughts on “Is Change Okay or Not?”


    I agree with virtually everything you say. People who insist that “which” shouldn’t be used for restrictive clauses sometimes claim that the so-called rule was made by Fowler, but that’s a misreading of what he said. I can’t check now, but my recollection is that in Modern English Usage he said that the essential difference between restrictive and non-restrictive clauses is the absence or presence (respectively) of a comma before the “which”. He then suggested (not decreed) that using “that” for restrictive clauses might aid clarity. He certainly didn’t say that that was what everyone must do. In practice I do use “that” myself most of the time, but I don’t lose any sleep if others use “which”.


      You’re right that Fowler didn’t invent the rule—the earliest such proposal that I know of was by Alexander Bain in 1891, and he proposed that we reserve that for all restrictive clauses and which or who for all nonrestrictive ones. That is, he proposed not just a that/which rule, but a that/which/who rule.

      But apparently only the that/which part caught on. Fowler promoted it (though he said it would be foolish to pretend that it was actually the practice of the best writers), but it really took off after E. B. White put it in The Elements of Style.


    I’m an English teacher, and while I’m theoretically aware of the that/which alleged-rule, just thinking about it kind of makes my head hurt. I’d have to specifically look it up to tell you which word is supposed to go with which kind of relative clause. And I think that’s the ultimate argument against the rule being legitimate: it’s totally unintuitive. Sure, language changes, but only when the changes make intuitive sense to a critical mass of people, such that they use the revised “rules” without even thinking about them. You can try to force the issue, but…it doesn’t work.

    This is also why I find it unlikely that synthetic pronouns for alternative genders are going to catch on (as opposed to gender-neutral “they,” which is natural and which you find everywhere) in any serious way. I wouldn’t have a problem with it if they did, but I think that they’re really going against the grain, and I have a hard time imagining it happening. I could be wrong!


    Regarding language change coming from above (I really feel a need for scare-quotes around “above” in this instance), it seems to me that the pedants are winning in making people so insecure about “less” that they are now using “fewer” as a hypercorrection, and this is a real change in the making.


    It’s not just that the that/which rule is useless; it is positively harmful. It decrees that you can divide clauses into restrictive and non-restrictive. But English is much more nuanced than that. There are clauses that I would call semi-restrictive; I remember seeing these in Graham Greene. He would use “which” with them, though I’m sure an American copyeditor would change it to “that.” Even if you could determine precisely that a clause is restrictive or non-restrictive, to say that one requires that and the other which is to simply throw style out the window. Which unfortunately is what a lot of the copyeditors’ rules do.
