Arrant Pedantry


I Request You to Read This Post

Several weeks ago, I tweeted about a weird construction that I see frequently at work thanks to our project management system. Whenever someone assigns me to a project, I get an email like the one below:

Hi Jonathon, [Name Redacted] just requested you to work on Editing. It's all yours.

I said that the construction sounded ungrammatical to me—you can ask someone to do something or request that they do it, but not request them to do it. Several people agreed with me, while others said that it makes sense to them if you stress you—they requested me to work on it, not someone else. Honestly, I’m not sure that stress changes anything, since the question is about what kind of complementation the verb request allows. Changing the stress doesn’t change the syntax.

However, Jesse Sheidlower, a former editor for The Oxford English Dictionary, quickly pointed out that the first sense of request in the OED is “to ask (a person), esp. in a polite or formal manner, to do something.” There are citations from around 1485 down to the present illustrating the construction request [someone] to [verb]. (Sense 3 is the request that [someone] [verb] construction, which has been around from 1554 to the present.) Jordan Smith, a linguistics PhD student at Iowa State, also pointed out that The Longman Grammar says that request is attested in the pattern [verb + NP + to-clause], just like ask. He agreed that it sounds odd, though.

So obviously the construction has been around for a while, and it’s apparently still around, but that didn’t explain why it sounds weird to me. I decided to do a little digging in the BYU corpora, and what I found was a little surprising.

The Corpus of Historical American English (COHA) shows a slow decline in the request [someone] to [verb] construction, from 13.71 hits per million words in the 1820s to just 0.2 hits per million words in the first decade of the 2000s.
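(In case the “hits per million words” figure is unfamiliar: it’s just a raw hit count normalized by the size of each decade’s subcorpus, so that decades of different sizes can be compared. Below is a minimal sketch of that arithmetic in Python; the hit counts and subcorpus sizes are invented purely for illustration, since COHA reports the normalized rates directly.)

```python
# Sketch of per-million-word normalization (illustrative numbers only,
# not actual COHA counts).

def per_million(hits, corpus_size):
    """Convert a raw hit count to a rate per million words of text."""
    return hits / corpus_size * 1_000_000

# Hypothetical (raw hits, subcorpus size in words) for two decades.
decades = {
    "1820s": (95, 6_900_000),
    "2000s": (6, 29_500_000),
}

for decade, (hits, size) in decades.items():
    print(f"{decade}: {per_million(hits, size):.2f} hits per million words")
```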

And it isn’t just that we’re using the verb request a lot less now than we were two hundred years ago. Though the verb overall has seen a moderate decline, its curve doesn’t match the much steeper drop for that particular construction.

Even if the construction hasn’t vanished entirely, it’s pretty close to nonexistent in modern published writing—at least in some parts of the world. The Corpus of Global Web-Based English (GloWbE) shows that while it’s mostly gone in nations where English is the most widely spoken first language (the US, Canada, the UK, Ireland, Australia, and New Zealand), it’s alive and well in South Asia (the taller bars in the middle are India, Sri Lanka, Pakistan, and Bangladesh). (Interestingly, the only OED citation for this construction in the last fifty years comes from a book called World Food: India.) To a lesser extent, it also survives in some parts of Africa and Southeast Asia (the two smallish bars at the right are Kenya and Tanzania).

It’s not clear why my work’s project management system uses a construction that is all but extinct in most varieties of English but is still alive and well in South Asia. The company is based in Utah, but it’s possible that they employ people from South Asia or that whoever wrote that text just happens to be among the few speakers of American English who still use it.

Whatever the reason, it’s an interesting example of language change in action. Peter Sokolowski, an editor for Merriam-Webster, likes to say, “Most English speakers accept the fact that the language changes over time, but don’t accept the changes made in their own time.” With apologies to Peter, I don’t think this is quite right. The changes we don’t accept are generally the ones made in our own time, but most changes happen without us really noticing. Constructions like request [someone] to [verb] fade out of use, and no one bemoans their loss. Other changes, like the shift from infinitives to gerunds and the others listed in this article by Arika Okrent, creep in without anyone getting worked up about them. It’s only the tip of the iceberg that we occasionally gripe about, while the vast bulk of language change slips by unnoticed.

This is important because we often conflate change and error—that is, we think that language changes begin as errors that gradually become accepted. For example, Bryan Garner’s entire Language Change Index is predicated on the notion that change is synonymous with error. But many things that are often considered wrong—towards, less with count nouns, which used as a restrictive relative pronoun—are quite old, while the rules forbidding their use are in fact the innovations. It’s perverse to call these “changes that are creeping in” when they’re really old features that are being pushed out. Indeed, the whole purpose of the index isn’t to tell you where a particular use falls on a scale of change, but to tell you how accepted that use is—that is, how much of an error it is.

So the next time you assume that a certain form must be a recent change because it’s disfavored, I request you to reexamine your assumptions. Language change is much more subtle and much more complex than you may think.


Over Has Always Meant More Than. Get Over It.

Last month, at the yearly conference of the American Copy Editors Society, the editors of the AP Stylebook announced that over in the sense of more than was now acceptable. For decades, newspaper copy editors had been changing constructions like over three hundred people to more than three hundred people; now, with a word from AP’s top editors, that rule was being abandoned.

According to Merriam-Webster editor Peter Sokolowski, who was in attendance, the announcement was met with gasps. Editors quickly took to Twitter and to blogs to express their approval or dismay. Some saw it as part of the dumbing-down of the language or as a tacit admission that newspapers no longer have the resources to maintain their standards. Others saw it as the banishment of a baseless superstition that has wasted copy editors’ time without improving the text.

The argument had been that over must refer to spatial relationships and that numerical relationships must use more than. But nobody objects to other figurative uses of over, such as over the weekend or get over it or in over your head or what’s come over you? The rule forbidding the use of over to mean more than was first codified in the 1800s, but over can be found in this sense going back a thousand years or more, in some of the earliest documents written in English.

Not only that, but parallel uses can be found in other Germanic languages, including German, Dutch, and Swedish. (Despite all its borrowings from French, Latin, and elsewhere, English is considered a Germanic language.) There’s nothing wrong with the German Kinder über 14 Jahre (children over 14 years) (to borrow an example from the Collins German-English Dictionary) or the Swedish Över femhundra kom (over five hundred came). This suggests that the use of over to mean more than actually predates English and was inherited from the common ancestor of all the Germanic languages, Proto-Germanic, some two thousand years ago.

Mignon Fogarty, aka Grammar Girl, wrote that “no rationale exists for the ‘over can’t mean more than’ rule.” And in a post on the Merriam-Webster Unabridged blog, Sokolowski gave his own debunking, concluding that “we just don’t need artificial rules that do not promote the goal of clarity.” But none of this was good enough for some people. AP’s announcement caused a rift in the editing staff at Mashable, who debated the rule on the lifestyle blog.

Alex Hazlett argued that the rule “was an arbitrary style decision that had nothing to do with grammar, defensible only by that rationale of last resort: tradition.” Megan Hess, though, took an emotional and hyperbolic tack, claiming that following rules like this prevents the world from slipping into “a Lord of the Flies-esque dystopia.” From there her argument quickly becomes circular: “The distinction is one that distinguishes clean, precise language and attention to detail — and serves as a hallmark of a proper journalism training.” In other words, editors should follow the rule because they’ve been trained to follow the rule, and the rule is simply a mark of clean copy. And how do you know the copy is clean? Because it follows rules like this. As Sokolowski says, this is nothing more than a shibboleth—the distinction serves no purpose other than to distinguish those in the know from everyone else.

It’s also a perfect example of a mumpsimus. The story goes that an illiterate priest in the Middle Ages had learned to recite the Latin Eucharist wrong: instead of sumpsimus (Latin for “we have taken”), he said mumpsimus, which is not a Latin word at all. When someone finally told him that he’d been saying it wrong and that it should be sumpsimus, he responded that he would not trade his old mumpsimus for this person’s new sumpsimus. He didn’t just refuse to change—he refused to recognize that he was wrong and had always been wrong.

But so what if everyone’s been using over this way for longer than the English language has existed? Just because everyone does it doesn’t mean it’s right, right? Well, technically, yes, but let’s flip the question around: what makes it wrong to use over to mean more than? The fact that the over-haters have had such an emotional reaction is telling. It’s surprisingly easy to talk yourself into hating a particular word or phrase and to start judging everyone who allegedly misuses it. And once you’ve developed a visceral reaction to a perceived misuse, it’s hard to be persuaded that your feelings aren’t justified.

We editors take a lot of pride in our attention to language—which usually means our attention to the usage and grammar rules that we’ve been taught—so it can seem like a personal affront to be told that we were wrong and have always been wrong. Not only that, but it can shake our faith in other rules. If we were wrong about this, what else might we have been wrong about? But perhaps rather than priding ourselves on following the rules, we should pride ourselves on mastering them, which means learning how to tell the good rules from the bad.

Learning that you were wrong simply means that now you’re right, and that can only be a good thing.

Update: Parallel uses can also be found in cognates of over in other Indo-European languages. For instance, the Latin super could mean both “above” and “more than”, and so could the Ancient Greek ὑπέρ, or hyper. It’s possible that the development of sense from “above” to “more than” happened independently in Latin, Ancient Greek, and Proto-Germanic, but at the very least we can say that this sort of metaphorical extension of sense is very common and very old. There are no logical grounds for objecting to it.

I also created this helpful (and slightly snarky) timeline of the usage of over and its etymons in English and its ancestor languages.



Distinctions, Useful and Otherwise

In a recent New York Times video interview, Steven Pinker touched on the topic of language change, saying, “I think that we do sometimes lose distinctions that it would be nice to preserve—disinterested to mean ‘impartial’ as opposed to ‘bored’, for example.”

He goes on to make the point that language does not degenerate, because it constantly replenishes itself—a point which I agree with—but that line caught the attention of Merriam-Webster’s Peter Sokolowski, who said, “It’s a useful distinction, but why pick a problematic example?” I responded, “I find it ironic that such a useful distinction is so rarely used. And its instability undermines the claims of usefulness.”

What Mr. Sokolowski was alluding to was the fact that the history of disinterested is more complicated than the simple laments over its loss would indicate. If you’re unfamiliar with the usage controversy, it goes something like this: disinterested originally meant ‘impartial’ or ‘unbiased’, and uninterested originally meant ‘bored’, but now people have used disinterested to mean ‘bored’ so much that you can’t use it anymore, because too many people will misunderstand you. It’s an appealing story that encapsulates prescriptivists’ struggle to maintain important aspects of the language in the face of encroaching decay. Too bad it’s not really true.

I won’t dive too deeply into the history of the two words—the always-excellent Merriam-Webster’s Dictionary of English Usage spends over two pages on the topic, revealing a surprisingly complex history—but suffice it to say that disinterested is, as Peter Sokolowski mildly put it, “a problematic example”. The first definition the OED gives for disinterested is “Without interest or concern; not interested, unconcerned. (Often regarded as a loose use.)” The first citation dates to about 1631. The second definition (the correct one, according to traditionalists) is “Not influenced by interest; impartial, unbiased, unprejudiced; now always, Unbiased by personal interest; free from self-seeking. (Of persons, or their dispositions, actions, etc.)” Its first citation, however, is from 1659. And uninterested was originally used in the “impartial” or “unbiased” senses now attributed to disinterested, though those uses are obsolete.

It’s clear from the OED’s citations that both meanings have existed side by side from the 1600s. So there’s not so much a present confusion of the two words as a continuing, three-and-a-half-century-long confusion. And for good reason, too. The positive form interested is the opposite of both disinterested and uninterested, and yet nobody complains that we can’t use it because readers won’t be sure whether we mean “having the attention engaged” or “being affected or involved”, to borrow the Merriam-Webster definitions. If we can use interested to mean two different things, why do we need two different words to refer to the opposite of those things?

And as my advisor, Don Chapman, has written, “When gauging the usefulness of a distinction, we need to keep track of two questions: 1) is it really a distinction, or how easy is the distinction to grasp; 2) is it actually useful, or how often do speakers really use the distinction.”1 Chapman adds that “often the claim that a distinction is useful seems to rest on little more than this: if the prescriber can state a clear distinction, the distinction is considered to be desirable ipso facto.” He then asks, “But how easy is the distinction to maintain in actual usage?” (151).

From the OED citations, it’s clear that speakers have never been able to fully distinguish between the two words. Chapman also pointed out to me that the two prefixes in question, dis- and un-, do not clearly indicate one meaning or the other. The meanings of the two words come from different meanings of the root interested, not from the prefixes, so the assignment of meaning to form is arbitrary and must simply be memorized, which makes the distinction difficult for many people to learn and maintain. And even those who do learn the distinction do not employ it very frequently. I know this is anecdotal, but it seems to me that disinterested is far more often mentioned than it is used. I can’t remember the last time I spotted a genuine use of disinterested in the wild.

I think it’s time we dispel the myth that disinterested and uninterested epitomize a lost battle to preserve useful distinctions. The current controversy over its use is not indicative of current laxness or confusion, because there was never a time when people managed to fully distinguish between the two words. If anything, disinterested epitomizes the prescriptivist tendency to elegize the usage wars. The typical discussion of disinterested is often light on historical facts and heavy on wistful sighs over how we can no longer use a word that was perhaps never as useful as we would like to think it was.

Notes

1. Don Chapman, “Bad Ideas in the History of English Usage,” in Studies in the History of the English Language 5: Variation and Change in English Grammar and Lexicon: Contemporary Approaches, ed. Robert A. Cloutier, Anne Marie Hamilton-Brehm, and William A. Kretzschmar Jr. (New York: Walter de Gruyter, 2010), 151.