Arrant Pedantry

Now on Visual Thesaurus: “Electrocution: A Shocking Misuse?”

I have a new post up on Visual Thesaurus about the use, misuse, and history of the word electrocute. Some usage commentators today insist that it be used only to refer to death by electric shock; that is, you can’t say you’ve been electrocuted if you lived to tell the tale. But the history, unsurprisingly, is more complicated: there have been disputes about the word since its birth.

As always, the article is for subscribers only, but a subscription costs a paltry $2.95 a month or $19.95 a year (and would make a great gift for the word lover in your life). Check it out.

Yes, Irregardless Is a Word

My last post, “12 Mistakes Nearly Everyone Who Writes about Grammar Mistakes Makes”, drew a lot of comments, some supportive and some critical. But no point drew as much ire as my claim that irregardless is a word. Some stated flatly, “Irregardless is not a word.” One ignorantly demanded, “Show me a dictionary that actually contains that word.” (I could show him several.) Still others argued that it was a double negative, that it was logically and morphologically ill-formed and thus had no meaning. One commenter said that “with the negating preface [prefix] ‘ir-’ and the negating suffix ‘-less’, it is a double negative” and that “it is not a synonym with ‘regardless’.” Another was even cleverer, saying, “The prefix ir-, meaning not, changes the meaning of the word regardless, so not only is it not a standard word, but it’s also misused in nearly all cases.” But these arguments still miss the point: irregardless is indeed a word, and it means the same thing as regardless.

In my last post I argued that there’s a clear difference between a word like irregardless and a nonword like flirgle. By any objective criterion, irregardless is a word. It has an established form and meaning, it’s used in speech and occasionally in writing, and it’s even found in reputable dictionaries, including Merriam-Webster’s Collegiate Dictionary and The Oxford English Dictionary (though it is, quite appropriately, labeled nonstandard). We can identify its part of speech (it’s an adverb) and describe how it’s used. By contrast, though, consider flirgle. You don’t know what its part of speech is or how to use it, and if I were to use it in a sentence, you wouldn’t know what it meant. This is because it’s just something I made up by stringing some sounds together. But when someone uses irregardless, you know exactly what it means, even if you want to pretend otherwise.

This is because words get their wordhood not from etymology or logic or some cultural institution granting them official status, but by convention. It doesn’t matter that nice originally meant “ignorant” or that contact was originally only a noun or that television is formed from a blend of Greek and Latin roots; what matters is how people use these words now. This makes some people uncomfortable because it sounds like anarchy, but it’s more like the ultimate democracy or free market. We all want to understand one another and be understood, so it’s in our mutual interest to communicate in ways that are understandable. Language is a self-regulating system guided by the invisible hand of its users’ desire to communicate—not that this stops people from feeling the need for overt regulation.

One commenter, the same who said, “Irregardless is not a word,” noted rather aptly, “There is absolutely no value to ‘irregardless’ except to recognize people who didn’t study.” Exactly. There is nothing wrong with its ability to communicate; it’s only the word’s metacommunication—that is, what it communicates about its user—that is problematic. To put it a different way, the problem with irregardless is entirely social: if you use it, you’ll be thought of as uneducated, even though everyone can understand you just fine.

On Google Plus, my friend Rivka said, “Accepting it as a word is the first part of the slippery slope.” This seems like a valid fear, but I believe it is misplaced. First of all, we need to be clear about what it means to accept irregardless as a word. I accept that it’s a word, but this does not mean that I find the word acceptable. I can accept that people do all kinds of things that I don’t like. But the real problem isn’t what we mean by accept; it’s what we mean by word. When people say that something isn’t a word, they aren’t really making a testable claim about the objective linguistic status of the word; they’re making a sociolinguistic evaluation of the word. They may say that it’s not a word, but they really mean that it’s a word that’s not allowed in Standard English. This is because we think of Standard English as the only legitimate form of English. We think that the standard has words and grammar, while nonstandard dialects have nonwords and broken grammar, or no grammar at all. Yes, it’s important to recognize and teach the difference between Standard English and nonstandard forms, but it’s also important to be clear about the difference between facts about the language and our feelings about the language.

But the irregardless-haters can also take heart: the word has been around for at least a century now, and although many other new words have been coined and become part of Standard English in that time, irregardless shows no signs of moving towards acceptability. Most people who write for publication are well aware of the stigma attached to it, and even if they aren’t, few copyeditors are willing to let it into print. It’s telling that of the Oxford English Dictionary’s eight citations of the word, two merely cite the word in other dictionaries, three more are mentions or citations in linguistics or literary journals, and one more appears to be using the word ironically. We talk about the word irregardless—mostly just to complain about it—far more than we actually use it.

So yes, irregardless is a word, even though it’s nonstandard. You don’t have to like it, and you certainly don’t have to use it, but you also don’t have to worry about it becoming acceptable anytime soon.

This post also appears on Huffington Post.

Hanged and Hung

The distinction between hanged and hung is one of the odder ones in the language. I remember learning in high school that people are hanged, pictures are hung. There was never any explanation of why it was so; it simply was. It was years before I learned the strange and complicated history of these two words.

English has a few pairs of related verbs that are differentiated by their transitivity: lay/lie, rise/raise, and sit/set. Transitive verbs take objects; intransitive ones don’t. In each of these pairs, the intransitive verb is strong, and the transitive verb is weak. Strong verbs inflect for the preterite (simple past) and past participle forms by means of a vowel change, such as sing–sang–sung. Weak verbs add the -(e)d suffix (or sometimes just a -t or nothing at all if the word already ends in -t). So lie–lay–lain is a strong verb, and lay–laid–laid is weak. Note that the subject of one of the intransitive verbs becomes the object when you use its transitive counterpart. The book lay on the floor but I laid the book on the floor.

Historically hang belonged with these pairs, and it ended up in its current state through the accidents of sound change and history. It was originally two separate verbs (the Oxford English Dictionary actually says it was three—two Old English verbs and one Old Norse verb—but I don’t want to go down that rabbit hole) that came to be pronounced identically in their present-tense forms. They still retained their own preterite and past participle forms, though, so at one point in Early Modern English hang–hung–hung existed alongside hang–hanged–hanged.

Once the two verbs started to collapse together, the distinction started to become lost too. Just look at how much trouble we have keeping lay and lie separate, and they overlap in only one form: lay, which is both the present tense of lay and the past tense of lie. With identical present tenses, hang/hang began to look like any other word with a choice between strong and weak past forms, like dived/dove or sneaked/snuck. The transitive/intransitive distinction between the two effectively disappeared, and hung won out as the preterite and past participle form.

The weak transitive hanged didn’t completely vanish, though; it stuck around in legal writing, which tends to use a lot of archaisms. Because it was only used in legal writing in the sense of hanging someone to death (with the poor soul as the object of the verb), it picked up the new sense that we’re now familiar with, whether or not the verb is transitive. Similarly, hung is used for everything but people, whether or not the verb is intransitive.

Interestingly, German has mostly hung on to the distinction. Though the German verbs both merged in the present tense into hängen, the past forms are still separate: hängen–hing–gehangen for intransitive forms and hängen–hängte–gehängt for transitive. Germans would say the equivalent of I hanged the picture on the wall and The picture hung on the wall—none of this nonsense about only using hanged when it’s a person hanging by the neck until dead.

The surprising thing about the distinction in English is that it’s observed (at least in edited writing) so faithfully. Usually people aren’t so good at honoring fussy semantic distinctions, but here I think the collocates do a lot of the work of selecting one word or the other. Searching for collocates of both hanged and hung in COCA, we find the following words:

hanged:
himself
man
men
herself
themselves
murder
convicted
neck
effigy
burned

hung:
up
phone
air
wall
above
jury
walls
hair
ceiling
neck

The hanged collocates are pretty clearly all about hanging people, whether by suicide, as punishment for murder, or in effigy. (The collocations with burned were all about hanging and burning people or effigies.) The collocates for hung show no real pattern; it’s simply used for everything else. (The collocations with neck were not about hanging by the neck but about things being hung from or around the neck.)
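A collocate search like this can be approximated outside COCA’s interface. The sketch below is mine, not COCA’s actual query machinery, and the concordance lines are invented stand-ins for real corpus hits; it simply tallies the words that appear within a few words of a node word:

```python
from collections import Counter

def collocates(lines, node, window=4):
    """Count words occurring within `window` words of `node`
    across a list of concordance lines (case-insensitive)."""
    counts = Counter()
    for line in lines:
        words = line.lower().split()
        for i, w in enumerate(words):
            if w == node:
                lo, hi = max(0, i - window), i + window + 1
                # count the context words, skipping the node itself
                counts.update(words[lo:i] + words[i + 1:hi])
    return counts

# Toy examples standing in for real concordance lines:
sample = [
    "the man hanged himself in his cell",
    "he was convicted of murder and hanged",
    "she hanged herself rather than surrender",
]
print(collocates(sample, "hanged").most_common(5))
```

Even on three made-up lines, the people-hanging vocabulary (himself, herself, murder, convicted) surfaces immediately, which is essentially what the COCA lists above show at scale.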

So despite what I said about this being one of the odder distinctions in the language, it seems to work. (Though I’d like to know to what extent, if any, the distinction is an artifact of the copy editing process.) Hung is the general-use word; hanged is used when a few very specific and closely related contexts call for it.

The Enormity of a Usage Problem

Recently on Twitter, Mark Allen wrote, “Despite once being synonyms, ‘enormity’ and ‘enormousness’ are different. Try to keep ‘enormity’ for something evil or outrageous.” I’ll admit right off that this usage problem interests me because I didn’t learn about the distinction until a few years ago. To me, they’re completely synonymous, and the idea of using enormity to mean “an outrageous, improper, vicious, or immoral act” and not “the quality or state of being huge”, as Merriam-Webster defines it, seems almost quaint.

Of course, such usage advice presupposes that people are using the two words synonymously; if they weren’t, there’d be no reason to tell them to keep the words separate, so the assertion that they’re different is really an exhortation to make them different. Given that, I had to wonder how different they really are. I turned to Mark Davies’ Corpus of Contemporary American English to get an idea of how often enormity is used in the sense of great size rather than outrageousness or immorality. I looked at the first hundred results from the keyword-in-context option, which randomly samples the corpus, and tried to determine which of the four Merriam-Webster definitions was being used. For reference, here are the four definitions:

1 : an outrageous, improper, vicious, or immoral act <the enormities of state power — Susan Sontag> <enormities too juvenile to mention — Richard Freedman>
2 : the quality or state of being immoderate, monstrous, or outrageous; especially : great wickedness <the enormity of the crimes committed during the Third Reich — G. A. Craig>
3 : the quality or state of being huge : immensity <the enormity of the universe>
4 : a quality of momentous importance or impact <the enormity of the decision>

In some cases it was a tough call; for instance, when someone writes about the enormity of poverty in India, enormity has a negative connotation, but it doesn’t seem right to substitute a word like monstrousness or wickedness. It seems that the author simply means the size of the problem. I tried to use my best judgement based on the context the corpus provides, but in some cases I weaseled out by assigning a particular use to two definitions. Here’s my count:

1: 1
2: 19
2/3: 3
3: 67
3/4: 1
4: 9

By far the most common use is in the sense of “enormousness”; the supposedly correct senses of great wickedness (definitions 1 and 2) are used just under a quarter of the time. So why did Mr. Allen say that enormity and enormousness were once synonyms? Even the Oxford English Dictionary marks the “enormousness” sense as obsolete and says, “Recent examples might perh. be found, but the use is now regarded as incorrect.” Perhaps? It’s clear from the evidence that it’s still quite common—about three times as common as the prescribed “monstrous wickedness” sense.
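The arithmetic behind those two comparisons is easy to check against the tally above (counting the ambiguous 2/3 cases toward the “wickedness” senses, as the “just under a quarter” figure implies):

```python
# Tally of 100 random COCA hits for "enormity", by M-W sense number
counts = {"1": 1, "2": 19, "2/3": 3, "3": 67, "3/4": 1, "4": 9}

total = sum(counts.values())                            # 100 samples
wickedness = counts["1"] + counts["2"] + counts["2/3"]  # senses 1-2: 23
size = counts["3"]                                      # sense 3 alone: 67

print(f"{wickedness / total:.0%} wickedness")   # just under a quarter
print(f"{size / wickedness:.1f}x as common")    # about three times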

It’s true that the sense of immoderateness or wickedness came along before the sense of great size. The first uses as recorded in the OED are in the sense of “a breach of law or morality” (1477), “deviation from moral or legal rectitude” (1480), “something that is abnormal” (a1513), and “divergence from a normal standard or type” (a1538). The sense of “excess in magnitude”—the one that the OED marks as obsolete and incorrect—didn’t come along until 1792. In all these senses the etymology is clear: the word comes from enorm, meaning “out of the norm”.

As is to be expected, Merriam-Webster’s Dictionary of English Usage has an excellent entry on the topic. It notes that many of the uses of enormity considered objectionable carry shades of meaning or connotations not shown by enormousness:

Quite often enormity will be used to suggest a size that is beyond normal bounds, a size that is unexpectedly great. Hence the notion of monstrousness may creep in, but without the notion of wickedness. . . .

In many instances the notion of great size is colored by aspects of the first sense of enormity as defined in Webster’s Second. One common figurative use blends together notions of immoderateness, excess, and monstrousness to suggest a size that is daunting or overwhelming.

Indeed, it’s the blending of senses that made it hard to categorize some of the uses that I came across in COCA. Enormousness does not seem to be a fitting replacement for those blended or intermediate senses, and, as MWDEU notes, it’s never been a popular word anyway. Interestingly, MWDEU also notes that “the reasons for stigmatizing the size sense of enormity are not known.” Perhaps it became rare in the 1800s, when the OED marked it obsolete, and the rule was created before the sense enjoyed a resurgence in the twentieth century. Whatever the reason, I don’t think it makes much sense to condemn the more widely used sense of a word just because it’s newer or was rare at some point in the past. MWDEU sensibly concludes, “We have seen that there is no clear basis for the ‘rule’ at all. We suggest that you follow the writers rather than the critics: writers use enormity with a richness and subtlety that the critics have failed to take account of. The stigmatized sense is entirely standard and has been for more than a century and a half.”

Funner Grammar

As I said in the addendum to my last post, maybe I’m not so ready to abandon the technical definition of grammar. In a recent post on Copyediting, Andrea Altenburg criticized the word funner in an ad for Chuck E. Cheese as “improper grammar”, and my first reaction was “That’s not grammar!”

That’s not entirely accurate, of course, as Matt Gordon pointed out to me on Twitter. The objection to funner was originally grammatical, and the Copyediting post does make an appeal to grammar. The argument goes like this: fun is properly a noun, not an adjective, and as a noun, it can’t take comparative or superlative degrees—no funner or funnest.

This seems like a fairly reasonable argument—if a word isn’t an adjective, it can’t inflect like one—but it isn’t the real argument. First of all, it’s not really true that fun was originally a noun. As Ben Zimmer explains in “Dear Apple: Stop the Funnification”, the noun fun arose in the late seventeenth century and was labeled by Samuel Johnson in the mid-1700s “as ‘a low cant word’ of the criminal underworld.” But the earliest citation for fun is as a verb, fourteen years earlier.

As Merriam-Webster’s Dictionary of English Usage notes, “A couple [of usage commentators] who dislike it themselves still note how nouns have a way of turning into adjectives in English.” Indeed, this sort of functional shift—also called zero derivation or conversion by linguists because it changes the part of speech without prefixation or suffixation—is quite common in English. English lacks case endings and has little in the way of verbal endings, so it’s quite easy to change a word from one part of speech to another. The transformation of fun from a verb to a noun to an inflected adjective came slowly but surely.

As this great article explains, shifts in function or meaning usually happen in small steps. Once fun was established as a noun, you could say things like We had fun. This is unambiguously a noun—fun is the object of the verb have. But then you get constructions like The party was fun. This is structurally ambiguous—both nouns and adjectives can go in the slot after was.

This paves the way to analyze fun as an adjective. It then moved into attributive use, directly modifying a following noun, as in fun fair. Nouns can do this too, so once again the structure was ambiguous, but it was evidence that fun was moving further in the direction of becoming an adjective. In the twentieth century it started to be used in more unambiguously adjectival roles. MWDEU says that this accelerated after World War II, and Mark Davies’ COHA shows that it especially picked up in the last twenty years.

Once fun was firmly established as an adjective, the inflected forms funner and funnest followed naturally. There are only a handful of hits for either in COCA, which attests to the fact that they’re still fairly new and relatively colloquial. But let’s get back to Altenburg’s post.

She says that fun is defined as a noun and thus can’t be inflected for comparative or superlative forms, but then she admits that dictionaries also define fun as an adjective with the forms funner and funnest. But she waves away these definitions by saying, “However, dictionaries are starting to include more definitions for slang that are still not words to the true copyeditor.”

What this means is that she really isn’t objecting to funner on grammatical grounds (at least not in the technical sense); her argument simply reduces to an assertion that funner isn’t a word. But as Stan Carey so excellently argued, “‘Not a word’ is not an argument”. And even the grammatical objections are eroding; many people now simply assert that funner is wrong, even if they accept fun as an adjective, as Grammar Girl says here:

Yet, even people who accept that “fun” is an adjective are unlikely to embrace “funner” and “funnest.” It seems as if language mavens haven’t truly gotten over their irritation that “fun” has become an adjective, and they’ve decided to dig in their heels against “funner” and “funnest.”

It brings to mind the objection against sentential hopefully. Even though there’s nothing wrong with sentence adverbs or with hopefully per se, it was a new usage that drew the ire of the mavens. The grammatical argument against it was essentially a post hoc justification for a ban on a word they didn’t like.

The same thing has happened with funner. It’s perfectly grammatical in the sense that it’s a well-formed, meaningful word, but it’s fairly new and still highly informal and colloquial. (For the record, it’s not slang, either, but that’s a post for another day.) If you don’t want to use it, that’s your right, but stop saying that it’s not a word.

It’s All Grammar—So What?

It’s a frequent complaint among linguists that laypeople use the term grammar in such a loose and unsystematic way that it’s more or less useless. They say that it’s overly broad, encompassing many different types of rules, and that it allows people to confuse things as different as syntax and spelling. They insist that spelling, punctuation, and ideas such as style or formality are not grammar at all, that grammar is really just the rules of syntax and morphology that define the language.

Arnold Zwicky, for instance, has complained that grammar as it’s typically used refers to nothing more than a “grab-bag of linguistic peeve-triggers”. I think this is an overly negative view; yes, there are a lot of people who peeve about grammar, but I think that most people, when they talk about grammar, are thinking about how to say things well or correctly.

Some people take linguists’ insistence on the narrower, more technical meaning of grammar as a sign of hypocrisy. After all, they say, with something of a smirk, shouldn’t we just accept the usage of the majority? If almost everyone uses grammar in a broad and vague way, shouldn’t we consider that usage standard? Linguists counter that this really is an important distinction, though I think it’s fair to say that they have a personal interest here; they teach grammar in the technical sense and are dismayed when people misunderstand what they do.

I’ve complained about this myself, but I’m starting to wonder whether it’s really something to worry about. (Of course, I’m probably doubly a hypocrite, what with all the shirts I sell with the word grammar on them.) After all, we see similar splits between technical and popular terminology in a lot of other fields, and they seem to get by just fine.

Take the terms fruit and vegetable, for instance. In popular use, fruits are generally sweeter, while vegetables are more savory or bitter. And while most people have probably heard the argument that tomatoes are actually fruits, not vegetables, they might not realize that squash, eggplants, peppers, peas, green beans, nuts, and grains are fruits too, at least by the botanical definition. And vegetable doesn’t even have a botanical definition—it’s just any part of a plant (other than fruits or seeds) that’s edible. It’s not a natural class at all.

In a bit of editorializing, the Oxford English Dictionary adds this note after its first definition of grammar:

As above defined, grammar is a body of statements of fact—a ‘science’; but a large portion of it may be viewed as consisting of rules for practice, and so as forming an ‘art’. The old-fashioned definition of grammar as ‘the art of speaking and writing a language correctly’ is from the modern point of view in one respect too narrow, because it applies only to a portion of this branch of study; in another respect, it is too wide, and was so even from the older point of view, because many questions of ‘correctness’ in language were recognized as outside the province of grammar: e.g. the use of a word in a wrong sense, or a bad pronunciation or spelling, would not have been called a grammatical mistake. At the same time, it was and is customary, on grounds of convenience, for books professedly treating of grammar to include more or less information on points not strictly belonging to the subject.

There are a few points here to consider. For many years now, the definition of grammar has not been limited solely to syntax and morphology. Once it started branching out into notions of correctness, it made sense to treat grammar, usage, spelling, and pronunciation together. From there it’s a short leap to calling the whole collection grammar, since there isn’t really another handy label. And since few people are taught much in the way of syntax and morphology unless they’re majoring in linguistics, it’s really no surprise that the loose sense of grammar predominates. I’ll admit, however, that it’s still a little exasperating to see lists of grammar rules that everyone gets wrong that are just spelling rules or, at best, misused words.

The root of the problem is that laypeople use words in ways that are useful and meaningful to them, and these ways don’t always jibe with scientific facts. It’s the same thing with grammar; laypeople use it to refer to language rules in general, especially the ones they’re most conscious of, which tend to be the ones that are the most highly regulated—usage, spelling, and style. Again, issues of syntax, morphology, semantics, usage, spelling, and style don’t constitute a natural class, but it’s handy to have a word that refers to the aspects of language that most people are conscious of and concerned with.

I think there still is a problem, though, and it’s that most people generally have a pretty poor understanding of things like syntax, morphology, and semantics. Grammar isn’t taught much in schools anymore, so many people graduate from high school and even college without much of an understanding of grammar beyond spelling and mechanics. I got out of high school without knowing anything more advanced than prepositional phrases. My first grammar class in college was a bit of a shock, because I’d never even learned about things like the passive voice or dependent clauses before that point, so I have some sympathy for those people who think that grammar is mostly just spelling and punctuation with a few minor points of usage or syntax thrown in.

So what’s the solution? Well, maybe I’m just biased, but I think it’s to teach more grammar. I know this is easier said than done, but I think it’s important for people to have an understanding of how language works. A lot of people are naturally interested in or curious about language, and I think we do those students a disservice if all we teach them is never to use infer for imply and to avoid the passive voice. Grammar isn’t just a set of rules telling you what not to do; it’s also a fascinatingly complex and mostly subconscious system that governs the singular human gift of language. Maybe we just need to accept the broader sense of grammar and start teaching people all of what it is.

Addendum: I just came across a blog post criticizing the word funner as bad grammar, and my first reaction was “That’s not grammar!” It’s always easier to preach than to practice, but my reaction has me reconsidering my laissez-faire attitude. While it seems handy to have a catch-all term for language errors, regardless of what type they are, it also seems handy—probably more so—to distinguish between violations of the regulative rules and constitutive rules of language. But this leaves us right where we started.

The Data Is In, pt. 2

In the last post, I said that the debate over whether data is singular or plural is ultimately a question of how we know whether a word is singular or plural, or, more accurately, whether it is count or mass. To determine whether data is a count or a mass noun, we’ll need to answer a few questions. First—and this one may seem so obvious as to not need stating—does it have both singular and plural forms? Second, does it occur with cardinal numbers? Third, what kinds of grammatical agreement does it trigger?

Most attempts to settle the debate point to the etymology of the word, but this is an unreliable guide. Some words begin life as plurals but become reanalyzed as singulars or vice versa. For example, truce, bodice, and to some extent dice and pence were originally plural forms that have been made into singulars. As some of the posts I linked to last time pointed out, agenda was also a Latin plural, much like data, but it’s almost universally treated as a singular now, along with insignia, opera, and many others. On the flip side, cherries and peas were originally singular forms that were reanalyzed as plurals, giving rise to the new singular forms cherry and pea.

So obviously etymology alone cannot tell us what a word should mean or how it should work today, but then again, any attempt to say what a word ought to mean ultimately rests on one logical fallacy or another, because you can’t logically derive an ought from an is. Nevertheless, if you want to determine how a word really works, you need to look at real usage. Present usage matters most, but historical usage can also shed light on such problems.

Unfortunately for the “data is plural” crowd, both present and historical usage are far more complicated than most people realize. The earliest citation in the OED for either data or datum is from 1630, but it’s just a one-word quote, “Data.” The next citation is from 1645 for the plural count noun “datas” (!), followed by the more familiar “data” in 1646. The singular mass noun appeared in 1702, and the singular count noun “datum” didn’t appear until 1737, roughly a century later. Of course, you always have to take such dates with a grain of salt, because any of them could be antedated, but it’s clear that even from the beginning, data’s grammatical number was in doubt. Some writers used it as a plural, some used it as a singular with the plural form “datas”, and apparently no one used its purported singular form “datum” for another hundred years.

It appears that historical English usage doesn’t help much in settling the matter, though it does make a few things clear. First, there has been considerable variation in the perceived number of data (mass, singular count, or plural count) for over 350 years. Second, the purported singular form, datum, was apparently absent from English for almost a hundred years and continues to be relatively rare today. In fact, in Mark Davies’ COCA, “data point” slightly outnumbers “datum”, and most of the occurrences of “datum” are not the traditional singular form of data but other specialized uses. This is the first strike against data as a plural; count nouns are supposed to have singular forms, though there are a handful of words known as pluralia tantum, which occur only in the plural. I’ll get to that later.

So data doesn’t really seem to have a singular form. At least you can still count data, right? Well, apparently not. Nearly all of the hits in COCA for “[mc*] data” (meaning a cardinal number followed by the word data) are for things like “two data sets” or “74 data points”. It seems that no one who uses data as a plural count noun ever bothers to count their data, or when they do, they revert to using “data” as a mass noun to modify a normal count noun like “points”. Strike two, and this is a big one. The Cambridge Grammar of the English Language gives use with cardinal numbers as the primary test of countability.
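The countability test itself is mechanical enough to sketch. The patterns below are my own rough approximation, not COCA’s query syntax, and the example sentences are invented rather than actual corpus hits; the idea is simply to separate hits where a number quantifies a following count noun (with data as a mass-noun modifier) from hits where data itself is being counted:

```python
import re

# Number + "data" + a likely plural head noun ("points", "sets"):
# here data is acting as a mass-noun modifier, not being counted.
MODIFIER = re.compile(r"\b(\d+|two|three|four)\s+data\s+\w+s\b", re.I)

# Number + "data" with no plural noun following: data itself is
# being counted as a plural -- vanishingly rare in real corpora.
COUNTED = re.compile(r"\b(\d+|two|three|four)\s+data\b(?!\s+\w+s\b)", re.I)

examples = [
    "We collected 74 data points last week.",
    "The study used two data sets.",
    "Only three data were excluded.",   # rarely attested in practice
]
for s in examples:
    kind = ("modifier" if MODIFIER.search(s)
            else "counted" if COUNTED.search(s)
            else "neither")
    print(kind, "-", s)
```

Run over real concordance lines, nearly everything lands in the “modifier” bucket, which is exactly the pattern described above: people count data points and data sets, not data.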

Data does better when it comes to grammatical agreement, though this is not as positive as it may seem. It’s easy enough to find constructions like as these few data show, but it’s just as easy to find constructions like there is very little data. And when the word fails the first two tests, the results here seem suspect. Aren’t people simply forcing the word data to behave like a plural count noun? As this wonderfully thorough post by Norman Gray points out (seriously, read the whole thing), “People who scrupulously write ‘data’ as a plural are frequently confused when it comes to more complicated sentences”, writing things like “What is HEP data? The data themselves…”. The urge to treat data as a singular mass noun—because that’s how it behaves—is so strong that it takes real effort to make it seem otherwise.

It seems that if data really is a plural noun, it’s a rather defective one. As I mentioned earlier, it’s possible that it’s some sort of plurale tantum, but even this conclusion is unsatisfying. Many pluralia tantum in English are words that refer to things made of two halves, like scissors or tweezers, but there are others like news or clothes. You can’t talk about one new or one clothe (though clothes was originally the plural of cloth). You also usually can’t talk about numbers of such things without using an additional counting word or paraphrasing. Thus we have news items or articles of clothing.

Similarly, you can talk about data points or points of data, but at best this undermines the idea that data is an ordinary plural count noun. But language is full of exceptions, right? Maybe data is just especially exceptional. After all, as Robert Lane Greene said in this post, “We have a strong urge to just have language behave, but regular readers of this column know that, as the original Johnson knew, it just won’t.”

I must disagree. The only thing that makes data exceptional is that people have gone to such great lengths to try to get it to act like a plural, but it just isn’t working. Its irregularity is entirely artificial, and there’s no purpose for it except a misguided loyalty to the word’s Latin roots. I say it’s time to stop the act and just let the word behave—as a mass noun.

By

The Data Is In, pt. 1

Lately there has been a spate of blog posts on the question of whether data is a singular or a plural noun. Surprisingly, most of them come down on the side of saying that it can be singular—except when it’s plural. Although saying that it can be singular is refreshingly open-minded, I’ve still got a few problems with the facts and reasoning that led them to that conclusion, as well as the wishy-washiness of saying that it’s singular except when it isn’t.

The first post, “Is Data Is, or Is Data Ain’t, a Plural?”, came from the Wall Street Journal, and it took what Robert Lane Greene of the Economist blog Johnson called “an unusually fence-sitting position”: although they say that they “hereby join the majority” by accepting it as either singular or plural, they predict that “the plural will continue to dominate in our prose”. And they give this head-scratching reasoning:

Singular verbs now are often used to refer to collections of information: Little data is available to support the conclusions.

Otherwise, generally continue to use the plural: Data are still being collected.

Isn’t all data—whether you think of it as a count or a mass noun—“collections of information”? Just because something’s in a collection doesn’t mean it’s singular. For example, if I had an extensive rock collection, you probably wouldn’t say that I had a lot of rock, though I suppose you could; you’d probably say that I have a lot of rocks. The number really depends on the way we perceive the things in the collection, not on the fact that it’s in a collection. But if that wasn’t confusing enough, they give this unreliable test of data’s number:

As a singular/plural test, try to substitute statistics for data: It doesn’t work in the first case — “little statistics is available” — so the singular “is” fails to pass muster. The substitution does work in the second case — “statistics are still being collected” — so the plural “are” passes muster. (quotation marks added for clarity)

Doesn’t this test simply tell you that data should always be plural? In what case would the singular is ever pass muster? Either I’m missing something important about how you’re supposed to use this substitution test or it’s simply broken.

Next came this post on the Guardian’s Datablog. Sadly, it’s even more muddled than the Wall Street Journal post, and it’s depressingly light on data. It simply asserts, without examination,

Strictly-speaking, data is a plural term. Ie, if we’re following the rules of grammar, we shouldn’t write “the data is” or “the data shows” but instead “the data are” or “the data show”.

But despite further assertions that data is “strictly a plural”, the Guardian style guide says, “Data takes a singular verb”, though they correctly note that (virtually) “no one ever uses ‘agendum’ or ‘datum’”. But this doesn’t make much sense; if it’s plural, why does it take a singular verb? And if it takes a singular verb, is it really plural?

The Guardian post also linked to this National Geographic post from a few years ago, which says much the same thing but somehow manages to be even more muddled. It starts off badly by saying that “data is often used as a collective noun referring to information, statistics, and the like”. Here they mean “mass noun”, not “collective noun”. Note that the Wikipedia articles each say at the top that these terms should not be confused. But aside from this basic mistake, note how it seems to contradict the Wall Street Journal post, which says that singular verbs are used for collections of information.

I wondered if this was just a simple error in the National Geographic post; from context, I would have expected the so-called “collective” form to use a singular verb. But in the next paragraph they say that their style is to use data as a plural when “referring to a body of facts, figures, and such.”

The post gets even more confusing, pointing out some of National Geographic‘s supposed errors and then saying that both the singular and plural are considered standard. If they’re both standard, then how are their examples errors? The post ends with a red herring about avoiding confusion and the bizarre statement, “I’d rather not box writers into a singular form.” So why box them into a plural form? If there’s a distinction to be made, even a subtle one, between data as a mass noun and data as a singular noun, why not encourage it? Why whitewash over it by insisting that data always be plural?

Ultimately, though, this whole debate rests on one question: how do we know whether a word is plural or singular? And that’s what I’ll tackle next time.

Read part 2 here.

By

No Dice

If you’ve ever had to learn a foreign language, you may have struggled to memorize plural forms of nouns. German, for example, has about half a dozen ways of forming plurals, and it’s a chore to remember which kind of plural each noun takes. English, by comparison, is ridiculously easy. Here’s how it works for nearly every English noun: add -s to the end. Sometimes you need to insert an e before the s, and sometimes you need to change a preceding y to ie, but that’s the rule in a nutshell.

Of course, there are still plenty of exceptions: a couple that end in -en (oxen and the strange double plural children), a handful of umlaut plurals (man–men, foot–feet, mouse–mice, etc.), some uninflected plurals (usually for domesticated or game animals, such as sheep, deer, and so on), and a plethora of foreign borrowings (particularly from Latin and Greek) that often follow rules from their donor languages but occasionally don’t. There are a few other oddballs—like person–people, for example—but nearly every English count noun fits into one of these categories.

But there’s one plural that doesn’t fit into any of these categories, because it’s been caught for centuries in a strange limbo between count nouns, which take plural forms, and mass nouns, which don’t. It’s dice. If you need a refresher, mass nouns generally refer to things that are not discrete, such as milk or oil, though some refer to things that are made of discrete pieces “whose individual identities are not usually important to us,” as Arnold Zwicky put it in this Language Log post—words like corn or rice. You could count the individual grains or kernels if you wanted to, but why would you ever want to?

And this is how dice slipped through the cracks of language change. Originally, die was a regular noun that formed its plural by adding an s sound to the end. (For the moment, let’s leave aside the issue of spelling, because Middle and Early Modern English spelling was anything but standard.) At some point in the history of English, the final -s in plurals was voiceless, meaning that it was always pronounced with an s sound, not a z sound. But then that changed, probably sometime in the 1500s, so that the final -s was always voiced—that is, pronounced as a z—unless it followed a voiceless sound. Strangely, this sound change seems to have affected only the plural and possessive -s endings and not other word-final s’s.

But around that time, we start seeing the plural of die, when referring to those little cubes with pips used for games and whatnot, spelled as dice (and similar forms). In Modern English spelling, the final -s on a plural can be either voiced or voiceless, depending on the preceding word, but -ce is always voiceless. As the regular plural ending was becoming voiced for many, many words, it remained voiceless in dice. Why?

Well, apparently because people had stopped thinking of it as a plural and started thinking of it as a mass noun, much like corn and rice, so they stopped seeing the s sound on the end as the plural marker and started perceiving it as simply part of the word. Singular dice can be found back to the late 1300s, and when the sound change came along in the 1500s and voiced most plural -s endings, dice was left behind, with its spelling altered to show that it was unequivocally voiceless. In other senses of the word, die was still thought of as a regular count noun, so its plural forms ended up as dies.*

Dice wasn’t the only word passed over in this way, though; truce (originally the plural of true, meaning “pledge” or “oath”), bodice (plural of body), and pence (a contracted plural form of penny) come to us the same way. Speakers subconsciously reanalyzed these words as mass nouns or singular count nouns, so their final s sounds stayed voiceless. Similarly, once, twice, and thrice were originally genitive forms, but they ceased to be thought of as such and consequently retained their voiceless sounds, respelled with ce.

But the strange thing is that whereas the words mentioned above made the transition to mass nouns or new singular count nouns, usage of dice has been split for centuries. We’ve never fully made the switch to thinking of dice as a mass noun, used regardless of the actual number of the things, because, unlike rice or corn, we do frequently care about the number of dice being used. Instead of a true mass noun, it’s become an uninflected count noun—one dice, two dice—for many people, though it exists alongside the original singular die. But singular dice is rare in print, because we’re told that it’s properly one die, two dice, even though some dictionaries note that singular dice is much more frequent in gaming than die.

So where does that leave us? You can go with singular die and possibly be thought of as something of a pedant, or you can go with singular dice and possibly be thought of as a little ignorant. As for me, I usually use singular die and feel twinges of self-loathing when I do so; I haven’t had the heart to correct my boys when they use singular dice.

*For more on the reconstruction of the plural ending in English, see the section on the English plural suffix in the chapter “Reconstruction” in Language History: An Introduction, by Andrew L. Sihler (Philadelphia: John Benjamins, 2000).

By

However

Several weeks ago, Bob Scopatz asked in a comment about the word however, specifically whether it should be preceded by a comma or a semicolon when it’s used between two clauses. He says that a comma always seems fine to him, but apparently this causes people to look askance at him.

The rule here is pretty straightforward, and Purdue’s Online Writing Lab has a nice explanation. Independent clauses joined by coordinating conjunctions are separated by a comma; independent clauses that are not joined by coordinating conjunctions or are joined by what OWL calls “conjunctive adverbs” require a semicolon.

I’ve also seen the terms “transitional adverb” and “transitional phrase,” though the latter usually refers to multiword constructions like as a result, for example, and so on. These terms are probably more accurate since (I believe) words and phrases like however are not, strictly speaking, conjunctions. Though they do show a relationship between two clauses, that relationship is more semantic or rhetorical than grammatical.

Since however falls into this group, it should be preceded by a semicolon, though it can also start a new sentence. Grammar-Monster.com has some nice illustrative examples:

I am leaving on Tuesday, however, I will be back on Wednesday to collect my wages.
I am leaving on Tuesday; however, I will be back on Wednesday to collect my wages.
I am leaving on Tuesday. However, I will be back on Wednesday to collect my wages.

The first example is incorrect, while the latter two are correct. Note that “however” is also followed by a comma. (But would also work here, though in that case it would be preceded by a comma and not followed by one.)

Bob also mentioned that he sometimes starts a sentence with “however,” and this usage is a little more controversial. Strunk & White and others forbade however in sentence- or clause-initial position, sometimes with the argument that in this position it can only mean “in whatever way” or “to whatever extent.”

It’s true that however is sometimes used this way, as in “However it is defined, the middle class is standing on shaky ground,” to borrow an example from COCA. But this is clearly different from the Grammar-Monster sentences above. In those, the punctuation—namely the comma after “however”—indicates that this is not the “in whatever way” however, but rather the “on the contrary” or “in spite of that” one.

Some editors fastidiously move sentence-initial “howevers” to a position later in the sentence, as in I will be back on Wednesday, however, to collect my wages. As long as it’s punctuated correctly, it’s fine in either location, so there’s no need to move it. But note that when it occurs in the middle of a clause, it’s surrounded by commas.

It’s possible that sentence-initial however could be ambiguous without the following comma, but even then the confusion is likely to be momentary. I don’t see this as a compelling reason to avoid sentence-initial however, though I do believe it’s important to punctuate it properly, with a preceding semicolon or period and a following comma, to avoid tripping up the reader.

In a nutshell, however is an adverb, not a true conjunction, so it can’t join two independent clauses with just a comma. You can either join those clauses with a semicolon or separate them with a period. But either way, however should be set off by commas. When it’s in the middle of a clause, the commas go on both sides; when it’s at the beginning of a clause, it just needs a following comma. Hopefully this will help Bob (and others) stop getting those funny looks.
