Arrant Pedantry


Lynne Truss and Chicken Little

Lynne Truss, author of the bestselling Eats, Shoots & Leaves: The Zero Tolerance Approach to Punctuation, is at it again, crying with her characteristic hyperbole and lack of perspective that the linguistic sky is falling because she got a minor bump on the head.

As usual, Truss hides behind the it’s-just-a-joke-but-no-seriously defense. She starts by saying that she has “an especially trivial linguistic point to make” but then claims that the English language is doomed and that it’s all linguists’ fault. According to Truss, linguists have sat back and watched while literacy levels have declined—and have profited from doing so.

What exactly is the problem this time? That some people mistakenly write some phrases as compound words when they’re not, such as maybe for may be or anyday for any day. (This isn’t even entirely true; anyday is almost nonexistent in print, even in American English, according to Google Ngram Viewer.) I guess from anyday it’s a short, slippery slope to complete language chaos, and then “we might as well all go off and kill ourselves.”
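Claims like this are easy to check yourself. Below is a minimal sketch that queries the Google Ngram Viewer’s JSON endpoint; note that this endpoint is unofficial and undocumented, so the URL, parameter names, corpus label, and response fields here are assumptions based on the viewer’s observed behavior and may break without notice.

```python
# Compare the relative print frequency of "anyday" and "any day" in
# American English via the Ngram Viewer's (unofficial) JSON endpoint.
import requests

resp = requests.get(
    "https://books.google.com/ngrams/json",
    params={
        "content": "anyday,any day",
        "year_start": 1900,
        "year_end": 2019,
        "corpus": "en-US-2019",  # assumed label for the American English corpus
        "smoothing": 3,
    },
    timeout=30,
)
resp.raise_for_status()

for series in resp.json():
    # "timeseries" holds one relative frequency per year in the range.
    print(series["ngram"], "peak frequency:", max(series["timeseries"]))
```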

But it’s not clear what her complaint about erroneous compound words has to do with literacy levels. If the only problem with literacy is that some people write maybe when they mean may be, then it seems to be, as she originally says, an especially trivial point. Yes, some people deviate from standard orthography. While this may be irritating and may occasionally cause confusion, it’s not really an indication that people don’t know how to read or write. Even educated people make mistakes, and this has always been the case. It’s not a sign of impending doom.

But let’s consider the analogies she chose to illustrate linguists’ supposed negligence. She says that we’re like epidemiologists who simply catalog all the ways in which people die from diseases or like architects who make notes while buildings collapse. (Interestingly, she makes two remarks about how well paid linguists are. Of course, professors don’t actually make that much, especially those in the humanities or social sciences. And it smacks of hypocrisy from someone whose book has sold 3 million copies.)

Perhaps there is a minor crisis in literacy, at least in the UK. This article says that 16–24-year-olds in the UK are lagging behind many counterparts in other first-world countries. (The headline suggests that they’re trailing the entire world, but the study only looked at select countries from Europe and east Asia.) Wikipedia, however, says that the UK has a 99 percent literacy rate. Maybe young people are slipping a bit, and this is certainly something that educators should address, but it doesn’t appear that countless people are dying from an epidemic of slightly declining literacy rates or that our linguistic structures are collapsing. This is simply not the linguistic apocalypse that Truss makes it out to be.

Anyway, even if it were, why would it be linguists’ job to do something about it? Literacy is taught in primary and secondary school and is usually the responsibility of reading, language arts, or English teachers—not linguists. Why not criticize English professors for sitting back and collecting fat paychecks for writing about literary theory while our kids struggle to read? Because they’re not her ideological enemy, that’s why. Linguists often oppose language pedants like Truss, and so Truss finds some reason—contrived though it may be—to blame them. Though some applied linguists do in fact study things like language acquisition and literacy, most linguists hew to the more abstract and theoretical side of language—syntax, morphology, phonology, and so on. Blaming descriptive linguists for children’s illiteracy is like blaming physicists for children’s inability to ride bikes.

And maybe the real reason why linguists are unconcerned about the upcoming linguistic apocalypse is that there simply isn’t one. Maybe linguists are like meteorologists who observe that, contrary to the claims of some individuals, the sky is not actually falling. In studying the structure of other languages and the ways in which languages change, linguists have realized that language change is not decay. Consider the opening lines from Beowulf, an Old English epic poem over a thousand years old:

HWÆT, WE GAR-DEna in geardagum,
þeodcyninga þrym gefrunon,
hu ða æþelingas ellen fremedon!

Only two words are instantly recognizable to modern English speakers: we and in. The changes from Old English to modern English haven’t made the language better or worse—just different. Some people maintain that they understand that language changes but say that they still oppose certain changes that seem to come from ignorance or laziness. They fear that if we’re not vigilant in opposing such changes, we’ll lose our ability to communicate. But the truth is that most of those changes from Old English to modern English also came from ignorance or laziness, and we seem to communicate just fine today.

Languages can change very radically over time, but contrary to popular belief, they never devolve into caveman grunting. This is because we all have an interest in both understanding and being understood, and we’re flexible enough to adapt to changes that happen within our lifetime. And with language, as opposed to morality or ethics, there is no inherent right or wrong. Correct language is, in a nutshell, what its users consider to be correct for a given time, place, and audience. One generation’s ignorant change is sometimes the next generation’s proper grammar.

It’s no surprise that Truss fundamentally misunderstands what linguists and lexicographers do. She even admits that she was “seriously unqualified” for linguistic debate a few years back, and it seems that nothing has changed. But that probably won’t stop her from continuing to prophesy the imminent destruction of the English language. Maybe Truss is less like Chicken Little and more like the boy who cried wolf, proclaiming disaster not because she actually sees one coming, but rather because she likes the attention.


The Pronunciation of Smaug

With the recent release of the new Hobbit movie, The Desolation of Smaug, a lot of people have been talking about the pronunciation of the titular dragon’s name. The inclination for English speakers is to pronounce it like smog, but Tolkien made clear in his appendixes to The Lord of the Rings that the combination au was pronounced /au/ (“ow”), as it is in German. A quick search on Twitter shows that a lot of people are perplexed or annoyed by the pronunciation, with some even declaring that they refuse to see the movie because of it. Movie critic Eric D. Snider joked, “I’m calling him ‘Smeowg’ now. Someone please Photoshop him to reflect the change, thanks.” I happily obliged.

[Image: Smaug photoshopped as “Smeowg”, captioned “I can haz desolashun?”]

So what is it about the pronunciation of Smaug that makes people so crazy? Simply put, it doesn’t fit modern English phonology. Phonology is the pattern of sounds in language (or the study of those patterns), including things like syllable structure, word stress, and permissible sound combinations. In my undergraduate phonology class, my professor once gave us an exercise: think of all the consonants that can follow /au/, and give an example of each. The first several came easily, but we started to run out quickly: out, house (both as a noun with /s/ and as a verb with /z/), owl, mouth (both as a noun with /θ/ and as a verb with /ð/), down, couch, hour, and gouge. What these sounds all have in common is that they’re coronal consonants, or those made with the front of the tongue.

The coronal consonants in modern Standard English are /d/, /t/, /s/, /z/, /ʃ/ (as in shoe), /ʒ/ (as in measure), /tʃ/ (as in church), /dʒ/ (as in judge), /l/, /r/, and /n/. As far as I know, only two coronal consonants are missing from the list of consonants that can follow /au/—/ʃ/ and /ʒ/, the voiceless and voiced postalveolar fricatives. By contrast, /g/ is a dorsal consonant, pronounced with the back of the tongue. There are some nonstandard dialects (such as Cockney and African American English) that change /θ/ to /f/ and thus pronounce words like mouth as /mauf/, but in Standard English the pattern holds; there are no words with /aup/ or /aum/ or /auk/. (The only exception I know of, howf, is a rare Scottish word that was apparently borrowed from Dutch, and it could be argued that it appears rarely enough in Standard English that it shouldn’t be considered a part of it. It appears not at all in the Corpus of Contemporary American English and only once in the Corpus of Historical American English, but it’s in scare quotes. I only know it as an occasionally handy Scrabble word.)
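My professor’s exercise is also easy to automate. Here is a minimal sketch that scans the CMU Pronouncing Dictionary via nltk (assuming nltk is installed and the cmudict data has been downloaded) and tallies every phoneme that follows the /au/ diphthong, which ARPAbet transcribes as AW:

```python
# Tally the phonemes that follow /au/ (ARPAbet "AW") in the CMU
# Pronouncing Dictionary. Assumes: pip install nltk, then
# nltk.download('cmudict').
from collections import Counter

import nltk

prons = nltk.corpus.cmudict.dict()
after_au = Counter()

for word, pronunciations in prons.items():
    for phones in pronunciations:
        for i, phone in enumerate(phones[:-1]):
            # Vowels carry stress digits (AW0, AW1, AW2), so match the prefix.
            if phone.startswith("AW"):
                after_au[phones[i + 1]] += 1

for phone, count in after_au.most_common():
    print(phone, count)
```

If the pattern described above holds, coronals should dominate the tally; any stragglers are likely loanwords or proper names, of which CMUdict has plenty.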

And this isn’t simply a case like orange or silver, where nothing happens to rhyme with them. Through the accidents of history, the /aug/ combination simply does not occur in modern English. Before the Great Vowel Shift, Middle English /au/ turned into /ɔ:/ (as in caught today). (Note: the : symbol here denotes that a vowel is long.) During the Great Vowel Shift, /u:/ turned into a new /au/, but apparently this /u:/ never occurred before non-coronal consonants. This means that in Middle English, either /u/ lengthened before coronals or /u:/ shortened before non-coronals; I’m not sure which. But either way, it left us with the unusual pattern we see in English today.

What all this technical gibberish means is that, in the absence of a clear pronunciation guide, readers will assume that the “au” in Smaug is pronounced as it is in other English words, which today is almost always /ɔ:/ or /ɑ:/. Thus most Americans will rhyme it with smog. (I can’t speak with authority about other varieties of English, but their speakers would probably opt for one of those vowels or something similar, not the diphthong /au/.) It’s not surprising that many readers will feel annoyed when told that their pronunciation clashes with the official pronunciation, which they find unintuitive and, frankly, rather non-English.

One final note: Michael Martinez suggests in this post that /smaug/ is not actually Tolkien’s intended pronunciation. After all, he says, the appendixes are a guide to the pronunciation of Elvish, and Smaug’s name is not Elvish. Martinez quotes one of Tolkien’s letters regarding the origin of the name: “The dragon bears as name—a pseudonym—the past tense of the primitive Germanic verb Smugan, to squeeze through a hole: a low philological jest.” He seems to take this as evidence against the pronunciation /smaug/, but this is probably because Tolkien was not as clear as he could have been. Smugan is the infinitive form; the past tense is—surprise—smaug.

Note: the definition given for the Proto-Germanic form doesn’t quite match Tolkien’s, though it appears to be the same verb; the Old English form, also with the infinitive smugan, is defined as “to creep, crawl, move gradually”. The astute student of language will notice that the past tense of the verb in Old English had the form smēag in the first and third person. This is because the Proto-Germanic /au/ became /ēa/ in Old English and /i:/ or /ai/ in modern English; compare the German Auge ‘eye’ and the English eye. This demonstrates once again that English lost the combination /aug/ quite some time ago while its sister languages hung on to it.

So yes, it appears that Tolkien really did intend Smaug to be pronounced /smaug/, with that very un-English (but very Germanic) /aug/ combination at the end. He was a linguist and studied several languages in depth, particularly old Germanic languages such as Old English, Old Norse, and Gothic. He was certainly well aware of the pronunciation of the word, even if he didn’t make it clear to his readers. You can find the pronunciation silly if you want, you can hate it, and you can even threaten to boycott the movie, but you can’t call it wrong.


Now on Visual Thesaurus: “Electrocution: A Shocking Misuse?”

I have a new post up on Visual Thesaurus about the use, misuse, and history of the word electrocute. Some usage commentators today insist that it be used only to refer to death by electric shock; that is, you can’t say you’ve been electrocuted if you lived to tell the tale. But the history, unsurprisingly, is more complicated: there have been disputes about the word since its birth.

As always, the article is for subscribers only, but a subscription costs a paltry $2.95 a month or $19.95 a year (and would make a great gift for the word lover in your life). Check it out.


Yes, Irregardless Is a Word

My last post, “12 Mistakes Nearly Everyone Who Writes about Grammar Mistakes Makes”, drew a lot of comments, some supportive and some critical. But no point drew as much ire as my claim that irregardless is a word. Some stated flatly, “Irregardless is not a word.” One ignorantly demanded, “Show me a dictionary that actually contains that word.” (I could show him several.) Still others argued that it was a double negative, that it was logically and morphologically ill-formed and thus had no meaning. One commenter said that “with the negating preface [prefix] ‘ir-’ and the negating suffix ‘-less’, it is a double negative” and that “it is not a synonym with ‘regardless’.” Another was even cleverer, saying, “The prefix ir-, meaning not, changes the meaning of the word regardless, so not only is it not a standard word, but it’s also misused in nearly all cases.” But these arguments still miss the point: irregardless is indeed a word, and it means the same thing as regardless.

In my last post I argued that there’s a clear difference between a word like irregardless and a nonword like flirgle. By any objective criterion, irregardless is a word. It has an established form and meaning, it’s used in speech and occasionally in writing, and it’s even found in reputable dictionaries, including Merriam-Webster’s Collegiate Dictionary and The Oxford English Dictionary (though it is, quite appropriately, labeled nonstandard). We can identify its part of speech (it’s an adverb) and describe how it’s used. By contrast, though, consider flirgle. You don’t know what its part of speech is or how to use it, and if I were to use it in a sentence, you wouldn’t know what it meant. This is because it’s just something I made up by stringing some sounds together. But when someone uses irregardless, you know exactly what it means, even if you want to pretend otherwise.

This is because words get their wordhood not from etymology or logic or some cultural institution granting them official status, but by convention. It doesn’t matter that nice originally meant “ignorant” or that contact was originally only a noun or that television is formed from a blend of Greek and Latin roots; what matters is how people use these words now. This makes some people uncomfortable because it sounds like anarchy, but it’s more like the ultimate democracy or free market. We all want to understand one another and be understood, so it’s in our mutual interest to communicate in ways that are understandable. Language is a self-regulating system guided by the invisible hand of its users’ desire to communicate—not that this stops people from feeling the need for overt regulation.

One commenter, the same who said, “Irregardless is not a word,” noted rather aptly, “There is absolutely no value to ‘irregardless’ except to recognize people who didn’t study.” Exactly. There is nothing wrong with its ability to communicate; it’s only the word’s metacommunication—that is, what it communicates about its user—that is problematic. To put it a different way, the problem with irregardless is entirely social: if you use it, you’ll be thought of as uneducated, even though everyone can understand you just fine.

On Google Plus, my friend Rivka said, “Accepting it as a word is the first part of the slippery slope.” This seems like a valid fear, but I believe it is misplaced. First of all, we need to be clear about what it means to accept irregardless as a word. I accept that it’s a word, but this does not mean that I find the word acceptable. I can accept that people do all kinds of things that I don’t like. But the real problem isn’t what we mean by accept; it’s what we mean by word. When people say that something isn’t a word, they aren’t really making a testable claim about the objective linguistic status of the word; they’re making a sociolinguistic evaluation of the word. They may say that it’s not a word, but they really mean that it’s a word that’s not allowed in Standard English. This is because we think of Standard English as the only legitimate form of English. We think that the standard has words and grammar, while nonstandard dialects have nonwords and broken grammar, or no grammar at all. Yes, it’s important to recognize and teach the difference between Standard English and nonstandard forms, but it’s also important to be clear about the difference between facts about the language and our feelings about the language.

But the irregardless-haters can also take heart: the word has been around for at least a century now, and although many other new words have been coined and become part of Standard English in that time, irregardless shows no signs of moving towards acceptability. Most people who write for publication are well aware of the stigma attached to it, and even if they aren’t, few copyeditors are willing to let it into print. It’s telling that of the Oxford English Dictionary’s eight citations of the word, two merely cite the word in other dictionaries, three more are mentions or citations in linguistics or literary journals, and one more appears to be using the word ironically. We talk about the word irregardless—mostly just to complain about it—far more than we actually use it.

So yes, irregardless is a word, even though it’s nonstandard. You don’t have to like it, and you certainly don’t have to use it, but you also don’t have to worry about it becoming acceptable anytime soon.

This post also appears on Huffington Post.


15 Percent Off Shirts

Today through November 24th, you can get 15 percent off all orders at the Arrant Pedantry Store when you use the coupon code WITHLOVE at checkout. It’s a good chance to get the word nerd in your life (or yourself) a little something for Christmas.


12 Mistakes Nearly Everyone Who Writes About Grammar Mistakes Makes

There are a lot of bad grammar posts in the world. These days, anyone with a blog and a bunch of pet peeves can crank out a click-bait listicle of supposed grammar errors. There’s just one problem—these articles are often full of mistakes of one sort or another themselves. Once you’ve read a few, you start noticing some patterns. Inspired by a recent post titled “Grammar Police: Twelve Mistakes Nearly Everyone Makes”, I decided to make a list of my own.

1. Confusing grammar with spelling, punctuation, and usage. Many people who write about grammar seem to think that grammar means “any sort of rule of language, especially writing”. But strictly speaking, grammar refers to the structural rules of language, namely morphology (basically the way words are formed from roots and affixes), phonology (the system of sounds in a language), and syntax (the way phrases and clauses are formed from words). Most complaints about grammar are really about punctuation, spelling (such as problems with you’re/your and other homophone confusion), or usage (which is often about semantics). This post, for instance, spends two of its twelve points on commas and a third on quotation marks.

2. Treating style choices as rules. This article says that you should always use an Oxford (or serial) comma (the comma before and or or in a list) and that quotation marks should always follow commas and periods, but the latter is true only in most American styles (linguists often put the commas and periods outside quotes, and so do many non-American styles), and the former is only true of some American styles. I may prefer serial commas, but I’m not going to insist that everyone who doesn’t use them is making a mistake. It’s simply a matter of style, and style varies from one publisher to the next.

3. Ignoring register. There’s a time and a place for following the rules, but the writers of these lists typically treat English as though it had only one register: formal writing. They ignore the fact that following the rules in the wrong setting often sounds stuffy and stilted. Formal written English is not the only legitimate form of the language, and the rules of formal written English don’t apply in all situations. Sure, it’s useful to know when to use who and whom, but it’s probably more useful to know that saying To whom did you give the book? in casual conversation will make you sound like a pompous twit.

4. Saying that a disliked word isn’t a word. You may hate irregardless (I do), but that doesn’t mean it’s not a word. If it has its own meaning and you can use it in a sentence, guess what—it’s a word. Flirgle, on the other hand, is not a word—it’s just a bunch of sounds that I strung together in word-like fashion. Irregardless and its ilk may not be appropriate for use in formal registers, and you certainly don’t have to like them, but as Stan Carey says, “‘Not a word’ is not an argument.”

5. Turning proposals into ironclad laws. This one happens more often than you think. A great many rules of grammar and usage started life as proposals that became codified as inviolable laws over the years. The popular that/which rule, which I’ve discussed at length before, began as a proposal—not “everyone gets this wrong” but “wouldn’t it be nice if we made a distinction here?” But nowadays people have forgotten that a century or so ago, this rule simply didn’t exist, and they say things like “This is one of the most common mistakes out there, and understandably so.” (Actually, no, you don’t understand why everyone gets this “wrong”, because you don’t realize that this rule is a relatively recent invention by usage commentators that some copy editors and others have decided to enforce.) It’s easy to criticize people for not following rules that you’ve made up.

6. Failing to discuss exceptions to rules. Invented usage rules often ignore the complexities of actual usage. Lists of rules such as these go a step further and often ignore the complexities of those rules. For example, even if you follow the that/which rule, you need to know that you can’t use that after a preposition or after the demonstrative pronoun that—you have to use a restrictive which. Likewise, the less/fewer rule is usually reduced to statements like “use fewer for things you can count”, which leads to ugly and unidiomatic constructions like “one fewer thing to worry about”. Affect and effect aren’t as simple as some people make them out to be, either; affect is usually a verb and effect a noun, but affect can also be a noun (with stress on the first syllable) referring to the outward manifestation of emotions, while effect can be a verb meaning to cause or to make happen. Sometimes dumbing down rules just makes them dumb.

7. Overestimating the frequency of errors. The writer of this list says that misuse of nauseous is “Undoubtedly the most common mistake I encounter.” This claim seems worth doubting to me; I can’t remember the last time I heard someone say “nauseous”. Even if you consider it a misuse, it’s got to rate pretty far down the list in terms of frequency. This is why linguists like to rely on data for testable claims—because people tend to fall prey to all kinds of cognitive biases such as the frequency illusion. (For a quick sketch of how such a claim can be checked against a corpus, see the code after this list.)

8. Believing that etymology is destiny. Words change meaning all the time—it’s just a natural and inevitable part of language. But some people get fixated on the original meanings of some words and believe that those are the only correct meanings. For example, they’ll say that you can only use decimate to mean “to destroy one in ten”. This may seem like a reasonable argument, but it quickly becomes untenable when you realize that almost every single word in the language has changed meaning at some point, and that’s just in the few thousand years in which language has been written or can be reconstructed. And sometimes a new meaning is more useful anyway (which is precisely why it displaced an old meaning). As Jan Freeman said, “We don’t especially need a term that means ‘kill one in 10.’”

9. Simply bungling the rules. If you’re going to chastise people for not following the rules, you should know those rules yourself and be able to explain them clearly. You may dislike singular they, for instance, but you should know that it’s not a case of subject-predicate disagreement, as the author of this list claims—it’s an issue of pronoun-antecedent agreement, which is not the same thing. This list says that “‘less’ is reserved for hypothetical quantities”, but this isn’t true either; it’s reserved for noncount nouns, singular count nouns, and plural count nouns that aren’t generally thought of as discrete entities. Use of less has nothing to do with being hypothetical. And this one says that punctuation always goes inside quotation marks. In most American styles, it’s only commas and periods that always go inside. Colons, semicolons, and dashes always go outside, and question marks and exclamation marks only go inside sometimes.

10. Saying that good grammar leads to good communication. Contrary to popular belief, bad grammar (even using the broad definition that includes usage, spelling, and punctuation) is not usually an impediment to communication. A sentence like Ain’t nobody got time for that is quite intelligible, even though it violates several rules of Standard English. The grammar and usage of nonstandard varieties of English are often radically different from Standard English, but different does not mean worse or less able to communicate. The biggest differences between Standard English and all its nonstandard varieties are that the former has been codified and that it is used in all registers, from casual conversation to formal writing. Many of the rules that these lists propagate are really more about signaling to the grammatical elite that you’re one of them—not that this is a bad thing, of course, but let’s not mistake it for something it’s not. In fact, claims about improving communication are often just a cover for the real purpose of these lists, which is . . .

11. Using grammar to put people down. This post sympathizes with someone who worries about being crucified by the grammar police and then says a few paragraphs later, “All hail the grammar police!” In other words, we like being able to crucify those who make mistakes. Then there are the put-downs about people’s education (“You’d think everyone learned this rule in fourth grade”) and more outright insults (“5 Grammar Mistakes that Make You Sound Like a Chimp”). After all, what’s the point in signaling that you’re one of the grammatical elite if you can’t take a few potshots at the ignorant masses?

12. Forgetting that correct usage ultimately comes from users. The disdain for the usage of common people is symptomatic of a larger problem: forgetting that correct usage ultimately comes from the people, not from editors, English teachers, or usage commentators. You’re certainly entitled to have your opinion about usage, but at some point you have to recognize that trying to fight the masses on a particular point of usage (especially if it’s a made-up rule) is like trying to fight the rising tide. Those who have invested in learning the rules naturally feel defensive of them and of the language in general, but you have no more right to the language than anyone else. You can be restrictive if you want and say that Standard English is based on the formal usage of educated writers, but any standard that is based on a set of rules that are simply invented and passed down is ultimately untenable.

And a bonus mistake:

13. Making mistakes themselves. It happens to the best of us. The act of making grammar or spelling mistakes in the course of pointing out someone else’s mistakes even has a name, Muphry’s law. This post probably has its fair share of typos. (If you spot one, feel free to point it out—politely!—in the comments.)
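As promised in point 7, here is a minimal sketch of what relying on data can look like: counting a word’s actual frequency in a corpus rather than trusting impressions. It uses nltk’s Brown corpus, which is small (about a million words), so treat the counts as merely illustrative; a larger corpus like COCA would give far better estimates. The setup (pip install nltk, then nltk.download('brown')) is an assumption about your environment.

```python
# Ground a frequency claim in corpus data instead of impressions:
# count how often each word actually occurs in the Brown corpus.
from collections import Counter

from nltk.corpus import brown

counts = Counter(word.lower() for word in brown.words())

for item in ("nauseous", "nauseated", "nauseating"):
    # Raw counts can't settle what counts as "misuse," but they do show
    # how (in)frequent the word is in edited American prose.
    print(item, counts[item])
```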

This post also appears on Huffington Post.


Book Review: Shady Characters

I recently received a review copy of Keith Houston’s new book, Shady Characters: The Secret Life of Punctuation, Symbols, and Other Typographical Marks, based on his excellent blog of the same name. The first delightful surprise I found inside is that, in a tribute to medieval manuscripts and early printed books, the book is rubricated—the drop caps, special characters, figure numbers, and dingbats are printed in red. It’s a fitting design choice for a book that takes its readers through the history of the written word.

Each chapter covers a different punctuation mark or typographical symbol, starting with the pilcrow (also known as the paragraph mark, ¶). The first chapter ranges through the beginnings of Greek and Roman writing, the spread of Christianity in Europe, monastic manuscript copying, and the rise of modern typography. Partway through the chapter, I started to wonder where on earth it was all going, but as all the pieces came together, I realized what a treat I was in for. Houston has a knack for turning otherwise dry historical facts into a compelling narrative, picking out the thread of each character’s story while following it down all kinds of scenic side roads and intriguing back alleys.

The rest of the book follows much the same pattern, with trips through the birth of textual criticism in the Great Library of Alexandria, the lasting influence of Roman weights and measures, the invention of the printing press and the birth of typography, the invention of the novel, the standardization of keyboards and telephone keypads, and the beginnings of the internet. And in each chapter, Houston pulls together seemingly unrelated threads of history into a fascinating story of the origin of a familiar typographical or punctuation mark. As an editor and typesetter, I particularly appreciated his lucid treatment of the functions and appearances of the various kinds of hyphens and dashes, including the hated all-purpose hyphen-minus.

Through it all, Houston manages to muster an impressive amount of research (the endnotes take up nearly seventy pages) while keeping the text interesting and accessible. The only part where I got bogged down at all was the chapter on sarcasm and irony, which, unlike the other chapters, focuses on a set of marks that didn’t succeed. It covers various proposals over the years to create a mark or text style to indicate irony or sarcasm. But since it’s an account of failed punctuation marks, there’s an unavoidable sameness to each story—someone proposes a new punctuation mark, it fails to get off the ground, and it’s relegated to the dustbin of history. This isn’t to say that the stories aren’t interesting, just that I found them less compelling than the stories of the punctuation marks that survived.

One other problem is that some of the images are hard to read. I sometimes found it hard to pick out the character I was supposed to see in a faded and tattered Greek papyrus. Increasing the contrast or highlighting the character in question would have been helpful.

Those quibbles aside, it’s a delightful book, full of little gems like this: “In Gutenberg’s day the first rule of Hyphenation Club was that there are no rules.” (Gutenberg’s famous forty-two-line Bible features stacks of up to eight end-of-line hyphens, which would make modern typesetters and proofreaders hyperventilate.) Throughout the book, Houston successfully weaves together history, technology, and design in telling the stories of characters that we’ve seen countless times without giving a second thought to. I highly recommend it to all lovers of typography and the written word.

Disclosure: I received a review copy of Shady Characters from W. W. Norton.


Free Shipping Again

Once again I apologize for not posting anything new lately. I had a crazy summer of freelancing, job hunting, moving, and starting a new job, so I just haven’t had time to write recently. I hope to have something soon. But in the meanwhile, you can enjoy free shipping from the Arrant Pedantry Store when you buy two or more items and use the coupon code FALL2013. The code is good until September 17th.

If you haven’t checked out my store in a while, please take a look. You may have missed some of the newer designs like IPA for the Win and Stet Wars: The Editor Strikes Back. And of course, there are always perennial classics like Word Nerd and Battlestar Grammatica.


Solstices, Vegetables, and Official Definitions

Summer officially began just a few days ago—at least that’s what the calendar says. June 20 was the summer solstice, the day when the northern hemisphere is most inclined towards the sun and consequently receives the most daylight. By this definition, summer lasts until the autumnal equinox, in late September, when days and nights are of equal length. But by other definitions, summer starts at the beginning of June and goes through August. Other less formal definitions may put the start of summer on Memorial Day or after the end of the school year (which for my children were the same this year).

For years I wondered why summer officially began so late into June. After all, shouldn’t the solstice, as the day when we receive the most sunlight, be the middle of summer rather than the start? But even though it receives the most sunlight, it’s not the hottest, thanks to something called seasonal lag. The oceans absorb a large amount of heat and continue to release that heat for quite some time after the solstice, so the hottest day may come a month or more after the day that receives the most solar energy. Summer officially starts later than it should to compensate for this lag.

But what does this have to do with language? It’s all about definitions, and definitions are arbitrary things. Laypeople may think of June 1 as the start of summer, but June 1 is a day of absolutely no meteorological or astronomical significance. So someone decided that the solstice would be the official start of summer, even though the period from June 20/21 to September 22/23 doesn’t completely encompass the hottest days of the year (at least not in most of the United States).

Sometimes the clash between common and scientific definitions engenders endless debate. Take the well-known argument about whether tomatoes are fruit. By the common culinary definition, tomatoes are vegetables, because they are used mostly in savory or salty dishes. Botanically, though, they’re fruit, because they’re formed from a plant’s ovaries and contain seeds. But tomatoes aren’t the only culinary vegetables that are botanical fruits: cucumbers, squashes, peas, beans, avocados, eggplants, and many other things commonly thought of as vegetables are actually fruits.

The question of whether a tomato is a fruit or a vegetable may have entered popular mythology following an 1893 Supreme Court case, Nix v. Hedden, which answered the question of whether imported tomatoes should be taxed as vegetables. The Supreme Court ruled that the law was written with the common definition in mind, so tomatoes got taxed, and people are still arguing about it over a century later.

Sometimes these definitional clashes even lead to strong emotions. Consider how many people got upset when the International Astronomical Union decided that Pluto wasn’t really a planet. People who probably hadn’t thought about planetary astronomy since elementary school passionately proclaimed that Pluto was always their favorite planet. Even some astronomers declared, “Pluto’s dead.” But nothing actually happened to Pluto, just to our definition of planet. Astronomers had discovered several other Pluto-like objects and suspect that there may be a hundred or more such objects in the outer reaches of the solar system.

Does it really make sense to call all of these objects planets? Should we expect students to learn the names of Eris, Sedna, Quaoar, Orcus, and whatever other bodies are discovered and named? Or is it perhaps more reasonable to use some agreed-upon criteria and draw a clear line between planets and other objects? After all, that’s part of what scientists do: try to increase our understanding of the natural world by describing features of and discovering relationships among different things. Sometimes the definitions are arbitrary, but they’re arbitrary in ways that are useful to scientists.

And this is the crux of the matter: sometimes definitions that are useful to scientists aren’t that useful to laypeople, just as common definitions aren’t always useful to scientists. These definitions are used by different people for different purposes, and so they continue to exist side by side. Scientific definitions have their place, but they’re not automatically or inherently more correct than common definitions. And there’s nothing wrong with this. After all, tomatoes may be fruit, but I don’t want them in my fruit salad.


New Posts Elsewhere

I have a couple of new posts up elsewhere: a brief one at Copyediting discussing those dialect maps that are making the rounds and asking whether Americans really talk that differently from each other, and a longer one at Visual Thesaurus (subscription required) discussing the role of copy editors in driving restrictive relative which out of use. Stay tuned, and I’ll try to have something new up here in the next few days.