Arrant Pedantry

Book Review: What the F

Disclosure: I received a free advance review copy of this book from the publisher, Basic Books.

I was a little nervous when I was asked to review Benjamin K. Bergen’s new book, What the F: What Swearing Reveals About Our Language, Our Brains, and Ourselves. Unlike many of my linguist and editor friends, I’m not much of a swearer. I was raised in a fairly conservative religious household, and I can count the number of times I swore as a child on one hand with some fingers left over. Even now I swear pretty rarely. When someone asked me if I’d like to contribute to the group blog Strong Language (tagline: a sweary blog about swearing), I politely declined simply because I wouldn’t have much to add.

But even for someone with as clean a mouth as me, What the F is a fascinating read. Bergen starts by looking at the different realms swear words come from, like religion, sex, bodily effluvia, and disparaged groups. Most swear words across cultures probably fall into one of these categories, but different categories are weighted differently across cultures. For example, in French-speaking Quebec, some of the most offensive words are religious terms, even though most Quebecois nowadays are not very religious. Japanese, on the other hand, is said to lack dedicated swear words, but it still has ways to express the same ideas.

Bergen then dives into what makes a swear word a swear word, exploring concepts like sound symbolism to see whether there’s something innately sweary about certain words. In English, at least, there are some strong tendencies—our swear words tend to be monosyllabic and end with a consonant, especially consonants lower on the sonority hierarchy, like stops, affricates, and fricatives. That is, a word ending in k sounds swearier than a word ending in m. But this doesn’t necessarily hold across other languages, and it doesn’t offer a complete explanation for why English swear words are what they are. There are certainly other words that fit the pattern but aren’t swears. To a large extent it’s simply arbitrary.

Similarly, gestures like flipping the bird are largely arbitrary too, despite what appears to be some striking iconicity. But rude gestures vary widely, so that a gesture that seems harmless to Americans, like a thumbs-up or an A-OK, can be just as offensive as the bird in other countries. Even swearing in sign language isn’t as symbolic or iconic as you might think; signs for the f-word are quite different in American and British Sign Language, though the connection between signifier and signified is perhaps a little less arbitrary than in spoken language. Swear words are swear words because convention says they are. If you hear people use a certain word a certain way, you figure out pretty quickly what it means.

Some of the most fascinating parts of the book, though, come from what swearing tells us about how the brain works. Most students of linguistics probably know that some stroke victims can still swear fluently even if their other language abilities are severely impaired, which tells us that swearing uses different mental circuitry from regular language—swearing taps into much more primal neural hardware in the basal ganglia. On the flip side, Tourette’s syndrome, which involves dysfunction of the basal ganglia, can cause an overwhelming urge to swear. Some deaf people with Tourette’s feel the same urge, but the swearing comes out via their hands rather than their mouths. And the fact that the brain reacts to prevent us from accidentally saying swear words shows that we have a built-in censor monitoring our speech as it’s produced.

In a later chapter, Bergen debunks a paper by a team from where else but the School of Family Life at my alma mater, Brigham Young University, that purported to show that exposure to swearing actually harms children. Although there’s evidence that slurs can harm children, and verbal abuse in general can be harmful, there’s actually no evidence that exposure to swearing causes children harm. And Bergen ends with a thoughtful chapter titled “The Paradox of Profanity”, which argues that profanity gets much of its power from our attempts to suppress it. The less frequently we hear a swear word, the more shocking it is when we do hear it.

Throughout the book, Bergen maintains a nice balance between academic and approachable. The book is backed up by copious notes, but the writing is engaging and often funny, as when a footnote on the “various other manifestations” of the chicken gesture (“bent elbows moving up and down to depict chicken wings”) led to this Arrested Development clip.

Come for the swears; stay for a fascinating exploration of language and humanity.

What the F: What Swearing Reveals About Our Language, Our Brains, and Ourselves is available now at Amazon and other booksellers.

To Boldly Split Infinitives

Today is the fiftieth anniversary of the first airing of Star Trek, so I thought it was a good opportunity to talk about split infinitives. (So did Merriam-Webster, which beat me to the punch.) If you’re unfamiliar with split infinitives or have thankfully managed to forget what they are since your high school days, it’s when you put some sort of modifier between the to and the infinitive verb itself—that is, a verb that is not inflected for tense, like be or go—and for many years it was considered verboten.

Kirk’s opening monologue on the show famously featured the split infinitive “to boldly go”, and it’s hard to imagine the phrase working so well without it. “To go boldly” and “boldly to go” both sound terribly clunky, partly because they ruin the rhythm of the phrase. “To BOLDly GO” is a nice iambic dimeter, meaning that it has two metrical feet, each consisting of an unstressed syllable followed by a stressed syllable—duh-DUN duh-DUN. “BOLDly to GO” is a trochee followed by an iamb, meaning that we have a stressed syllable, two unstressed syllables, and then another stressed syllable—DUN-duh duh-DUN. “To GO BOLDly” is the reverse, an iamb followed by a trochee, leading to a stress clash in the middle where the two stresses butt up against each other and then ending on a weaker unstressed syllable. Blech.

But the root of the alleged problem with split infinitives concerns not meter but syntax. The question is where it’s syntactically permissible to put a modifier in a to-infinitive phrase. Normally, an adverb would go just in front of the verb it modifies, as in She boldly goes or He will boldly go. Things were a little different when the verb was an infinitive form preceded by to. In this case the adverb often went in front of the to, not in front of the verb itself.

As Merriam-Webster’s post notes, split infinitives date back at least to the fourteenth century, though they were not as common back then and were often used in different ways than they are today. But they mostly fell out of use in the sixteenth century and then roared back to life in the eighteenth century, only to be condemned by usage commentators in the nineteenth and twentieth centuries. (Incidentally, this illustrates a common pattern of prescriptivist complaints: a new usage arises, or perhaps it has existed for literally millennia; it goes unnoticed for decades or even centuries; someone finally notices it and decides they don’t like it, often because they don’t understand it; and suddenly everyone starts decrying this terrible new thing that’s ruining English.)

It’s not particularly clear, though, why people thought that this particular thing was ruining English. The older boldly to go was replaced by the resurgent to boldly go. It’s often claimed that people objected to split infinitives on the basis of analogy with Latin (Merriam-Webster’s post repeats this claim). In Latin, an infinitive is a single word, like ire, and it can’t be split. Ergo, since you can’t split infinitives in Latin, you shouldn’t be able to split them in English either. The problem with this theory is that there’s no evidence to support it. Here’s the earliest recorded criticism of the split infinitive, according to Wikipedia:

The practice of separating the prefix of the infinitive mode from the verb, by the intervention of an adverb, is not unfrequent among uneducated persons. . . . I am not conscious, that any rule has been heretofore given in relation to this point. . . . The practice, however, of not separating the particle from its verb, is so general and uniform among good authors, and the exceptions are so rare, that the rule which I am about to propose will, I believe, prove to be as accurate as most rules, and may be found beneficial to inexperienced writers. It is this :—The particle, TO, which comes before the verb in the infinitive mode, must not be separated from it by the intervention of an adverb or any other word or phrase; but the adverb should immediately precede the particle, or immediately follow the verb.

No mention of Latin or of the supposed unsplittability of infinitives. In fact, the only real argument is that uneducated people split infinitives, while good authors didn’t. Some modern usage commentators have used this purported Latin origin of the rule as the basis of a straw-man argument: Latin couldn’t split infinitives, but English isn’t Latin, so the rule isn’t valid. Unfortunately, Merriam-Webster’s post does the same thing:

The rule against splitting the infinitive comes, as do many of our more irrational rules, from a desire to more rigidly adhere (or, if you prefer, “to adhere more rigidly”) to the structure of Latin. As in Old English, Latin infinitives are written as single words: there are no split infinitives, because a single word is difficult to split. Some linguistic commenters have pointed out that English isn’t splitting its infinitives, since the word to is not actually a part of the infinitive, but merely an appurtenance of it.

The problem with this argument (aside from the fact that the rule wasn’t based on Latin) is that modern English infinitives—not just Old English infinitives—are only one word too and can’t be split either. The infinitive in to boldly go is just go, and go certainly can’t be split. So this line of argument misses the point: the question isn’t whether the infinitive verb, which is a single word, can be split in half, but whether an adverb can be placed between to and the verb. As Merriam-Webster’s Dictionary of English Usage notes, the term split infinitive is a misnomer, since it’s not really the infinitive but the construction containing an infinitive that’s being split.

But in recent years I’ve seen some people take this terminological argument even further, saying that split infinitives don’t even exist because English infinitives can’t be split. I think this is silly. Of course they exist. It used to be that people would say boldly to go; then they started saying to boldly go instead. It doesn’t matter what you call the phenomenon of moving the adverb so that it’s snug up against the verb—it’s still a phenomenon. As Arnold Zwicky likes to say, “Labels are not definitions.” Just because the name doesn’t accurately describe the phenomenon doesn’t mean it doesn’t exist. We could call this phenomenon Steve, and it wouldn’t change what it is.

At this point, the most noteworthy thing about the split infinitive is that there are still some people who think there’s something wrong with it. The original objection was that it was wrong because uneducated people used it and good writers didn’t, but that hasn’t been true in decades. Most usage commentators have long since given up their objections to it, and some even point out that avoiding a split infinitive can cause awkwardness or even ambiguity. In his book The Sense of Style, Steven Pinker gives the example The board voted immediately to approve the casino. Which word does immediately modify—voted or approve?

But this hasn’t stopped The Economist from maintaining its opposition to split infinitives. Its style guide says, “Happy the man who has never been told that it is wrong to split an infinitive: the ban is pointless. Unfortunately, to see it broken is so annoying to so many people that you should observe it.”

I call BS on this. Most usage commentators have moved on, and I suspect that most laypeople either don’t know or don’t care what a split infinitive is. I don’t think I know a single copy editor who’s bothered by them. If you’ve been worrying about splitting infinitives since your high school English teacher beat the fear of them into you, it’s time to let it go. If they’re good enough for Star Trek, they’re good enough for you too.

But just for fun, let’s do a little poll:

Do you find split infinitives annoying?


Book Review: The Subversive Copy Editor

Disclosure: I received a free copy of this book from the University of Chicago Press.

I have a terrible editor confession:1 until now, I had not read Carol Fisher Saller’s book The Subversive Copy Editor. I also have to take back what I said about But Can I Start a Sentence with “But”?—this is the best book on editing I’ve ever read.

The book, now in its second edition, has been revised and expanded with new chapters. In the introduction, Saller explains just what she means by “subversive”—rather than sneaking errors into print to sabotage the writer, she aims to subvert the stereotype of the editor locked in an eternal struggle with the writer or so bound by pointless rules that they can’t see the forest of the copy for the trees of supposed errors.

I find Saller’s views on editing absolutely refreshing. I’ve never been a fan of the idea that editors and authors are mortal enemies locked in an eternal struggle. Authors want to share their ideas, and readers, we hope, want to read them; editors help facilitate the exchange. Shouldn’t we all be on the same side?

Saller starts with a few important reminders—copy editors aren’t the boss, and the copy doesn’t belong to us—before diving into some practical advice on how to establish good author-editor relations. It all starts with an introductory phone call or email, which is the editor’s chance to establish their carefulness, transparency, and flexibility. If you show the author from the beginning that you’re on their side, the project should get off to a good start.

And to maintain good relations throughout a project, it’s important to keep showing that you’re careful, transparent, and flexible. Don’t bombard the author with too many queries about things that they don’t know or care about, like arbitrary points of style. Just make a decision, explain it succinctly if you feel the need, and move on. And don’t lecture or condescend in your queries either. Saller recommends reading through all of your queries again once you get to the end of a project, because sometimes you read a query you wrote days ago and realize you unintentionally come across as a bit of a jerk.

Too many editors mechanically apply a style without stopping to ask themselves whether they’re making the manuscript better or merely making it different. Sometimes a manuscript won’t perfectly conform to Chicago or whatever style you may be using, but that can be okay as long as it’s consistent and not wrong. (If you’re editing for an academic journal or other publication with a rigid style, of course, that’s a different story.) But there’s no reason to spend hours and hours changing an entire book manuscript from one arbitrary but valid style to another equally arbitrary but valid style. Not only have you wasted time and probably irritated the author, but there’s a good chance that you’ve missed something, introduced errors, or both. Rather than “What’s the rule?” Saller suggests asking, “What is helpful?” or “What makes sense?”

And Saller doesn’t have much patience for editors who get “hung up on phantom issues and personal bugaboos,” who feel compelled to “ferret out every last which and change it to that.”2 If you’re still relying on your high school English teacher’s lectures on grammar, you need to get with the times. Get some good (current!) reference books. Learn to look things up online.

I also appreciated the advice on how to manage difficult projects. When faced with a seemingly insurmountable task, Saller recommends a few simple steps: automate, delegate, reevaluate, and accept your fate. See if you can find a macro or other software tool to save you from having to grind through long, repetitive tasks. Delegate things to an intern if possible. (Sorry, interns!) Ask yourself whether you really need to do what you think needs to be done. And if all else fails, simply knuckle down and get through it.
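
Saller doesn’t prescribe any particular tool for the “automate” step, but to make it concrete, here’s a minimal sketch of the kind of script an editor might use in place of a word-processor macro. The task (flagging doubled words like “the the” for human review) and the file handling are my own illustrative assumptions, not anything from the book.

```python
# A small proofreading helper: flag doubled words ("the the", "and and")
# in a plain-text manuscript so an editor can review them by hand.
import re
import sys

DOUBLED_WORD = re.compile(r"\b(\w+)\s+\1\b", re.IGNORECASE)

def flag_doubled_words(path):
    with open(path, encoding="utf-8") as manuscript:
        for lineno, line in enumerate(manuscript, start=1):
            for match in DOUBLED_WORD.finditer(line):
                print(f"{path}:{lineno}: doubled word: {match.group(0)!r}")

if __name__ == "__main__":
    for path in sys.argv[1:]:
        flag_doubled_words(path)
```

Nothing gets changed automatically; the script only points, which keeps the editor, not the software, in charge of the fix.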

There’s also a chapter to help writers navigate the copyediting process, along with chapters on learning to use your word processor better, managing deadlines, working as a freelancer, and more. And throughout it all Saller provides sensible, practical advice. Some of my favorite bits come from a chapter called “The Zen of Copyediting,” which aims to help editors let go of the things that don’t really matter. When faced with an apathetic author, one of Saller’s colleagues tells herself, “You can’t care about the book more than the author.” Saller herself dares to suggest that “some of our ‘standards’ are just time-consuming habits that don’t really make a difference to the reader.” And finally, one of Saller’s former mentors liked to say, “Remember—it’s only a book.”

Whether you’re a seasoned editor or a novice just breaking into the field, The Subversive Copy Editor provides sage advice on just about every aspect of the job. It should be a part of every editor’s library.

The Subversive Copy Editor is available now at Amazon and other booksellers.

Notes

1. You can choose to read that either as a terrible confession for an editor or as the confession of a terrible editor.
2. I saw this happen once on a proofread. Remarkably, I don’t think the author used a single relative that in the entire book. The proofreader hunted down every last restrictive which and changed it to that—and missed a lot of real errors in the process. And changing that many whiches to thats surely would have wreaked havoc with the copyfitting.

Whoa There

Recently, the freelance writer and film critic Eric Snider tweeted this:

A few days later, a friend linked to this discussion thread on Goodreads started by sci-fi/fantasy author Lois McMaster Bujold. In it, Bujold asked readers to help her with what she called “distributed proofreading” (I’ll just note in passing that the idea of crowdsourcing your proofreading makes my skin crawl), and one reader helpfully pointed out that Bujold had misspelled “whoa” as “woah”. Bujold responded that whoa and woah mean different things, so it was not a misspelling: “‘Whoa!’ is a command meaning ‘Stop!’ ‘Woah!’ is an exclamation of astonishment, rendered phonetically. The original meaning stands.”

I’ve never read anything by Bujold, so I have no idea whether she was being a little tongue in cheek or whether she was simply mistaken. Woah doesn’t appear in Merriam-Webster’s Collegiate Dictionary, The American Heritage Dictionary, or The Random House Dictionary. According to all of these, there is just one word, whoa, that can be used as a command (often to a horse) to stop or as an exclamation of surprise.

The Oxford English Dictionary, of course, paints a more complicated picture. Whoa dates to the 1800s and is a variant of an earlier who (pronounced the same as whoa, not the same as the interrogative pronoun who), which dates to the 1400s. Who, in turn, is a variant of an earlier ho, which was borrowed from Old French in the 1300s. Some of the spellings recorded in the OED for these three related words are whoa, whow, whoo, whoe, hoo, and hoe. A search for woah leads to the entry woa, which is listed as a variant of whoa, with the forms woa and woah. Dinosaur Comics author Ryan North seems to prefer the hybrid form “whoah”.

But despite this variation, a search for whoa and woah in the Google Books Ngrams Viewer shows that whoa has been the overwhelmingly more popular form for at least the last two hundred years.

But a search in the BYU GloWbE corpus, which includes unedited material from the web, shows that whoa occurs at a rate of 2.02 per million in blogs and woah at a rate of 0.8 per million—not neck and neck, but much closer than we see in the edited material in Google Books. This means that an awful lot of people misspell whoa, and those misspellings are generally edited out of published writing. (Though Bujold’s books are apparently an exception; maybe she talked a copyeditor into letting her keep woah.)
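
If the per-million figures seem abstract, the arithmetic behind them is simple: divide a word’s raw count by the total number of words in the corpus and multiply by a million. Here’s a quick sketch; the counts and corpus size below are invented placeholders chosen only so that the output matches the rates quoted above, not the actual GloWbE figures.

```python
# Normalized frequency: occurrences per million words of corpus text.
# These numbers are made up for illustration; they are not GloWbE's real counts.
corpus_size = 500_000_000  # total words in a hypothetical blog subcorpus
counts = {"whoa": 1010, "woah": 400}

for word, count in counts.items():
    rate = count / corpus_size * 1_000_000
    print(f"{word}: {rate:.2f} per million words")
# whoa: 2.02 per million words
# woah: 0.80 per million words
```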

It seems obvious where people are getting the woah spelling: yeah is spelled very similarly, with a semivowel, two vowels, and a silent h. And if you’re like many Americans and don’t distinguish between wh and w—that is, you pronounce which and witch identically—then it’s not obvious where the h goes.

But as Eric Snider noted, many people don’t seem to know how to spell yeah either. The OED says that yeah is a casual pronunciation of yes that originated in the US around 1900. The entry for yeh says much the same thing: “colloq. or dial. var. of yes n.1 or yea v. ” The earliest citation dates to 1920. The entry for yah, interestingly, says that it’s a representation of German or Dutch speech (in both of these languages, the word for “yes” is ja), and the earliest citation dates to 1863. A citation from the London Daily News in 1905 reads, “America..has two substitutes for ‘yes.’ One of them is ‘yep’ and the other is ‘yah.’” I have to wonder if German and Dutch influenced the rise and spread of yeah in American English.

But regardless of its ultimate origin, yeah arose in speech, and so it’s no surprise that people came up with different ways to spell this new word. Still, most people have settled on yeah in edited writing, even though yah and ya are common in unedited writing. I even had a friend who used yeay, and I was never quite sure if this was supposed to be pronounced like yeah or yay or somewhere in between the two. (Interestingly, yay, which arose as a variant of yea, is not found in Merriam-Webster’s Collegiate, though it is in American Heritage and the OED. And surprisingly, it dates to only 1963.)

I don’t expect the situation to change anytime soon. The more unedited writing people read, the more forms like woah and yah will look normal. Editors may continue to correct them in published writing when we get the chance, but people will go on merrily spelling them any way they please.

Book Review: But Can I Start a Sentence with “But”?

Disclosure: I received a free copy of this book from the University of Chicago Press.

I have to admit that I was a little skeptical when I heard that the University of Chicago Press was putting out a collection of questions and answers from the popular Chicago Style Q&A. What’s the point of having it in book form when the online Q&A is freely available and easily searchable? And yet I have to admit that this charming little gift book is one of the best books on editing I’ve ever read.

If you’re not familiar with the Chicago Style Q&A, it’s a place where anyone can submit a question to the staff in the manuscript editing department at the University of Chicago Press. Selected questions and answers are then posted monthly. I don’t read the Q&A regularly, but when you search Chicago’s website, answers from the Q&A appear in the results. It’s a great repository of answers to questions that aren’t necessarily covered in the manual itself.

Because the book is simply a compilation of questions and answers, the organization is necessarily somewhat loose, though the book’s editors have grouped them into topics such as Possessives and Attributes, How Do You Cite . . . ?, and, one of my favorites, Things That Freak Us Out. If you’re not familiar with the Chicago Style Q&A, you may not know that the editors have developed a bit of a snarky voice. Maybe it’s a result of staring at pages and pages of text all day or of dealing with recalcitrant authors. Or maybe the editors have just been asked one too many times about something that could have been found in the manual if the person asking had just looked. Whatever the reason for it, it makes reading the answers a lot of fun.

For example, when someone asks if an abbreviation with periods should then be followed by another period if it appears at the end of the sentence, they respond, “Seriously, have you ever seen two periods in a row like that in print? If we told you to put two periods, would you do it? Would you set your hair on fire if CMOS said you should?” Or when someone asks innocently enough, “Can I use the first person?”, they answer, “Evidently.” And when someone asks why it’s so hard to find things in the manual, they write, “It must just be one of those things. If only there were a search box, or an index . . .” And when a US Marine threatens to deploy a detail of marines to invade Chicago’s offices and impose the outdated two-spaces-after-a-sentence rule, they reply, “As a US Marine, you’re probably an expert at something, but I’m afraid it’s not this.” The editors at Chicago clearly suffer no fools.

But in between the bits of dry wit and singeing snark are some truly thoughtful remarks on the craft of editing. For instance, when someone says that they don’t think it’s helpful to write out “graphics interchange format” in full the first time when referring to GIFs, the editors simply respond, “You never have to do anything that isn’t helpful. If a style guide says you do, you need a better style guide.” Or when someone asks if you always need commas after introductory phrases like “in the summer of 1812”, they answer, “Rejoice: everyone is correct. Higher authorities are not interested in legislating commas to this degree. Peace.”

Even at a thousand pages or more, The Chicago Manual of Style can’t provide answers to everything, nor should it. Editing that relies on a list of black-and-white edicts tends to be mechanical and to miss the forest of the text for the trees of commas and hyphens. If you want to be a good editor, you have to learn how to use your head. As the editors say, “Make your choice with a view to minimizing inconsistencies, and record them in your style sheet.” There’s not always one right answer. Sometimes you just have to pick one and stick with it.

But perhaps my favorite answer is the last one in the book:

Q. My library shelves are full. I need to make some difficult decisions to make space for new arrivals. Is there any reason to keep my CMOS 14th and 15th editions?
A. What a question. If you had more children, would you give away your firstborn? Find a board and build another shelf.

Here’s my bookcase of editing and language books at home. Obviously it’s time for me to build another shelf.

But Can I Start a Sentence with “But”? Advice from the Chicago Style Q&A is available now. You can buy it from Amazon or your favorite bookseller.

15% Off All T-Shirts

Now through June 21, get 15% off all T-shirts in the Arrant Pedantry Store. Just use the code TSHIRT16 at checkout. Come take a look!


Sorry, Merriam-Webster, but Hot Dogs Are Not Sandwiches

On the Friday before Memorial Day, Merriam-Webster sent out this tweet:

They linked to this post describing ten different kinds of sandwiches and asserted that “yes, the hot dog is one of them.” They say,

We know: the idea that a hot dog is a sandwich is heresy to some of you. But given that the definition of sandwich is “two or more slices of bread or a split roll having a filling in between,” there is no sensible way around it. If you want a meatball sandwich on a split roll to be a kind of sandwich, then you have to accept that a hot dog is also a kind of sandwich.

Predictably, the internet exploded.

Users took to Twitter with the hashtag #hotdogisnotasandwich to voice their disagreement. Numerous Twitter polls showed that anywhere from 75 to 90 percent of respondents agreed that the hot dog is not a sandwich. Meanwhile, Merriam-Webster’s Emily Brewster went on the podcast Judge John Hodgman to defend Merriam-Webster’s case. Part of her argument is that there’s historical evidence for the sandwich definition: in the early to mid-twentieth century, hot dogs were commonly called “hot dog sandwiches”. Jimmy Kimmel, on the other hand, took to his podium to make a more common-sense appeal:

That’s their definition. By my definition, a hot dog is a hot dog. It’s its own thing, with its own specialized bun. If you went in a restaurant and ordered a meat tube sandwich, would that make sense? No! They’d probably call the cops on you. I don’t care what anyone says—a hot dog is not a sandwich. And if hot dogs are sandwiches, then cereal is soup. Chew on that one for a while.

For reference, here’s Merriam-Webster’s definition of soup:

1 : a liquid food especially with a meat, fish, or vegetable stock as a base and often containing pieces of solid food

Read broadly, this definition does not exclude cold cereal from being a type of soup. Cereal is a liquid food containing pieces of solid food. It doesn’t have a meat, fish, or vegetable stock as a base, but the definition doesn’t strictly require that.

But we all know, of course, that cereal isn’t soup. Soup is usually (but not always) served hot, and it’s usually (but again, not always) savory and salty. It’s also usually eaten for lunch or dinner, while cereal is usually eaten for breakfast. But note how hard it is to write a definition that includes all things that are soup and excludes all things that aren’t.

My friend Mike Sakasegawa also noted the difficulty in writing a satisfactory definition of sandwich, saying, “Though it led me to the observation that sandwiches are like porn: you know it when you see it.” I said that this is key: “Just because you can’t write a single definition that includes all sandwiches and excludes all not-sandwiches doesn’t mean that the sandwich-like not-sandwiches are now sandwiches.” And Jesse Sheidlower, a former editor for the Oxford English Dictionary, concurred: “IOW, Lexicographer Fail.”

I wouldn’t put it that way, but, with apologies to my good friends at Merriam-Webster, I do think this is a case of reasoning from the definition. Lexicography’s primary aim is to describe how people use words, and people simply don’t use the word sandwich to refer to hot dogs. If someone said, “I’m making sandwiches—what kind would you like?” and you answered, “Hot dog, please,” they’d probably respond, “No, I’m making sandwiches, not hot dogs.” Whatever the history of the term, hot dogs are not considered sandwiches anymore. Use determines the definition, not the other way around. And definitions are by nature imperfect, unless you want to make them so long and detailed that they become encyclopedia entries.

So how can hot dogs fit the description of a sandwich but not be sandwiches? Easy. I propose that sandwiches are a paraphyletic group. A monophyletic group contains all the descendants of a common ancestor, but a paraphyletic group contains all descendants of a common ancestor with some exceptions. In biology, for example, mammals are a monophyletic group, because they contain all the descendants of the original proto-mammal. Reptiles, on the other hand, are an example of a paraphyletic group—the common ancestor of all reptiles is also the common ancestor of birds and mammals, but birds and mammals are not considered reptiles. Thus a chart showing the phylogenetic tree of reptiles has a couple of scallops cut out to exclude those branches.

Foods may not have ancestors in the same sense, but we can still construct a sort of phylogeny of sandwiches. Sandwiches include at least two main groups—those made with slices of bread and those made with a split bun or roll. Hot dogs would normally fall under the split-bun group, but instead they form their own separate category.

Proposed phylogeny of sandwiches
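
To make the paraphyletic idea concrete, here’s a toy sketch of that tree in code (my own illustration, not something from the post): hot dogs descend from the split-roll branch but are carved out of the “sandwich” label, just as birds are carved out of “reptiles.” The specific foods in the tree are placeholders.

```python
# A toy classification tree. Each node is a (name, children) pair.
# "Sandwich" is treated as a paraphyletic label: it covers every leaf of the
# tree except the branches listed in EXCLUDED, the way "reptile" covers the
# amniotes minus birds and mammals.
TREE = ("bread-based foods", [
    ("sliced-bread sandwiches", [
        ("BLT", []),
        ("grilled cheese", []),
    ]),
    ("split-roll sandwiches", [
        ("meatball sub", []),
        ("hot dog", []),   # descends from the split-roll branch...
    ]),
])

EXCLUDED = {"hot dog"}  # ...but is carved out of the "sandwich" group

def leaves(node):
    """Yield every leaf name under a node."""
    name, children = node
    if not children:
        yield name
    for child in children:
        yield from leaves(child)

def sandwiches(tree):
    """All leaves except the excluded branches: a paraphyletic group."""
    return [leaf for leaf in leaves(tree) if leaf not in EXCLUDED]

print(sandwiches(TREE))  # ['BLT', 'grilled cheese', 'meatball sub']
```

Membership in the group is defined by ancestry minus a short list of exceptions, which is all a paraphyletic grouping is; moving an item into or out of EXCLUDED is how you would redraw the line for gyros, hamburgers, or sloppy joes.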

Note that this sort of model is also quite flexible. Some people might consider gyros or shawarma sandwiches, but I would consider them a type of wrap. Some people might also consider hamburgers sandwiches but not hot dogs. Sloppy joes and loose meat sandwiches may be edge cases, falling somewhere between hamburgers and more traditional split-roll sandwiches. And in some countries, people might also say that the split-bun types aren’t sandwiches, preferring to simply call these rolls.

Wherever you draw the line, the important thing is that you can draw the line. Don’t let the dictionary boss you around, especially on such an important topic as sandwiches.

Book Review: Perfect English Grammar

Disclosure: I received a free review PDF of this book from Callisto Media.

Grant Barrett, cohost of the public radio program A Way with Words, recently published a book called Perfect English Grammar: The Indispensable Guide to Excellent Writing and Speaking. In it, Barrett sets out to help writers like himself who may not have gotten the best education in grammar or composition in school, ranging from middle-school students to “business professionals and community leaders who need a refresher on grammar points they last thought about decades ago.”

The book is designed as a reference book, something to be pulled out and consulted in those moments when you can’t remember the difference between a present perfect and a past perfect or between an initialism and an acronym. The book is well organized, with chapters like “Verbs” broken down into topics like person, number, mood, linking verbs, and so on. The different topics are also very clearly marked, with bold colors and clear headings that make it easy to flip through in case you’d rather browse than use the table of contents or index.

Barrett starts with some general principles of writing like writing for your audience rather than yourself, avoiding using a thesaurus to learn fancy new words, and sticking to whichever style guide is appropriate in your field. He then moves on to the basics of composition, with a reminder to be aware of register and some good tips for getting started if you’re feeling stuck.

One weak spot in the chapter on composition was the section on paragraph and essay structure. Though Barrett says that paragraphs don’t have to be a certain length, he says that a paragraph should have a topic sentence, supporting sentences, and a conclusion sentence, and he explains that the classic five-paragraph essay has a similar structure. I’ve never been a fan of the five-paragraph essay as a way to teach composition. Perhaps it’s a necessary stepping-stone on the way to better composition, but to me it always felt more like a straitjacket, designed to keep students from hurting themselves and their teachers. But the chapter ends with some good advice on writing transitions, avoiding common mistakes, and having your work edited.

The later chapters on parts of speech, spelling and style, and sentence structure provide helpful introductions or refreshers to the topics, and I like that Barrett uses more current linguistic terminology. For example, he talks about verb tense and aspect rather than just tense (though I think the explanation of aspect could have been a little clearer), and he groups articles, possessives, quantifiers, and others under determiners. He also defends the passive voice, saying, “Both active and passive voices are essential to everyday writing and speaking. Broadside suggestions that you should avoid the passive voice are misguided and should be ignored.”

Though his treatment of various aspects of grammar is sometimes a little brief, he uses grammar mostly as a way to talk about frequent problem areas for novice writers, and this is where the book is most valuable. You have to have at least a basic understanding of what an independent clause is before you can identify a comma splice, and you have to be able to identify a subject and verb and be aware of some common tricky areas before you can identify a subject-verb agreement problem.

However, I found a few pieces of usage advice a little less helpful. For instance, Barrett advocates the singular they (which I was happy to see) but warns against sentential hopefully—even though it is, as he says, fully grammatical—because some people have been erroneously taught to dislike it. He also recommends following the rule requiring the strict placement of only, which Jan Freeman (among others) addressed here. In that column, published in 2009, Freeman asked for readers to send her examples of truly ambiguous onlys. I was apparently the first person to send her such an example, nearly five years after her column was published.

Most of the usage advice, though, is solid, and some of it is even quite refreshing, like this passage in which he addresses the usual advice about avoiding adverbs: “There is nothing whatsoever intrinsically wrong with adverbs. In fact, avoiding them leads to bland, forgettable writing. You can and should use adverbs.” My biggest complaint with the chapter on usage and style is simply that it is too short; there are many more usage items that a novice writer may need help with that aren’t covered here.

Despite these quibbles, I think the book is full of good advice that will be helpful to both novices and more experienced writers who may need a refresher on basic topics of grammar, usage, and style.

On a Collision Course with Reality

In a blog post last month, John McIntyre took the editors of the AP Stylebook to task for some of the bad rules they enforce. One of these was the notion that “two objects must be in motion to collide, that a moving object cannot collide with a stationary object.” That is, according to the AP Stylebook, a car cannot collide with a tree, because the tree is not moving, and it can only collide with another car if that other car is moving. McIntyre notes that this rule is not supported by Fowler’s Modern English Usage or even mentioned in Garner’s Modern American Usage.

Merriam-Webster’s Dictionary of English Usage does have an entry for collide and notes that the rule is a tradition (read “invention”) of American newspaper editors. It’s not even clear where the rule came from or why; there’s nothing in the etymology of the word to suggest that only two objects in motion can collide. It comes from the Latin collidere, meaning “to strike together”, from com- “together” + laedere “to strike”.

The rule is not supported by traditional usage either. Speakers and writers of English have been using collide to refer to bodies that are not both in motion for as long as the word has been in use, which is roughly four hundred years. Nor is the rule an attempt to slow language change or hang on to a fading distinction; it’s an attempt to create a distinction and impose it on everyone who uses the language, or at least journalists.

What I found especially baffling was the discussion that took place on Mr. McIntyre’s Facebook page when he shared the link there. Several people chimed in to defend the rule, with one gentleman saying, “There’s an unnecessary ambiguity when ‘collides’ involves <2 moving objects.” Mr. McIntyre responded, “Only if you imagine one.” And this is key: collide is ambiguous only if you have been taught that it is ambiguous—or in other words, only if you’re a certain kind of journalist.

In that Facebook discussion, I wrote,

So the question is, is this actually a problem that needs to be solved? Are readers constantly left scratching their heads because they see “collided with a tree” and wonder how a tree could have been moving? If nobody has ever found such phrasing confusing, then insisting on different phrasing to avoid potential ambiguity is nothing but a waste of time. It’s a way to ensure that editors have work to do, not a way to ensure that editors are adding benefit for the readers.

The discussion thread petered out after that.

I’m generally skeptical of the usefulness of invented distinctions, but this one seems especially useless. When would it be important to distinguish between a crash involving two moving objects and one involving only one moving object? Wouldn’t it be clear from context anyway? And if it’s not clear from context, how on earth would we expect most readers—who have undoubtedly never heard of this journalistic shibboleth—to pick up on it? Should we avoid using words like crash or struck because they’re ambiguous in the same way—because they don’t tell us whether both objects were moving?

It doesn’t matter how rigorously you follow the rule in your own writing or in the writing you edit; if your readers think that collide is synonymous with crash, then they will assume that your variation between collide and crash is merely stylistic. They’ll have no idea that you’re trying to communicate something else. If it’s important, they’ll probably deduce from context whether both objects were moving, regardless of the word you use.

In other words, if an editor makes a distinction and no reader picks up on it, is it still useful?

Free Shipping

It’s that time again! Now through May 1, get free shipping on orders of two or more shirts from the Arrant Pedantry Store when you use the coupon code TWOWOO. Oh, and did I mention that I have a new shirt design? Check it out!
