Arrant Pedantry

Contest Reminders

Just a reminder that my blog is currently competing in Grammar.net’s Best Grammar Blog of 2011 contest. Arrant Pedantry is in third place. If you like my blog, please go vote.

Also, the deadline for submissions for my own contest sponsored by Stack Exchange English Language and Usage is fast approaching. Submit an idea for a future post here on Arrant Pedantry, and you’ll be entered to win either a new Kindle 3G or a copy of Robert Lane Greene’s You Are What You Speak: Grammar Grouches, Language Laws, and the Politics of Identity. Post a comment on that post or send me a tweet @ArrantPedantry. The last day for entries is September 30th.

It’s Not Wrong, but You Still Shouldn’t Do It

A couple of weeks ago, in my post “The Value of Prescriptivism,” I mentioned some strange reasoning that I wanted to talk about later—the idea that there are many usages that are not technically wrong, but you should still avoid them because other people think they’re wrong. I used the example of a Grammar Girl post on hopefully wherein she lays out the arguments in favor of disjunct hopefully and debunks some of the arguments against it—and then advises, “I still have to say, don’t do it.” She then adds, however, “I am hopeful that starting a sentence with hopefully will become more acceptable in the future.”

On the face of it, this seems like a pretty reasonable approach. Sometimes the considerations of the reader have to take precedence over the facts of usage. If the majority of your readers will object to your word choice, then it may be wise to pick a different word. But there’s a different way to look at this, which is that the misinformed opinions of a very small but very vocal subset of readers take precedence over the facts and the opinions of others. Arnold Zwicky wrote about this phenomenon a few years ago in a Language Log post titled “Crazies win”.

Addressing split infinitives and the equivocal advice to avoid them unless it’s better not to, Zwicky says that “in practice, [split infinitive as last resort] is scarcely an improvement over [no split infinitives] and in fact works to preserve the belief that split infinitives are tainted in some way.” He then adds that the “only intellectually justifiable advice” is to “say flatly that there’s nothing wrong with split infinitives and you should use them whenever they suit you”. I agree wholeheartedly, and I’ll explain why.

The problem with the it’s-not-wrong-but-don’t-do-it philosophy is that, while it feels like a moderate, open-minded, and more descriptivist approach in theory, it is virtually indistinguishable from the it’s-wrong-so-don’t-do-it philosophy in practice. You can cite all the linguistic evidence you want, but it’s still trumped by the fact that you’d rather avoid annoying that small subset of readers. It pays lip service to the idea of descriptivism informing your prescriptions, but the prescription is effectively the same. All you’ve changed is the justification for avoiding the usage.

Even more neutral and descriptive pieces, like this New York Times “On Language” article on singular they, end with a wistful, “It’s a shame that grammarians ever took umbrage at the singular they,” adding, “Like it or not, the universal they isn’t universally accepted — yet. Its fate is now in the hands of the jury, the people who speak the language.” Even though the authors seem to be avoiding giving out advice, it’s still implicit in the conclusion. It’s great to inform readers about the history of usage debates, but what they’ll most likely come away with is the conclusion that it’s wrong—or at least tainted—so they shouldn’t use it.

The worst thing about this waffly kind of advice, I think, is that it lets usage commentators duck responsibility for influencing usage. They tell you all the reasons why it should be alright to use hopefully or split infinitives or singular they, but then they sigh and put them away in the linguistic hope chest, telling you that you can’t use them yet, but maybe someday. Well, when? If all the usage commentators are saying, “It’s not acceptable yet,” at what point are they going to decide that it suddenly is acceptable? If you always defer to the peevers and crazies, it will never be acceptable (unless they all happen to die off without transmitting their ideas to the next generation).

And furthermore, I’m not sure it’s a worthwhile endeavor to try to avoid offending or annoying anyone in your writing. It reminds me of Aesop’s fable of the man, the boy, and the donkey: people will always find something to criticize, so it’s impossible to behave (or write) in such a way as to always avoid criticism. As the old man at the end says, “Please all, and you will please none.” You can’t please everyone, so you have to make a choice: will you please the small but vocal peevers, or the more numerous reasonable people? If you believe there’s nothing technically wrong with hopefully or singular they, maybe you should stand by those beliefs instead of caving to the critics. And perhaps through your reasonable but firm advice and your own exemplary writing, you’ll help a few of those crazies come around.

Contests!

Topic Contest

I’m very pleased to announce the first-ever contest here at Arrant Pedantry, sponsored by the generous folks at Stack Exchange English Language and Usage. The first-prize winner will receive a new Kindle 3G.

A Word from Our Sponsor

Stack Exchange English Language and Usage is a collaborative, community-driven site focused on questions about grammar, etymology, usage, dialects, and other aspects of the English language. For example, you can ask about the pronunciation of the names of the letters of the alphabet, the appropriate use of the semicolon, or the factual basis for pirate speech (appropriate for yesterday’s Talk Like a Pirate Day).

Stack Exchange English Language and Usage is a great resource for people looking for answers to those often obscure questions about language that we all have from time to time. Stack Exchange features an involved community of language experts, amateurs, and enthusiasts who are willing and able to tackle questions on a variety of topics. Please go check it out, and consider following StackEnglish on Twitter.

The Rules

And now on to business. To enter, submit a request for a future topic you’d like to see covered here on Arrant Pedantry. It can be a question about usage, etymology, how I can call myself an editor when I think a lot of the rules are bogus—whatever you want. (Keep it civil, of course.) Post your request either in the comments below or on Twitter @ArrantPedantry. I’ll pick the two best suggestions and write a post on each of them. One lucky winner will receive the grand prize of a new Kindle 3G; one slightly less lucky winner will receive a copy of Robert Lane Greene’s You Are What You Speak: Grammar Grouches, Language Laws, and the Politics of Identity (on which I’ll try to write a review sometime soon).

The deadline for entries is Friday, September 30th. Only contestants in the continental US, Canada, and Western Europe are eligible. Employees of Stack Exchange and my relatives are not eligible. Spread the word!

And while you’re at it, check out the limerick contest at Sentence First, also sponsored by Stack Exchange English Language and Usage.

Addendum: My blog is currently getting bombarded by spammers, so if your comment doesn’t go through for some reason, please let me know through the contact page or by direct message on Twitter.

Update: The contest is now closed to submissions. I’ll go over all of them and announce the winners soon.

Best Grammar Blog of 2011

As you may have noticed, my blog has been preselected as a finalist for Grammar.net’s Best Grammar Blog of 2011 contest. I’m up against some excellent grammar and language blogs, so I’m honored to have been chosen. Voting for this contest starts on September 26th and runs through October 17th. If you enjoy my blog, please go and vote!

What Is a Namesake?

I just came across the sentence “George A. Smith became the namesake for St. George, Utah” while editing. A previous editor had changed it to “In 1861 St. George, Utah, became the namesake of George A. Smith.” Slightly awkward wording aside, I preferred the unedited form. Apparently, though, this is an issue of divided usage, with some saying that a namesake is named after someone else, some saying that a namesake is someone after whom someone else is named, some saying that both are correct, and some saying that namesakes simply share the same name without one being named after the other.

But I’d like to get a better idea of which definitions are most common, so I’m putting up this nice little poll. Let me know your feelings on the matter, and feel free to explain your vote in the comments below.

[poll id="2"]

Smelly Grammar

Earlier today on Twitter, Mark Allen posted a link to this column on the Columbia Journalism Review’s website about a few points of usage. It begins with a familiar anecdote about dictionary maker Samuel Johnson and proceeds to analyze the grammar and usage of the exchange between him and an unidentified woman.

Pretty quickly, though, the grammatical analysis goes astray. The author says that in Johnson’s time, the proper use of smell was as an intransitive verb, hence Johnson’s gentle but clever reproach. But the woman did indeed use smell as an intransitive verb—note that she didn’t say “I smell you“—so that can’t possibly be the reason why Johnson objected to it. And furthermore, the OED gives both transitive and intransitive senses of the verb smell tracing back to the late 1100s and early 1200s.

Johnson’s own dictionary simply defines smell as “to perceive by the nose” but does not say anything about transitivity. But note that it only identifies the perception of smell and not the production of it. Johnson produced a smell; the lady perceived it. Perhaps this is what his repartee was about, not the verb’s transitivity but who its subject was. But even this doesn’t hold up against the evidence: the OED lists both the “perceive an odor” and “emit an odor” senses, dating to 1200 and 1175, respectively. And the more specific sense of “emit an unpleasant odor” dates to 1400. By Johnson’s day, English speakers had been saying “You smell” to mean “You stink” for at least three hundred years. Merriam-Webster’s Dictionary of English Usage says nothing on this point, though it’s possible that other usage guides have addressed it.

But perhaps the biggest problem with the story is that I can’t find an attestation of it earlier than 1950 in Google Books. (If you can find an earlier one, let me know in the comments.) This anecdote seems more like a modern fabrication about a spurious point of usage than a real story that encapsulates an example of language change. But the most disappointing thing about the Columbia Journalism Review piece is its sloppy grammatical analysis. Transitivity is a pretty basic concept in grammar, but the author consistently gets it wrong; she’s really talking about thematic roles. And the historical facts of usage don’t line up with the argument, either.

I’m sure some of you are thinking, “But you’re missing the point! The point is that good usage matters.” But my point is that the facts matter, too, and you can’t talk about good usage without being aware of the facts. You can’t come to a better understanding of the truth by combining apocryphal anecdotes with a little misguided grammatical analysis. The sad truth is that an awful lot of usage commentators really don’t understand the grammatical points on which they comment, and I think that’s unfortunate, because understanding those points gives one better tools with which to analyze real usage.

The Value of Prescriptivism

Last week I asked rather skeptically whether prescriptivism had moral worth. John McIntyre was intrigued by my question and musing in the last paragraph, and he took up the question (quite admirably, as always) and responded with his own thoughts on prescriptivism. What I see in his post is neither a coherent principle nor an innately moral argument, as Hart argued, but rather a set of sometimes-contradictory principles mixed with personal taste—and I think that’s okay.

Even Hart’s coherent principle is far from coherent when you break it down. The “clarity, precision, subtlety, nuance, and poetic richness” that he touts are really a bundle of conflicting goals. Clear wording may come at the expense of precision, subtlety, and nuance. Subtlety may not be very clear or precise. And so on. And even if these are all worthy goals, there may be many more that are missing.

McIntyre notes several more goals for practical prescriptivists like editors, including effectiveness, respect for an author’s voice, consistency with a set house style, and consideration of reader reactions, which is a quagmire in its own right. As McIntyre notes, some readers may have fits when they see sentence-disjunct “hopefully”, while other readers may find workarounds like “it is to be hoped that” to be stilted.

Of course, any appeal to the preferences of the reader (which is, in a way, more of a construct than a real entity) still requires decision making: which readers are you appealing to? Many of those who give usage advice seem to defer to the sticklers and pedants, even when it can be shown that they’re pretty clearly wrong or at least holding to outdated and somewhat silly notions. Grammar Girl, for example, guides readers through the arguments for and against “hopefully”, repeatedly saying that she hopes it becomes acceptable someday (note how carefully she avoids using “hopefully” herself, even though she claims to support it) but ultimately shies away from the usage, saying that you should avoid it for now because it’s not acceptable yet. (I’ll write about the strange reasoning presented here some other time.)

But whether or not you give in to the pedants and cranks who write angry letters to lecture you on split infinitives and stranded prepositions, it’s still clear that there’s value in considering the reader’s wishes while writing and editing. The author wants to communicate something to an audience; the audience presumably wants to receive that communication. It’s in both parties’ best interests if that communication goes off without a hitch, which is where prescriptivism can come in.

As McIntyre already said, this doesn’t give you an instant answer to every question, but it can give you some methods of gauging roughly how acceptable certain words or constructions are. Ben Yagoda provides his own “somewhat arbitrary metric” for deciding when to fight for a traditional meaning and when to let it go. But the key word here is “arbitrary”; there is no absolute truth in usage, no clear, authoritative source to which you can appeal to solve these questions.

Nevertheless, I believe the prescriptive motivation—the desire to make our language as good as it can be—is, at its core, a healthy one. It leads us to strive for clear and effective communication. It leads us to seek out good language to use as a model. And it slows language change and helps to ensure that writing will be more understandable to audiences that are removed spatially and temporally. But when you try to turn this into a coherent principle to instruct writers on individual points of usage, like transpire or aggravate or enormity, well, then you start running into trouble, because that approach favors fiat over reason and evidence. But I think that an interest in clear and effective language, tempered with a healthy dose of facts and an acknowledgement that the real truth is often messy, can be a boon to all involved.