Arrant Pedantry


Names, Spelling, and Style

A couple of weeks ago, I had a conversation with Mededitor on Twitter about name spelling and style. It started with a tweet from Grammar Girl linking to an old post of hers on whether you need a comma before “Jr.” She notes that most style guides now leave out the commas. Mededitor opined that the owners of the names, not editors, should get to decide whether or not to use commas. In this follow-up post, Grammar Girl seems to come to the same conclusion:

However, Chicago also states that writers should make a reasonable effort to spell a name the way a person spells it himself or herself, and I presume that also applies to punctuation. In other words, you’re free to insist on the comma before “Jr.” in your own name.

I can see the appeal in this argument, but I have to disagree. As I argued on Twitter and in a comment on that second post, catering to authors’ preferences for commas around “Jr.” creates inconsistency in the text. And it wouldn’t just be authors themselves that we’d have to cater to; what about people mentioned or cited in the text? Should editors spend time tracking down every Jr. or III whose name appears in writing to ask whether they prefer to have their suffixes set off with commas?

Doing so could take enormous amounts of time, and in the end there’s no benefit to the reader (and possibly a detriment in the form of distracting inconsistency), only to some authors’ egos. Further, we’d have to create a style anyway and apply it to all those who had no preference or whose preferences could not be identified. Why pick an arbitrary style for some names and not others? Either the preference matters or it doesn’t. And if it doesn’t matter, that’s what a style choice is for: to save us from wasting our time making countless minor decisions.

But I have a further reason for not wishing to defer to authors’ preferences. As I argued in that same comment, punctuation is not the same thing as spelling. There’s one right way to spell my name: Jonathon Owen. If you write my name Jonathan Owens, you’ve spelled it wrong. There’s no principled reason for spelling it one way or another; that’s just the way it is. But punctuation marks aren’t really part of someone’s name; they’re merely stylistic elements placed between or around the parts of people’s names to separate, abbreviate, or join them.

Punctuation around or in names, however, is often principled, though the principles of punctuation are prone to change over time. “Jr.” was traditionally set off by commas not because the commas were officially part of anyone’s name, but because it was considered parenthetic. As punctuation has become more streamlined, the requirement to set off this particular parenthetic with commas has been dropped by most style guides. And to be blunt, I think the desire of some authors to hang on to the commas is driven mostly by a desire to stick with whatever style they grew up with. It’s not much different from some people’s resistance to switching to one space between sentences.

In the course of the conversation with Mededitor, another point came up: periods after middle initials that don’t stand for anything. Some people insist that you shouldn’t use a period in those cases, because the period signals that the letter is an abbreviation, but The Chicago Manual of Style recommends using a period in all cases regardless. Again, it’s difficult for editors and proofreaders to check and enforce proper punctuation after an initial, and the result is a style that looks inconsistent to the readers. And again, individuals’ preferences are not always clear. Even one of the most famous individuals with only a middle initial, Harry S. Truman, wrote his name inconsistently, as the Harry S. Truman Library points out.

Yes, it’s true that editors can add a list of names to their style sheets to save some time, but checking every single name with an initial against a style sheet—and then looking them up if they’re not on the sheet—still takes time. And what’s the result? Names that occasionally look like they’re simply missing a period after the initial, because the reader will generally have no idea that there’s a reason behind the omission. The result is an error in most readers’ eyes, except for those few in the know.

The fundamental problem with making exceptions to general rules is that readers often have no idea that there are principled reasons behind the exceptions. If they see an apparent inconsistency and can’t quickly figure out a reason for it, then they’ve been needlessly distracted. Does the supposed good done by catering to some individuals’ preference for commas or periods around their names outweigh the harm done by presenting readers with the appearance of sloppiness?

I don’t think it does, and this is why I agree with Chicago. I think it’s best—both for editors and for readers—to pick a rule and stick with it.

Update: Mededitor posted a response here, and I want to reply and clarify some of the points I made. In that post he says, “I argue for the traditional rule, namely: ‘Make a reasonable attempt to accommodate the conventions by which people spell their own names.’” I want to make it clear that I’m also arguing for the traditional rule. I’m not saying that editors should not worry about the spelling of names. I simply disagree that commas and periods should be considered spelling.

With the exception of apostrophes and hyphens, punctuation is a matter of style, not spelling. The comma in Salt Lake City, Utah is not part of the spelling of the place name; it simply separates the two elements of the name, just as the now-deprecated comma before “Jr.” separates it from the given and family names. Note that the commas disappear if you use one element by itself, and other commas can appear in other contexts, such as when a name is inverted: “Jonathon Owen” becomes “Owen, Jonathon” in an index. This comma is also not part of the spelling of my name; it’s just a piece of punctuation. It’s a style choice.

And those style choices vary and change over time. In the UK, it’s standard practice to omit periods from abbreviations. Thus I’d be Jonathon R Owen in British style. The period in American style is not an element of my middle name that appears when it’s shortened—it’s a style choice that communicates something about my name. But the important thing is that it’s a choice. You can’t choose how to spell my name (though plenty of people have told me that I spell it wrong). But you can choose how to punctuate it to fit a given style.


Funner Grammar

As I said in the addendum to my last post, maybe I’m not so ready to abandon the technical definition of grammar. In a recent post on Copyediting, Andrea Altenburg criticized the word funner in an ad for Chuck E. Cheese as “improper grammar”, and my first reaction was “That’s not grammar!”

That’s not entirely accurate, of course, as Matt Gordon pointed out to me on Twitter. The objection to funner was originally grammatical, and the Copyediting post does make an appeal to grammar. The argument goes like this: fun is properly a noun, not an adjective, and as a noun, it can’t take comparative or superlative degrees—no funner or funnest.

This seems like a fairly reasonable argument—if a word isn’t an adjective, it can’t inflect like one—but it isn’t the real argument. First of all, it’s not really true that fun was originally a noun. As Ben Zimmer explains in “Dear Apple: Stop the Funnification”, the noun fun arose in the late seventeenth century and was labeled by Samuel Johnson in the mid-1700s “as ‘a low cant word’ of the criminal underworld.” But the earliest citation for fun is as a verb, fourteen years earlier.

As Merriam-Webster’s Dictionary of English Usage notes, “A couple [of usage commentators] who dislike it themselves still note how nouns have a way of turning into adjectives in English.” Indeed, this sort of functional shift—also called zero derivation or conversion by linguists because it changes a word’s part of speech without prefixation or suffixation—is quite common in English. English lacks case endings and has little in the way of verbal endings, so it’s quite easy to change a word from one part of speech to another. The transformation of fun from a verb to a noun to an inflected adjective came slowly but surely.

As this great article explains, shifts in function or meaning usually happen in small steps. Once fun was established as a noun, you could say things like We had fun. This is unambiguously a noun—fun is the object of the verb have. But then you get constructions like The party was fun. This is structurally ambiguous—both nouns and adjectives can go in the slot after was.

This paves the way for analyzing fun as an adjective. It then moved into attributive use, directly modifying a following noun, as in fun fair. Nouns can do this too, so once again the structure was ambiguous, but it was evidence that fun was moving further in the direction of becoming an adjective. In the twentieth century it started to be used in more unambiguously adjectival roles. MWDEU says that this accelerated after World War II, and Mark Davies’ COHA shows that it especially picked up in the last twenty years.

Once fun was firmly established as an adjective, the inflected forms funner and funnest followed naturally. There are only a handful of hits for either in COCA, which attests to the fact that they’re still fairly new and relatively colloquial. But let’s get back to Altenburg’s post.

She says that fun is defined as a noun and thus can’t be inflected for comparative or superlative forms, but then she admits that dictionaries also define fun as an adjective with the forms funner and funnest. But she waves away these definitions by saying, “However, dictionaries are starting to include more definitions for slang that are still not words to the true copyeditor.”

What this means is that she really isn’t objecting to funner on grammatical grounds (at least not in the technical sense); her argument simply reduces to an assertion that funner isn’t a word. But as Stan Carey so excellently argued, “‘Not a word’ is not an argument”. And even the grammatical objections are eroding; many people now simply assert that funner is wrong, even if they accept fun as an adjective, as Grammar Girl says here:

Yet, even people who accept that “fun” is an adjective are unlikely to embrace “funner” and “funnest.” It seems as if language mavens haven’t truly gotten over their irritation that “fun” has become an adjective, and they’ve decided to dig in their heels against “funner” and “funnest.”

It brings to mind the objection against sentential hopefully. Even though there’s nothing wrong with sentence adverbs or with hopefully per se, it was a new usage that drew the ire of the mavens. The grammatical argument against it was essentially a post hoc justification for a ban on a word they didn’t like.

The same thing has happened with funner. It’s perfectly grammatical in the sense that it’s a well-formed, meaningful word, but it’s fairly new and still highly informal and colloquial. (For the record, it’s not slang, either, but that’s a post for another day.) If you don’t want to use it, that’s your right, but stop saying that it’s not a word.


It’s Not Wrong, but You Still Shouldn’t Do It

A couple of weeks ago, in my post “The Value of Prescriptivism,” I mentioned some strange reasoning that I wanted to talk about later—the idea that there are many usages that are not technically wrong, but you should still avoid them because other people think they’re wrong. I used the example of a Grammar Girl post on hopefully wherein she lays out the arguments in favor of disjunct hopefully and debunks some of the arguments against it—and then advises, “I still have to say, don’t do it.” She then adds, however, “I am hopeful that starting a sentence with hopefully will become more acceptable in the future.”

On the face of it, this seems like a pretty reasonable approach. Sometimes the considerations of the reader have to take precedence over the facts of usage. If the majority of your readers will object to your word choice, then it may be wise to pick a different word. But there’s a different way to look at this, which is that the misinformed opinions of a very small but very vocal subset of readers take precedence over the facts and the opinions of others. Arnold Zwicky wrote about this phenomenon a few years ago in a Language Log post titled “Crazies win”.

Addressing split infinitives and the equivocal advice to avoid them unless it’s better not to, Zwicky says that “in practice, [split infinitive as last resort] is scarcely an improvement over [no split infinitives] and in fact works to preserve the belief that split infinitives are tainted in some way.” He then adds that the “only intellectually justifiable advice” is to “say flatly that there’s nothing wrong with split infinitives and you should use them whenever they suit you”. I agree wholeheartedly, and I’ll explain why.

The problem with the it’s-not-wrong-but-don’t-do-it philosophy is that, while it feels like a moderate, open-minded, and more descriptivist approach in theory, it is virtually indistinguishable from the it’s-wrong-so-don’t-do-it philosophy in practice. You can cite all the linguistic evidence you want, but it’s still trumped by the fact that you’d rather avoid annoying that small subset of readers. It pays lip service to the idea of descriptivism informing your prescriptions, but the prescription is effectively the same. All you’ve changed is the justification for avoiding the usage.

Even a more neutral and descriptive piece like this New York Times “On Language” article on singular they ends with a wistful, “It’s a shame that grammarians ever took umbrage at the singular they,” adding, “Like it or not, the universal they isn’t universally accepted — yet. Its fate is now in the hands of the jury, the people who speak the language.” Even though the authors seem to be avoiding giving out advice, it’s still implicit in the conclusion. It’s great to inform readers about the history of usage debates, but what readers will most likely come away with is the conclusion that it’s wrong—or at least tainted—so they shouldn’t use it.

The worst thing about this waffly kind of advice, I think, is that it lets usage commentators duck responsibility for influencing usage. They tell you all the reasons why it should be alright to use hopefully or split infinitives or singular they, but then they sigh and put them away in the linguistic hope chest, telling you that you can’t use them yet, but maybe someday. Well, when? If all the usage commentators are saying, “It’s not acceptable yet,” at what point are they going to decide that it suddenly is acceptable? If you always defer to the peevers and crazies, it will never be acceptable (unless they all happen to die off without transmitting their ideas to the next generation).

And furthermore, I’m not sure it’s a worthwhile endeavor to try to avoid offending or annoying anyone in your writing. It reminds me of Aesop’s fable of the man, the boy, and the donkey: people will always find something to criticize, so it’s impossible to behave (or write) in such a way as to always avoid criticism. As the old man at the end says, “Please all, and you will please none.” You can’t please everyone, so you have to make a choice: will you please the small but vocal peevers, or the more numerous reasonable people? If you believe there’s nothing technically wrong with hopefully or singular they, maybe you should stand by those beliefs instead of caving to the critics. And perhaps through your reasonable but firm advice and your own exemplary writing, you’ll help a few of those crazies come around.
