Posted on

Word Nerd: The Plural of “Referendum”

Hola nerds! I’m back!

And I’m doing a totally apolitical post with no agenda whatsoever about the plural of “referendum.”*

You may have encountered this (generally spirited but good-natured) exchange in recent months, for obvious reasons. The English-conforming “referendums” feels like the younger, more freewheeling use, while the Latinate “referenda” seems more apposite to older, more formal modes of speech (although those with a legal background are far more likely to use the former than the latter). So which is it, and why?

The answer is, in fact, “referendums,” but the why is the more interesting bit.

Plurals in English

So the first thing to do is talk about plurals in general, which means talking about inflection (as I have previously). Old English had a bunch of different ways to decline nouns: the plural of stān (stone) is stānas, the plural of sċip (ship) is sċipu, ƿudu (wood) is ƿuda, mann (person) is menn, bōc (book) is bēċ and nama (name) is naman – that comes to four plural suffixes and two stem vowel-shifts, and that’s just counting the regular forms. These endings depended on a bunch of implicit rules of gender and context that are largely lost on modern English readers.

But by Middle English, this had simplified so that most nouns pluralised in only two ways: the “strong”† -es ending, turning engel (angel) into engeles, and the “weak” -en ending, turning name (name) into namen. By Modern English, the -en ending has all but disappeared, leaving just a handful of words,‡ while the -es ending (or more usually just -s) has more or less overtaken the language. Since then, pretty much all new nouns that have formed or been imported have adopted the same ending. As of now, aside from the aforementioned -en endings and a handful of oddities that change the stem (teeth, geese, mice, etc.) or don’t change at all (like sheep and fish**), English nouns overwhelmingly pluralise with some form of -s ending.
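For the programmers in the audience, the modern situation can be sketched as a toy rule set: an overwhelming -s/-es default plus a shrinking lookup table of survivals. This is my own illustrative sketch, not a serious morphology engine – the rules and word lists are deliberately incomplete.

```python
# Toy sketch of modern English pluralisation: a dwindling table of survivals,
# then an overwhelming -s/-es default. Illustrative only, not exhaustive.

IRREGULAR = {
    "man": "men", "woman": "women", "tooth": "teeth", "goose": "geese",
    "mouse": "mice", "ox": "oxen", "child": "children",   # -en and stem-shift survivals
    "sheep": "sheep", "fish": "fish",                     # unchanged plurals
}

VOWELS = "aeiou"

def pluralise(noun: str) -> str:
    if noun in IRREGULAR:
        return IRREGULAR[noun]
    if noun.endswith(("s", "x", "z", "ch", "sh")):
        return noun + "es"                  # glasses, boxes, churches
    if noun.endswith("y") and noun[-2] not in VOWELS:
        return noun[:-1] + "ies"            # city -> cities (but day -> days)
    return noun + "s"                       # the overwhelming default

print(pluralise("stone"))   # stones
print(pluralise("church"))  # churches
print(pluralise("child"))   # children
```

Note how small the exception table is compared with Old English’s web of gendered declensions: everything not explicitly listed just falls through to -s.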

But Latin!

But then in weigh our eighteenth-century grammarian buddies. They felt that words imported from Ancient Greek, Classical Latin or Modern French should absolutely retain their native plurals: cactus should become cacti, oasis should go to oases, gateau becomes gateaux. And most of these usages stuck, becoming part of that great artificial “proper English” you were taught at school.

And in some cases, I can see it. It may be a practical, even necessary acknowledgement of the complexities of pronunciation; speaking as someone with a lisp, oasises is a hissing nightmare, and the more elegant oases is a welcome tonic. In scientific or technical writing, the original Latin may be exactly intended; I have no objection to the use of bacterium and bacteria in a medical context.

But by and large, it’s a rule that exists simply because it exists; we should, we are told, learn obscure foreign inflections just because our forebears have decided it’s to be done.

The Case Against

But I don’t buy it. Frankly, most of the time, it’s obtrusive and inelegant, and a practice that should arguably be put to rest. For a few reasons:

  • It’s classist. Like all grammatical prescriptivism, this is a practice meant to signal the quality of your education. Using Latin and Greek grammatical forms shows that you’ve learned Latin and Greek – or at least learned English from someone aware of the rules of Latin and Greek. These rules were laid down at a time when the wealthy middle-class were becoming increasingly educated, and were keen to show it, to earn recognition among their higher-born peers and to distinguish themselves from their poorer neighbours.
  • It’s colonialist! Weren’t expecting that, were ya? But tell me, why is it these specific languages are given this privilege? The word orang-utan comes from Malay; obeying Malay grammatical rules should give us the plural orang-orang-utan. And mongoose comes from Marathi; the plural should be (roughly) mongoosay. The word horde is Polish originally, suggesting the plural hordy. If we’re not going to obey those rules, it seems strange to insist on making an exception for Latin and Greek, and there’s no defence that doesn’t boil down to “we consider these cultures to be better than those cultures.”
  • It’s unnecessary. Look, we already have -s (and -es, not to mention -oes, -ies and -ves), -en, teeth, geese, mice, sheep and god knows what else to try and remember, just in English; why the funk do we need to add more to this nonsense? I’m gonna take my daughter to as many aquariums as I want, with no minimums or maximums whatsoever – there are no formulas to this decision – and you just understood every single word of that sentence whether it upset your frail grammatical sensibilities or not.

And heck, even if you’re determined to preserve the Precious Latin, a bunch of these presumed plurals are wrong, for several reasons:

  • There’s no plural in the original. Virus, in Latin, is an uncountable noun; it means “venom,” and like other liquids, has no number (if you have a glass of water, and I have a glass twice as large, I don’t have “two waters”; I just have more water). In early use, science writers tried to get viri to stick, but if there’s no plural in the original Latin, there’s nothing to import.
  • The word has changed its part of speech. It’s time to accept that agenda is a singular noun in English. Agendum (“[that] which is to be done”) isn’t really a noun but a gerundive, a repurposed verb serving more as an adjective than anything else. Fancypants clerks would head a list of tasks agenda in Latin (i.e. “[those things] which are to be done”) in the same way we’d write “to do” now, and the word entered English as a singular noun, referring to the list rather than to the tasks. As a new word doing a new job, the plural should naturally be agendas. (We should also accept that data has made the same journey, and is a perfectly good singular noun.)

As always, if you want to argue with me, or to chat about this shit, or to propose a topic for a future blog, let me know! Tweet us; Facebook us; let’s have an argument/chat.

___

*No, seriously, this isn’t about another referendum, it’s about the word.

†Pay no attention to “weak” and “strong.” The terms were used in German grammar to distinguish types of verbs (whether they change the stem in conjugation, or just the suffix), then mapped onto nouns, then spread into other languages, where they’re used with no particular consistency. We should probably just do away with the terms.

‡If you’re interested: oxen, children,⁋ men, women, brethren and oddly swine (which used to be spelled sweyen).

⁋For extra fun, children is actually a sort of “double plural”; the older plural was childer, using a super-rare -er form, but -er seems to have at some point picked up an extraneous -en, possibly by confusion with brethren (although the -er in brother isn’t a suffix at all).

**The singular and plural of “sheep” have been identical since the Old English sċēap, but I have no idea what happened with “fish” (fisċ), which used to perfectly happily take the plural “fishes” (fisċas).

Posted on

Word Nerd: Quite Impossible, Really

Ho there, nerds!

So, been a while since you had a column out of me, but something came up on Twitter this morning which I thought would make a quite perfect Word Nerd, so here I am.

It started on Saturday, when journalist Holly Brockwell (who you should really follow on Twitter, she’s very good value) shared the news that quite is a transatlantic contranym: an intensifier in US English, but a diminisher in UK English.

My bookish Twitter feed caught up to the post this morning, and much of the reaction has been hilarity or outrage – including a few people from both sides of the pond who’ve just had a whole raft of prior exchanges with their transatlantic peers radically reframed – and inevitably a few people who knew about the disparity and were keen to offer other examples or context.

But what interests me (as always) is why this difference exists.

Quietus to Quit

The origin of the word quite is in the Middle English quit, which comes ultimately from the Latin quietus, meaning “at rest”* or “absolved.” It came to English (via French) with the latter sense, and was used, for example, for paying off your debts (a ceremony called “Quit Rents” takes place in the City of London to this very day), leaving a place, freeing a prisoner, forgiving someone an obligation or (most commonly now) resigning a job or leaving a group.

So while the original sense of the word suggests specifically forgiveness or absolution, it came gradually to carry a sense of completeness, of a thing finished.†

Quit to Quite

And it’s in this sense that we encounter it in the modern quite. Because the reason it serves as an intensifier in US English but as a diminisher‡ in UK English is that it isn’t really either! What it actually does, following the earlier sense of quit as “completed,” is convey absoluteness.** Consider phrases like “quite right,” still used on both sides of the Atlantic; quite in this phrase is meant to suggest that a thing is definitely right.

Quite to “Quite”

But absoluteness isn’t just exactly the same thing as emphasis (although it’s often used that way, as in literally). Absoluteness is, by definition, binary, not relative: if I describe you as “absolutely tall,” then I’m not saying that you are very tall, only that there can be no doubting your tallness.

And in some way this has informed the common (although not universal) British usage of the word quite: if a British speaker says someone is “quite clever,” they’re saying that the person is exactly clever enough that no-one could reasonably deny their cleverness, but no cleverer than that. Whereas the US usage has followed the path of absolutes like really and literally to become an intensifier.

An entirely fascinating example of divergence creating a contranym, and I’m delighted to have had the chance to talk about it here.

Cheers and see you next time!

As always, if you want to argue with me, or to chat about this shit, or to propose a topic for a future blog, let me know! Tweet us; Facebook us; let’s have an argument/chat.

___

*Sometimes, poetically, “death,” as in Hamlet.

†Perhaps echoing the above association?

‡I know this isn’t a word, but as far as I can tell, the word that conveys what this is intended to convey does not exist in English, which is utterly infuriating and I demand my money back.

**I’ve just realised it used to convey absolution and now conveys absoluteness and I’m delighted.

Posted on

Word Nerd: It’s All Getting a Bit Tense

Hola nerds!

Sorry I’m a bit late this month; been on crunch time for a bunch of stuff. But here I am!

Anyway, this one’s another request, from Twitter user @thiefree (hi Anna!). I’m going to break with form* for this one: rather than tackle a bit of pedantry, I’ve been asked to do a bit of a breakdown on terminology. Specifically, the thorny issue of English tenses.

“Tenses?” I hear you cry. “What’s the big deal? There’s still only like three of them, right? That’s not going to make much of a column.”

Oh, if only.

Of course, English isn’t an unusually or extravagantly tensed language – other languages have tenses, like the French compound past, the Swahili perfect of present state, the Swedish perfect inferential and the Alicante perfect of very-recent-past, that we just don’t – but in turn we have tenses other languages don’t, and actually it seems tense is a complicated subject all round; there are generally a dozen or more tenses in any language. Human beings have a complex and powerful relationship with time, and have sought ways to make their languages jump and leap around, conveying to each other that (for example) something has already happened, has been happening for a while, was happening for a while but stopped when something else happened, will have been happening for a while at some point in the future when something else might happen but might not unless something else happens,† and endless other varieties.

Grammarians entertain themselves by trying to tease these constructions out and coining terms for them, and it’s possibly an endless task; it turns out we’re still at it (more below).

And it turns out that tense isn’t really best described as a list of time frames, but a sort of matrix‡ of time-relative ideas, or perhaps a toolbox for building relations with time; a toolbox we’re still learning about.

So let’s unpack how we construct tense.

Fair warning: this is a blog, in part, about terminology, and if there’s one thing grammarians suck at, it’s agreeing on terminology. There’s a fair amount of “also known as” in grammatical terminology, and some of you will disagree with the terms I prefer. And I can’t really help you with that.

What’s in a Tense?

First of all, it’s worth understanding that conjugating a verb for tense isn’t necessarily done just by tweaking the word;** just as often, we combine it with one or more auxiliary verbs or copulas to complete the tense (hence how verb tense is a toolbox). The whole thing together is called a predicate or verb phrase (although the latter can also encompass adverbs and other modifiers).

The usual forms of the verb itself are the infinitive (eg. [to] eat), an uninflected form of the verb used in a range of constructions (sometimes inflected for person and number, eg. he eats); the preterite (eg. ate), denoting a completed act; the participle (eg. eating¶), denoting a continuous act; and the past participle (eg. eaten§), a sort of metapreterite denoting something already completed before something else.

Auxiliary verbs include the perfect auxiliary (ie. have) and the copula (ie. be§), along with the modal verbs (will, shall, can, would, should, could and must, on which more below), although other verbs can be – and frequently are – pressed into service. Note that, in more complex tenses, one of the auxiliaries is inflected into the preterite or participle, rather than the primary verb.

Got all that? So these are the tools, the parts-of-verbs we use to build tenses. Let’s start using them.

Using the Toolbox

So the sixteen standard English tenses are grouped into four categories (confusingly also called tenses) – past, present, future and conditional – describing the frame of reference for the whole clause (that is, not every past tense is necessarily in the past and not every present tense is necessarily in the present, but those verbs are built around a point of reference in the past or present, respectively; think of it as a sort of flagpole you drive into the timeline before starting work).

Past, present and future tenses obviously mean the clause is anchored in the “before now,” “now” and “after now,” relative to the speaker, while conditional tenses describe something anchored in the “never” (or “maybe”); something that is possible (could), desirable (would), or expected (should), but uncertain.

Within each tense are four aspects – simple, progressive, perfect, and perfect progressive – describing how the verb extends over time, relative to the clause. A simple verb happens all at once, at the flagpole; a progressive†† verb started before the flagpole and continues past it; a perfect verb is done and dusted somewhere before the flagpole; and a perfect progressive verb was ongoing for a while, but ended at or before the flagpole (or was interrupted by it).

Following me so far?

Putting it together is down to a sort of pic-n-mix of verb forms and auxiliaries (from the section above). The simple tenses, ironically, are the most variable:

  • Simple present verbs use the infinitive of the primary verb on its own, inflected for person and number if necessary.
  • Simple past verbs use the preterite of the primary verb on its own.
  • Simple future verbs use the infinitive with the modal can/will/shall,‡‡ inflecting the modal per the simple present tense.
  • Simple conditional verbs use the infinitive with the modal could/should/would.

Then the other tenses follow a standard pattern:

  • Progressive verbs use the participle of the primary verb with the copula be, inflecting the copula for tense per simple verbs above.
  • Perfect verbs use the past participle of the primary verb after the auxiliary have, inflecting the auxiliary for tense per simple verbs.
  • Perfect progressive verbs use the participle of the primary verb after the auxiliary have and the past participle of the copula be (ie. been), inflecting the auxiliary have for tense per simple verbs (which, altogether, means that the future perfect progressive opens with will have been, for a total of three auxiliary verbs).
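Since this really is a pic-n-mix, it can be written out mechanically. Here’s a little sketch of my own that builds all sixteen tenses from a verb’s four forms plus the auxiliaries; the form dictionary and function names are mine, and everything is first-person singular so no extra inflection is needed.

```python
# Sketch of the tense "toolbox": build any of the sixteen standard tenses
# from a verb's four forms plus the auxiliaries. First-person singular only.

EAT = {"infinitive": "eat", "preterite": "ate",
       "participle": "eating", "past_participle": "eaten"}

# The modal (if any) that opens each tense; past/present inflect the verb itself.
TENSE_HEAD = {"present": "", "past": "", "future": "will ", "conditional": "would "}

def conjugate(verb, tense, aspect):
    head = TENSE_HEAD[tense]
    past = tense == "past"
    if aspect == "simple":
        return head + (verb["preterite"] if past else verb["infinitive"])
    if aspect == "progressive":             # copula BE + participle
        be = "was" if past else ("be" if head else "am")
        return f"{head}{be} {verb['participle']}"
    if aspect == "perfect":                 # auxiliary HAVE + past participle
        have = "had" if past else "have"
        return f"{head}{have} {verb['past_participle']}"
    if aspect == "perfect progressive":     # HAVE + been + participle
        have = "had" if past else "have"
        return f"{head}{have} been {verb['participle']}"

print(conjugate(EAT, "present", "perfect"))              # have eaten
print(conjugate(EAT, "future", "perfect progressive"))   # will have been eating
print(conjugate(EAT, "conditional", "progressive"))      # would be eating
```

Notice that only the first word in the chain ever changes with tense – exactly the “inflect the first auxiliary, not the primary verb” rule described above.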

Need a moment? There’s no easy way to make this less dry, but if you want to take a moment to grab a cup of tea and read it again, I’ll be right here (or on Twitter if you want to prod me).

Putting it Together

Okay, this is probably the best way to absorb all this; let’s show them in use. (Cut out and keep, etc.) Note particularly that this isn’t really a list of sixteen, but four matching lists of four. Think of it as a grid (actually I might make up a grid and post it up):

  • A simple present verb (eg. I eat) happens all at once, now.
  • A present progressive verb (eg. I am eating) is ongoing now.
  • A present perfect verb (eg. I have eaten) has just finished now.
  • A present perfect progressive verb (eg. I have been eating) was ongoing, but has stopped now.
  • A simple past verb (eg. I ate) happened all at once, at some point before now.
  • A past progressive verb (eg. I was eating) was ongoing, at some point before now.
  • A past perfect verb (eg. I had eaten) had finished before some point before now.
  • A past perfect progressive verb (eg. I had been eating) had been ongoing, but stopped before some point before now.
  • A simple future verb (eg. I will eat) is going to happen all at once, at some point after now.
  • A future progressive verb (eg. I will be eating) is going to be ongoing, at some point after now.
  • A future perfect verb (eg. I will have eaten) is going to have finished, before some point after now.
  • A future perfect progressive verb (eg. I will have been eating) is going to have been ongoing, but stopped, before some point after now.
  • A simple conditional verb (eg. I would eat) would happen right now, if not for some other factor.
  • A conditional progressive verb (eg. I would be eating) would be ongoing, if not for some other factor.
  • A conditional perfect verb (eg. I would have eaten) would have finished before now, if not for some other factor.
  • A conditional perfect progressive verb (eg. I would have been eating) would have been ongoing at some point before now, if not for some other factor.

Note that, while there formally isn’t such a thing as a conditional past or conditional future per se, the perfect forms convey past reference and the simple and progressive forms present reference anyway.

The Odd Stuff

Well, that’s us completely sorted for English tenses with absolutely nothing else that needs looking at, because of course English is a neat, logical language that never has confusing inconsistencies.

No? [Sigh.] It was worth a try.

Must

So there are all sorts of oddities. First off, there’s the peculiar must. The Old English mōtan was a verb meaning roughly “to be allowed” or “to be compelled,” which a long, long time ago had withered away to this one usage (today it’s charmingly called a “defective” verb) as a modal verb similar to but much stronger than will and shall. It’s uninflected – it’s always must, whether in the future, present or (rarely) past – and serves as the sort of opposite of a conditional.

Anterior Future

Next there are anterior tenses – tenses relative not to the speaker’s absolute “now,” but the relative “now” of a past-tense narration. Since most prose is written in the past tense, these are extremely useful for writers, and some languages, like French, have a full set. English mostly doesn’t use them at all, but we do have one – the anterior future tense – for when a past-tense narrator talks about an event in the story’s relative future, but in the absolute past. It’s formed like the conditional simple tense (eg. “he would eventually pay me back with a delicious cake”), although it may even predate it, since would was originally the past tense of will.

Historical Present

Past-tense narration also gives us the historical present tense, in which a past-tense event is described in the present tense, as in, “I’m at Uncle Bernie’s funeral yesterday when a man comes up to me and says, ‘And how did you know Bernard?’” Although not a formal tense in Standard English (by which I mean, your primary school English teacher would have told you it was “wrong”), it’s widely and deliberately used, in historical accounts, newspaper headlines and prose. Some novels (like Margaret Atwood’s The Handmaid’s Tale) are written entirely in the historical present, and some switch between past simple and historical present to create immediacy.

Historical Past?

A similar device is used when a past-tense narrative recounts an event further in the past for a long passage; formal English demands the past perfect tense be used, but this quickly becomes a repetitive, ungainly tangle of hads, had beens and had hads, so it’s common to stick to the past perfect for a paragraph and then slip back into the past simple having established the new frame of reference. This isn’t, as far as I know, formally called the historical past tense, but fuck it, that’s what I’m calling it.

Going-to Future

One of the most interesting irregular tenses is what I swear to God is actually, formally called the Going-to future tense,¶¶ where go to (in the progressive form be going to) is used as a sort of auxiliary verb similar to will or shall (eg. “I [will/am going to] eat”).

Originally, this was an example of a straightforward present progressive (ie. “I am going”) with what’s called a to-infinitive phrase, where a full infinitive (with the “to” still attached) is used as an object or modifier (ie. “to eat”). It was presumably meant in the present tense and described a motion towards something; ie. “Look, I’m going, I’m going to the kitchen right now to eat, stop hassling me, Mom.” But at some point we subtly migrated the to from the beginning of the infinitive phrase to the end of the primary verb go, and go to became an alternative future auxiliary, with similar phrases planning to, fixing to, trying to etc. starting to follow the same rule. This has further evolved in contemporary African-American Vernacular English, with the contraction I’ma/Imma for “I’m going to” (and finna for “I’m fixing to”), and is no doubt going to evolve more in years to come.

The Odder Stuff

Is more irregularity heading our way? Who knows? I’ve spoken before about the innovations coming out of social media; how about “that time when,” prefacing a clause in the past simple tense, ironically describing a current or very recent event as though it were in the remote past? Is this a new tense? Could it become the basis of a new tense, in years to come? Who knows? It’s an amazing time to be a grammarian…

The Really Odd Stuff

Okay, I couldn’t really let this column go without mentioning time travel tenses. Why does time travel need its own tenses? Well, it doesn’t, necessarily; most time travel narratives stick to the personal chronology of the narrator (ie. if they’ve seen or done it, it’s in the past tense, and if they haven’t seen or done it yet, it’s in the future tense, regardless of when the events occur in absolute time). But there are questions to be asked: not just how to tackle an event that’s in the narrator’s subjective past but the absolute future, but how to describe an event that’s in both the narrator’s past and future? Or an event that would be in the narrator’s subjective future, but is now never going to happen thanks to events in the narrator’s subjective past?

Well, it seems the nerds are on it. Douglas Adams’ The Restaurant at the End of the Universe tackles the idea briefly (tongue very firmly in cheek) in the Guide’s write-up of Milliways, for instance. The Technobabble Wiki has proposed this (very) slightly more serious write-up of possible rules. In the unlikely event you have to deal with this issue, you’ll most likely find you get on fine with subjective frames of reference and a certain degree of frantic hand-waving, but if you are interested, this is exactly where to look if you need to distinguish your willn’t from your cain’t.

Cheers all!

David

As always, if you want to argue with me, or to chat about this shit, or to propose a topic for a future blog, let me know! Tweet us; Facebook us; let’s have an argument/chat.

*Who am I kidding; I just fuck about on this blog anyway.

†Seriously, this fucking language…

‡”Do you know what I hate about this language, Mr. Anderson…?”

**But not in every language; in Mandarin, for instance, the verb is always uninflected, and tense construction depends entirely on context, sometimes aided with particles like le 了 and guò 過.

¶But here be monsters: the -ing ending can also denote a gerund or verbal adjective – a verb inflected to serve as a noun or adjective, as in “a running club” or “an interesting book.” Be aware of these.

§But here be further monsters: the past participle form and the auxiliary verb be can also be used in the passive voice (eg. “I ate the apple pie” and “The apple pie was eaten” are in different voices, but are both in the simple past tense). Be aware of these too!

††Also known as continuous in English, although other languages have separate progressive and continuous cases, indicating ongoing actions and ongoing conditions respectively.

‡‡Can as a future tense is a bit of a weird one; “I can eat,” on first examination, seems to be referring to the present (ie. I’m ready to eat right now). But it always refers to a potential future act; I may be about to eat imminently, but I’m not actually eating now. Consider the simple future “I can be ready at eight.”

And I guess for completeness’ sake I should point out that formal Standard English grammar, as laid out by our old chums the eighteenth-century grammarians, required you to use will in the second or third person and shall in the first person (ie. “I shall” vs “he will”), which is very boring. What’s more interesting is where this rule comes from (for once it’s not just an invention of a random school teacher!). The Old English willan meant “to wish, or want,” while sculan meant “to be obliged”; the rule reflected the assumption that predictions about your own actions are presumably more reliable than predictions about other people’s!

¶¶Like, seriously? They couldn’t think of a fancy Latin name for it? Even a bit Latin? Latinish?

Posted on

Word Nerd: The Mathematics of Grammar

Well I ain’t never, I ain’t never
Seen nobody like you no, no, no;
Never have I ever seen nobody like you.
“I Ain’t Never,” Mel Tillis

Hola wordies! It’s my birthday today!

So let’s talk about an old saw in the world of pedantry: the double negative.

It’s an easy target, really: “I never done nothing!” says an English speaker. “Ho, ho,” says the pedant, sipping his brandy.* “In that case, you must have done something! Because, you see, if you’ve never done nothing, then you’ve always done something.”

And in a just and fair world, the pedant would immediately step on a rake and receive a concussion from the handle, but sometimes God just isn’t on our side.

So let’s look at this. Is it a thing? Can you not use two negatives for emphasis? Do they always cancel each other out? Is this the ancient and inviolable truth of English?†

Negative Concord

Ha-ha, of course not! Old and Middle English were perfectly happy using double negatives. In Chaucer’s The Canterbury Tales, describing the remarkable gentilesse of the Knight, the poet says he “nevere yet no vileynye ne sayde in all his lyf unto no maner wight” (that is, “he didn’t never say nothing malicious about no manner of person in his whole life”), a whopping quadruple negative. In Shakespeare’s Twelfth Night, Viola (masquerading as Cesario) swears she has “one heart, one bosom, and one truth, and that no woman has, nor never none shall mistress be of it, save I alone.”‡

It isn’t limited to sentence structure, either! As I mentioned before, a delightful quirk of English word construction adds the negative prefix dis- (as in disconnect or disarray) to words that already have a negative sense, with much the same effect; thus, both annul and disannul mean “to cancel or reverse,” sever and dissever mean “to divide or cut off,” and embowel and disembowel mean “to eviscerate.” It’s a double negative in one word!

In general, this device is called negative concord, and is a rule in any number of languages around the world, including Portuguese, Russian, Italian and, for most of its history, English. And it’s a great device! It’s definite, emphatic: I will NOT – NEVER – do NOTHING like that. Bam! No-one’s coming away from that with any sort of ambiguity.

So what happened? Why did we change?

Enter the Grammarians

Would it surprise you to learn it was the eighteenth-century grammarians?

The earliest mention we can find of the new rule is by a schoolmaster and grammarian called James Greenwood, who in An Essay towards a practical English Grammar, Describing the Genius and Nature of the English Tongue (1711) insisted that “Two Negatives, or two Adverbs of Denying do in English affirm.” He was echoed by noted and prolific pain-in-the-ass Bishop Robert Lowth in 1763, who fairly dramatically suggested that two negatives “destroy one another.” Many of their peers took up the cry, and by the modern age it had become an agreed rule of formal English.

Why? Well, Latin, as usual, which does not allow negative concord (nor does German, for some reason). It may also have appealed to their sense of logic; as the middle class was becoming more literate (driving, among other things, the boom in English grammars), they were also becoming more educated, and would have been familiar with mathematical concepts such as the rule that the product of two negatives is positive.** Instead, they proposed using positive qualifiers – such as ever, anyone, at all, and the like – to emphasise a negative term, much as multiplying a small negative by a large positive generates a large negative.
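The two readings can be set side by side as a toy model – purely my own illustration of the arithmetic analogy, not a real semantic theory. Under the grammarians’ “logical” rule each negative flips the sign, so an even count cancels out; under negative concord, any negative makes the clause negative and extra ones just add emphasis.

```python
# Toy model of the two readings of stacked negatives. Under Lowth's "logical"
# rule, negatives multiply like signs; under negative concord, they pile up
# as emphasis. The word list is illustrative only.

NEGATIVES = {"not", "never", "nothing", "no", "nobody", "none"}

def logical_reading(words):
    """Grammarians' rule: an even count of negatives 'destroys' the negation."""
    flips = sum(1 for w in words if w in NEGATIVES)
    return "negative" if flips % 2 else "affirmative"

def concord_reading(words):
    """Negative concord: any negative makes the clause negative; more = emphasis."""
    count = sum(1 for w in words if w in NEGATIVES)
    return ("affirmative" if count == 0
            else "negative" + " (emphatic)" * (count > 1))

clause = "I never done nothing".lower().split()
print(logical_reading(clause))  # affirmative -- the pedant's reading
print(concord_reading(clause))  # negative (emphatic) -- the speaker's meaning
```

Chaucer’s quadruple negative comes out “affirmative” under the pedant’s parity rule and emphatically negative under concord, which is rather the whole argument in two functions.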

And, of course, it’s not all bad. If, as Lowth insisted, two negatives “are equivalent to an affirmative,” then they allow a sort of nuance. Aside from standard and emphatic negatives and affirmatives, we have the cancelled negative, or litotes: consider the self-effacing “I’m not bad,” for instance, or “he’s not unattractive.” It’s a technique that can rest comfortably alongside negative concord without confusion; even in the Old English Beowulf, we read that “Né húru Hildeburh herian þorfte eotena tréowe” (that is, “Truly, Hildeburh did not have need to praise the good faith of the Jutes”), since, it seems, the Jutes had slaughtered her son and brother. Even using Greenwood’s and Lowth’s rules, we can play with language, using double negatives for understatement or irony – two of the English language’s favourite games.

The Agreeing No

But is this the end of it? Not at all! English is finding new ways to combine negatives and positives all the time, and a lot of them don’t even have fancy Latinate names yet.

Like the controversial “agreeing no,” a construction that seems to have turned up in the last fifteen years or so, possibly in Australia. Using a negative to agree with a negative (eg. “There’s no need to kill him,” “No, there isn’t, but I might anyway”) is fairly standard, but for the past few years, it’s become more common to use a negative to agree with an affirmative.

Mixed and confusing though it may seem, something like “Yeah, no, you’re right” or “No, I agree” or “Yes, no, absolutely” serves as definite, even emphatic agreement. The role of the negative here seems to be to both raise and dismiss an imagined counterpoint; consider the following exchange:

“The growing diversity in SF and fantasy is great for the genre.”
“No, you’re absolutely right.”

The second speaker is saying something like, “Quite! Some may complain about tokenism or ‘reverse discrimination,’ but no, you’re absolutely right,” but simply excising the counterpoint altogether. The “no” is acknowledging its existence but dismissing it from the conversation at the same time.

(On the other hand, there’s also a sarcastic “Yeah… no,” given in answer to a request or proposal, which seems to mean, “Yeah, this is me pretending to think about this, but no. Don’t be silly.” Which is really remarkably satisfying.)

Double Positives?

Are we done yet? Maybe not. You know that joke about the English teacher who says, “Two negatives make a positive, but two positives never make a negative,” and the kid in the back row says, “Yeah, right”?

Sure, it’s a joke, and the sarcastic irony in the popular “yeah, right” form is obvious – doubling the positive is apparently intended to emphasise the irony rather than reverse the sense – but this is an enduring construction (remember “shyeah, right” from 1992’s Wayne’s World?), with no small amount of variation, including okay, yeah and yeah, sure and so on.

Has the ironic double-positive become a new grammatical rule? Possibly. Certainly it seems necessary to double the positive; you don’t get quite the same effect from a single positive, however sarcastically you say it. I expect grammarians of this century to identify and label the ironic double positive and the agreeing no, and to net PhDs and careers out of stuff like this. In the meantime, we can just enjoy playing with the rules as we go along, and not trouble ourselves too much with the mathematics of grammar.

Cheers,

David

As always, if you want to argue with me, or to chat about this shit, or to propose a topic for a future blog, let me know! Tweet us; Facebook us; let’s have an argument/chat.

*Pedants always have smug brandy to hand.

†I mean, obviously not, but let’s have a hint of suspense, here.

‡Leaving out the double negative, there’s a neat little play with “no mistress… save I,” where she’s basically hinting at the disguise. Get it? Huh? Applaud, dammit! THIS IS SHAKESPEARE BEING CLEVER.

**Although as Merriam-Webster pointed out in 1994, the sum of two negatives is a larger negative. Apparently eighteenth-century grammarians either thought grammar should act more like multiplication than addition, or just didn’t notice that Lowth pulled a bit of a fast one, there.


Word Nerd: The Definition of Marriage

“Marriage is defined as the union of one man and one woman.”

Good morning nerds!

So here’s another slightly political one. With Australia’s incredibly tight federal election results still coming through and the issue of marriage equality very much on the country’s minds, and after UK Prime Ministerial candidate Andrea Leadsom’s rather confusing defence of her anti-marriage equality stance, I thought it would be a good time to tackle one of the strangest and most persistent arguments offered by opponents of reform (and one that is entirely pertinent to this blog): that the definition of the word itself somehow forbids it.

The Ol’ Definition-of-Marriage Gambit

On the face of it, it seems like circular reasoning: that you can’t change the law to allow same-sex couples to marry, because the law currently says same-sex couples can’t marry.* What’s this obsession over the “definition” of marriage? Aren’t you just telling us what the law says? If that’s really the case, just change it!

Ah, but there’s the rub.

You see, this move comes at the end of a now decades-long public debate. Once, people spoke of sin, of morality, of standards, but those are private questions† and not a matter of law. They spoke of the purpose of marriage – of sex, childbirth, the family – but we allow the impotent and infertile to marry without question, so that doesn’t hold water either. They flailed at appeals to propriety, speculated at the longevity of same-sex relationships, warned of the social harms of “lifestyles,” spoke of examples to others; they claimed it was extending an unfair privilege to an unrepresentative minority.‡ Finally they were told, “Look: ultimately, it doesn’t affect you. Your own marriages are just as heterosexual as they’ve ever been. There’s no real reason you should have a say.”

“Ah,” comes the reply, “but it does affect us!” Because, you see, the traditional composition of a marriage – one man and one woman – isn’t just a limitation or constraint on the union, but part of what defines it; and altering that definition will necessarily, retroactively change the character of the institution others have already entered into, without their consent or approval.** The marriage they entered into, they argue, is an institution created to nurture families and build communities (even if not every marriage leads to a family), and to replace it with a mere protestation of love or a legal convenience is to take that away from them.

Of course, there are any number of ways a same-sex couple might become parents, and they can build communities as much as anyone else; marriage for love or to manage property has been the rule rather than the exception for much of history; and at any rate, why any given couple gets married doesn’t really have any bearing (the law just doesn’t work like that). But that’s the argument, in a nutshell: marriage has a clear legal definition that includes its composition, that definition supposedly has a moral and social purpose, and so changing the definition undermines the purpose, changing the institution to the harm of all those already married.

Or this argument, which, I guess, is that gay couples have an unfair weight advantage?

So is there any justification in that? Is the composition of a marriage truly part of its definition? Where did the oft-quoted phrase “marriage is defined as the union of one man and one woman” actually come from, and on what basis is it held to be absolute and binding?

Well, let’s see.

The Hunt for Biblical Precedent

So a lot of people seem to think it came from the Bible (or from God himself, although I’m taking that as meaning “it’s in the Bible” and not that they receive direct information from the Almighty¶). But the phrase doesn’t appear anywhere in the Good Book; in fact, there’s no obvious attempt to define the word at all. It offers plenty of rules for marriage – on the bride’s father’s authority; on dowries, bride prices and purification rites; on interracial marriage, or marriage between gentiles and Jews§; on whether Christians remain married in the afterlife – but never actually defines marriage.†† The Bible, when all’s said and done, is a scripture, not a dictionary, and it assumes you already know what these things are.

Because to the audience it was written for, marriage was ubiquitous, and ancient. Every culture in the world has a tradition of marriage. In every history, kings marry queens, and in every mythology, the gods themselves wed (and commit adultery!). Hammurabi’s famous law code – the oldest known, dating to around a thousand years before the oldest passages in the Bible – covers divorce law. We’ll never know how old marriage is, as an institution; hell, Palaeolithic “Venus figurines” are assumed to have played a role in a fertility rite, which (reaching slightly, here) potentially dates some form of marriage to the beginnings of humanity. It seems like nobody’s ever needed to define it, because it’s always existed.

That said, the Bible does a nice line in providing examples of approved marriages, including one man and one woman (various); one man, one woman and the woman’s slave (Abraham, Sarah and Hagar); one man, one woman and the woman’s sister (Jacob, Rachel and Leah); one man, seven hundred women and three hundred concubines (Solomon); one man and his dead brother’s widow (Levirate law); and, rather unfortunately, one rapist and his victim, or one soldier and his prisoner (laws in Deuteronomy). As Biblical scholar Robert R. Cargill wrote in the Des Moines Register in 2013, those who argue that “the Bible speaks plainly on one issue, especially something as complicated as marriage… haven’t taken the time to read all of it.”

Pictured: Biblical marriage.

But let’s move on. We’ve a phrase to hunt.

High-Fidelity, High-Definition

So if marriage isn’t defined in the Bible – if the Bible assumed, even three thousand years ago, that people already knew what the word meant – maybe it’s to language we need to turn? Specifically, to dictionaries? I mean, obviously, “You can’t change the law, it’s in the dictionary” isn’t a compelling argument (although with the growing trend towards “plain language” rulings, court justices are increasingly turning to dictionaries to interpret the law), but it might be a clue where the argument came from, at the very least; and may even uncover the phrase itself.

Here we seem to be making progress. The Oxford English Dictionary defines marriage as a “formal union of a man and a woman, typically as recognised by law, by which they become husband and wife” (although they’re due to change it to reflect current usage). Webster goes with “the state of being united to a person of the opposite sex as husband or wife in a consensual and contractual relationship recognized by law” (and they added same-sex unions in 2003). A great many other dictionaries, legal and common, do the same.

Still, are these useful definitions? Saying that marriage is how you become a spouse is more or less a tautology (helpfully, husband and wife are defined as “a married man/woman”). Saying that a marriage is a “union” is unhelpfully vague. What sort of union are we talking about? If pressed, most people would come up with something like “a formal agreement to be romantically and sexually exclusive,” also mentioning shared property and responsibility and an expectation of moral and emotional support. Not every marriage exactly meets these criteria, but they’re more or less the expectation, and even have some weight in law; so why do the dictionaries overlook them?

Well… it’s gonna be down to Samuel Johnson, really, isn’t it? His dictionary was the foundation on which other dictionaries were built; the Oxford English Dictionary cribs from him so often he gets his own notation, “J.” Many of the oddities of our language today begin with him. Johnson’s definition? “The act of uniting a man and woman for life,” which Webster actually copied word for word in his first edition. So I guess the question is, why didn’t Johnson mention intimacy, or romance?

You might get a hint from his definition of sex, which covers gender classification but not any other use. Or intercourse, which is missing altogether. Copulate is defined as “to link together,” and copulation, reluctantly, as “the congress of the sexes” (and congress as “a meeting”). Clearly, Johnson wasn’t going near it with a barge pole. “Uniting a man and woman for life” was about the strongest euphemism he was willing to use for what marriage actually was; he was pointing out that… “you know… a man and a woman…? ‘United’? Do I have to draw you a picture?”

Hey, he doesn’t have “euphemism” either! Why, that no-good –

In the end, the composition of a marriage alone can’t be a useful definition; “a contractual relationship between a man and a woman” perfectly describes co-owning a sandwich bar (assuming the co-owners are a man and a woman). The nature of the relationship is crucial to understanding. Eccentric as Nero certainly was, his two marriages to men didn’t pose his contemporaries any vocabulary difficulties. They might not have approved of it, but they certainly knew what he meant when he said he and his husband were married. And that was nearly two thousand years ago.

So, the traditionalists can have the dictionaries, but only, I suspect, because a rather prudish eighteenth-century academic wasn’t sure how to explain Mummy and Daddy’s “special hug.”

Legislating My Way Into Your Heart

So let’s turn to the law! After all, it’s a law we’re talking about changing.

On the face of it, we’re on promising ground. The United States’ Defense of Marriage Act very clearly stated that “the word ‘marriage’ means only a legal union between one man and one woman as husband and wife.” Australia’s own Marriage Act insists that “marriage means the union of a man and a woman to the exclusion of all others, voluntarily entered into for life.” Done and done, right? Ancient, irrefutable legal covenants.

Except that, of course, DOMA was passed in 1996, and the relevant passage in the Australian act was only added in 2004. Oddly, there were no laws banning same-sex marriage in either country, or indeed in much of the world, until fairly recently. There were no written laws defining marriage either, for that matter. Much like the Bible, legislation has generally concerned itself with how one is married – by whom, under what circumstances, what the ceremony entails – and keeping a thorough administrative trail. Lawmakers assumed people more or less knew what a marriage was without their help; or else trusted them to have and live by their own definitions without interference.

The first significant test case in the US was Baker v. Nelson, in 1971, when a gay couple applied for a marriage licence that the law did not explicitly deny them – and were refused. A suit and appeal followed, and the Minnesota Supreme Court ruled not that same-sex marriage was illegal – which it wasn’t – but that denying them the licence was not unconstitutional. So it wasn’t illegal for them to marry, but it also wasn’t illegal for people to say they couldn’t. Hardly conclusive.

At any rate, clearly, our search continues.

Just in Case (Law)

Because if it’s not in the Bible, and it’s not on the law books, and the dictionary definitions probably come from a rather prurient euphemism by the good Dr. Johnson (and in any case dictionaries aren’t laws), where does it come from? Like me, you’ve almost certainly attended a civil wedding where the registrar said, “the law defines a marriage as a union between one man and one woman” – and that was before the new laws were passed, to boot. So if there was no such law, why did they say it? God damn it, Dave, where does this phrase come from?

Turns out, the one place we haven’t looked yet: case law.

Case law, or precedent, is the body of former decisions that a justice may consult and use as a guideline when interpreting the law. In many ways, it’s the better half of the law, where the rough work of legislation is tested and refined, producing laws that are practical, robust, consistent and fair. And a good precedent can have enormous reach; justices have been known to look across the seas, where different countries have the same (or very similar) laws and clear, worthy precedents have been set.

And then sometimes judges get totally quoted out of context, accidentally creating an utterly arbitrary precedent that basically gets quoted in every English-language court in the world for a century and a half. Welcome to Hyde v. Hyde, 1866.

So, the backstory: John Hyde was an Englishman who in 1850, having converted to the Church of the Latter-Day Saints and been ordained as a minister, met a young Mormon woman called Lavinia Hawkins, went with her to Salt Lake City and was married there. He later left the community, started preaching against the Mormon faith and was understandably excommunicated, whereupon Lavinia Hyde was told she was free to marry again (and promptly did). On his return to England, Hyde was apparently concerned as to whether his marriage was still in place or not (presumably hoping himself to remarry), and so applied to divorce her on grounds of adultery. With me so far?

Lord Penzance’s ruling was utterly extraordinary. Hyde’s petition was denied, but not because he didn’t have sufficient cause. The judge insisted that Hyde had never been married in the eyes of the English court system. Since the Mormons at the time still practised polygamy, he reasoned, their marriages couldn’t be considered true marriages. You might point out – and Hyde’s lawyer certainly did – that the Hydes hadn’t had a plural marriage; but no, said Penzance, since some of the marriages the LDS conducted were group marriages, none of their marriages were legitimate. “I conceive that marriage,” he said, “as understood in Christendom, may for this purpose be defined as the voluntary union for life of one man and one woman, to the exclusion of all others,” and if a church didn’t recognise that definition, the marriages they conducted weren’t real. And at that moment he made legal history.

It’s an odd foundation for a global legal institution, to be sure. For starters, the statement was an obiter dictum – an informal comment rather than a formal ruling – and should never have been treated as precedential. Second, it was a fairly minor ruling by a normal court (the English Court of Probate and Divorce, a “superior” court, which could at the time be appealed at the Court of Exchequer Chamber); it’s unclear why it should hold particular weight in the UK, much less across the whole world. Third and most importantly, it clearly has nothing to do with same-sex marriage. “For this purpose,” he said, very specifically, before launching into a long speech lovingly describing the polygamous practices of savage Middle-Eastern cultures. Hell, earlier in the same hearing, he called marriage “the union of two people who promise to go through life alone with one another.” Why the hell couldn’t we have kept using that definition forever?

Would Penzance have approved of same-sex marriage? In earnest, I doubt it. But this ruling – an informal comment from a 150-year-old divorce proceeding, hinging on whether a marriage in a culture that permits polygamy is binding in a country that doesn’t – really has no bearing on it at all… ‡‡

…and is exactly where the “definition of marriage” argument comes from.

Peace out.

As always, if you want to argue with me, or to chat about this shit, or to propose a topic for a future blog, let me know! Tweet us; Facebook us; let’s have an argument/chat.

*And, well, obviously; that’s why we want to change it.

†And sort of shitty ones at that.

‡Seriously. I remember a dude explaining to me that really the marriage laws weren’t discriminatory, because a gay man had exactly the same right to marry a woman that a straight man had. I pointed out that, of course, after the law changed, he’d have exactly the same right to marry a man that a gay man would. He didn’t reply, which I thought was slightly disappointing.

**I suppose the obvious compromise – a one-time, no-questions-asked annulment ab initio for all heterosexual married couples who no longer wish to be implicated in the institution – isn’t quite what they have in mind.

¶I’m not ruling that out, mind. Bachmann seems like the type.

§Both forbidden, natch.

††This shouldn’t come as a surprise; it doesn’t define priest either, or temple, or jawbone of an ass. Hell, it only very broadly defines God.

‡‡To be honest, I don’t have that much of a problem with polygamy, either.


Word Nerd: Begging the Other Cheek (Eight Phrases That Didn’t Always Mean What They Mean Now)

What up, my nerdlings!

So here’s a bit of a fun one. Over the years, as I’m sure you’re aware, any number of phrases have been coined in literature – in various classic books and tales, in the work of Shakespeare, in the Bible – and entered into common usage in the English language. What you may be less aware of is how often those phrases are immediately, massively distorted by misuse.

To that end, eight phrases you know and use, which no longer quite mean what the original authors meant…

Begging the Question

Where It’s From: Originally Aristotle’s Prior Analytics, as τὸ ἐν ἀρχῇ αἰτεῖσθαι (“asking the original point”), which was translated into Latin as petitio principii (“asking/pleading the premise”) and thence into English, in the 16th century, as begging the question.

How It’s Used: Very often, to mean that a statement has provoked further questions, as in, “Ted got the beers in, which begs the question, where did he get money?”

What It Originally Meant: Formally, begging the question is repeating your initial premise as proof for itself, as in the phrase, “broccoli is good for you because it’s a health food” or “one should believe the Bible is the word of God because it says it is and, being the word of God, it’s infallible.” It’s properly an informal fallacy, a rhetorical or logical device which isn’t necessarily false but which doesn’t prove the point it’s intending to prove.*

At any rate, nothing to do with Ted’s beers, which aren’t (according to the older sense) so much begging questions as raising them.

Begging the question is proving what is not self-evident by means of itself… either because predicates which are identical belong to the same subject, or because the same predicate belongs to subjects which are identical.

Aristotle, Prior Analytics, II, xvi
trans. Hugh Tredennick

Foregone Conclusion

Where It’s From: Shakespeare’s Othello, as Iago tells his captain (lying through his teeth) that their lieutenant, Cassio, has been dreaming erotically of his (Othello’s) wife Desdemona.

How It’s Used: We usually use it, these days, to talk about something where the outcome is obvious, given current information. In a fight between a big guy and a little guy, we know the big guy’s probably gonna win; when you ask your mum for a second ice lolly, you know she’s gonna say no.† It means a safe bet, an unavoidable (or hard to avoid) future largely out of human hands.

What It Originally Meant: By conclusion, Shakespeare meant a “decision”; a foregone conclusion was a premeditated act. With Iago’s revelation, Othello has decided that Cassio’s and Desdemona’s (also fictional) infidelity was not a spur of the moment act, but a planned betrayal. Where we use the phrase to describe something that no-one can really choose or prevent, Shakespeare meant something that might not be inevitable or obvious at all, but was decided upon and deliberately carried out by those involved.

…or… “conclusion” was a euphemism for shagging, and Othello thought that Cassio’s dream meant he’d already got his end away with Desdemona prior to the dream. This is actually a point of contention among Shakespeare scholars! It’s definitely not being used that way now.

Othello: O monstrous, monstrous!
Iago: Nay, this was but his dream.
Othello: But this denoted a foregone conclusion.
Iago: ’Tis a shrewd doubt, though it be but a dream,
And this may help to thicken other proofs,
That do demonstrate thinly.

Othello, Act III, scene iii.

The Game is Afoot

Where It’s From: Henry IV Part I, Act I, scene iii, “Before the game is afoot thou still let’st slip.” Northumberland is being quite rude to his son, suggesting he’s already fluffed his part in their scheming.

More famously, though, it’s also one of Sherlock Holmes’s most-quoted lines, second only to “Elementary, my dear Watson.”‡

How It’s Used: Generally, to suggest a game – you know, with dice and a board, and cards or something. It means something interesting and challenging has begun in earnest. The BBC’s Sherlock modernises it to “The game is on!”

What It Originally Meant: It’s a hunting metaphor. “Game,” in this context, is an animal hunted for its meat (as in “game bird” or “game pie”). When the game is “afoot,” it’s on the run and the hunt has begun. Holmes most certainly used it in that sense – his “game” being Sir Eustace Brackenstall’s murderer, in The Adventure of the Abbey Grange – but hunting is less relevant to most of us than it used to be, and so nowadays we mostly assume he’s talking about chess or something. He liked chess, right? He must have done.

“Come, Watson, come!” he cried.
“The game is afoot. Not a word! Into your clothes and come!”

The Adventure of the Abbey Grange

The Left Hand Doesn’t Know What the Right Hand is Doing

Where It’s From: The Book of Matthew. It’s from the famous “Sermon on the Mount,” where Jesus laid down the law for the new covenant he was offering.

How It’s Used: To suggest organisational incompetence. “Look here, we’ve been ordered by Procurement to buy six new cars, and Finance are telling us to scrap four of the ones we’ve got! Tch. You know, the left hand don’t know what the right hand’s doing, I tell you.”

What It Originally Meant: It was an injunction! Jesus was preaching against performing kind or charitable deeds for public praise; he said, to show God you’re really nice, you should do your charity in secret. And not just in secret, but so secret that even you don’t know you’re doing it! You’re literally unconsciously giving money to the poor with one hand while concentrating on something completely different with the other.

This is, of course, ridiculous, but it’s probable that the Sermon on the Mount was meant sort of rhetorically (more on this below).

But when thou doest alms, let not thy left hand know what thy right hand doeth: That thine alms may be in secret: and thy Father which seeth in secret himself shall reward thee openly.

Matthew, 6:3-4
King James Version

The Milk of Human Kindness

Where It’s From: Macbeth. Lady Maccers – who is straight up one of the illest characters you can play – is cussing out her husband for being too full of it (milk).

How It’s Used: In a quite nice way. Milk is, of course, a tasty and wholesome drink, and specifically something our society associates with mothers and nurturing. So the milk of human kindness is a compassionate, life-giving quality we admire and respect in others.

What It Originally Meant: She wasn’t thinking of milk in terms of its nourishing qualities, so much as its pallor and mildness. To be milky is to be light, weak, tasteless; none of them qualities suggestive of strength and determination, and Lady M was crazy keen on strength and determination.

Shakespeare returns to the milk imagery in Henry IV, Part I, when Hotspur, reading a letter from an unnamed conspirator, dismisses him as a “dish of skim milk,” something presumably even weaker and more contemptible than regular milk. I dunno what Shakespeare’s got against milk. Milk’s quite nice.

Glamis thou art, and Cawdor, and shalt be
What thou art promis’d. Yet do I fear thy nature,
It is too full o’ th’ milk of human kindness
To catch the nearest way.

Macbeth, Act I, scene v.

Nature, Red in Tooth and Claw

Where It’s From: Tennyson’s In Memoriam A. H. H., a lengthy elegy for his friend Arthur Hallam, who’d died suddenly in 1833. It’s a meandering discourse on love, loss, and human nature.

How It’s Used: Usually to describe the violent and impersonal nature of Nature, whether in the form of animals or the elements. The implied contrast with humans, who are compassionate and rational, is generally intentional.

What It Originally Meant: It’s actually about humans! Tennyson was struggling with humanity’s tendency to selfishness, and the growing materialism that was doing such a good job of explaining the world he lived in. If we are all just sophisticated beasts, if we are all driven by the simple mechanisms of evolution, then where can we see proof that love – God’s ultimate law – does or should govern us?

Who trusted God was love indeed
And love Creation’s final law
Tho’ Nature, red in tooth and claw
With ravin, shriek’d against his creed

Tennyson, In Memoriam A. H. H.

Samaritan

Where It’s From: The Book of Luke. It’s one of the “Parables,” which were sort of moral riddles that Jesus used to tell his followers. He was crazy about riddles. Like, at his birthday, if you didn’t read the riddle in your cracker, he’d grab it and read it for you. Loved him some riddles.

How It’s Used: A Samaritan, “good” or otherwise, usually means someone who helps a stranger – especially one in dire need, who others are ignoring – with no expectation of reward or recognition. Aww. There’s even a suicide-prevention charity called The Samaritans – without the “Good,” which given the below makes me insanely suspicious of them.

What It Originally Meant: The Good Samaritan of the parable behaved in exactly that way, sure enough. But the point of the story was that Samaritans were famed for their selfishness and officiousness; “good Samaritan” was intended as a surprising contradiction in terms, like “ethical investment banker.”

Of course, the Samaritans are an ethnic and religious community that exists to this day, and I don’t believe they’re especially keen to be regarded as petty and grasping, but I guess it’s okay to be a bit racist when you’re quoting the Bible?

I dunno, man, it doesn’t seem like it should be.

But a certain Samaritan, as he journeyed, came where he was: and when he saw him, he had compassion on him.

Luke, 10:33
King James Version

Turn the Other Cheek

Where It’s From: The Book of Matthew again! Another one from the Sermon on the Mount. Basically Matthew’s all about the Sermon on the Mount.

How It’s Used: This is something your mum or teacher used to tell you if you were bullied or provoked at school. It suggests stoicism and self-discipline; just look away (i.e. “turn your cheek”) and ignore them.

What It Originally Meant: Jesus was going a bit further than just “don’t rise to their bait.” He’s telling you to actively participate in your own victimisation; if a man hits you on one cheek, he says, then present the other cheek so that he can hit that one too. Your mum is unlikely to tell you to do this. The Sermon is famously one of the most challenging parts of the New Testament, presenting such an extreme model of virtue – offering yourself up to violence, performing charity in your sleep, never even accidentally thinking about sex with the wrong person – that, as mentioned above, it’s usually seen as rhetorical rather than intended literally.

Ye have heard that it hath been said, An eye for an eye, and a tooth for a tooth: I say unto you, that ye resist not evil: but whosoever shall smite thee on thy right cheek, turn to him the other also. And if any man will sue thee at the law, and take away thy coat, let him have thy cloke also.

Matthew, 5:38-40
King James Version

As always, if you want to argue with me, or to chat about this shit, or to propose a topic for a future blog, let me know! Tweet us; Facebook us; let’s have an argument/chat.

*A fallacy that is invalid by its very form, regardless of content, is a formal fallacy. Fun!

†Hence why you ask your dad instead.

‡Which, in fact, Holmes never says.


Word Nerd: Gruntled, Wieldy and Pulchritudinous; On Onomatopoeic Theory

Hola nerds!

So a little while ago I was sent an email by one of the guys in our IT department, asking me a word question:*

backformation, working back from disgruntled, mostly used in client management (sometimes as regruntled) to describe the fruits of successfully resolving complaints.

But, as with wieldy, it’s likely you never use it.

And there’s a reason for that…

Words Don’t Sound Right

You see, there are loads of words in the English language – like, dozens at least – and in many cases several words for each concept (even synonym has a synonym!), and those words are in competition for the limited bandwidth on our gristly human vocal cords. When it comes down to it, which word do you reach for to describe a thing?

In practice, it seems like you reach for the word that sounds right, or at least sounds best for the purpose. The reason we don’t talk about things being wieldy is that it just doesn’t sound comfortable. It has that awkward rounded EE-y sound, and that tricky little L, and for a word that specifically describes something as easy and manageable, it just doesn’t seem fitting. Unwieldy’s a great word: there, the clumsiness of the word is a virtue; it just sounds awkward. But wieldy? Just doesn’t sound like what it’s describing. In much the same way, gruntled just sounds unhappy, no matter how you use it.

There are any number of words that behave like this; words that are still there in the English lexis, but which no-one really uses because frankly they’re not very fit for purpose. Like restive, a really quite sedate-sounding word for “restless” or “fidgety.” Or enervated, a remarkably buzzy word for “drained” or “exhausted.” Or pulchritude, which means “breathtaking beauty” and sounds rather like a skin condition.

In a few cases, words have even started to drift in meaning. We still use them, but we’ve just flipped their senses, creating contranyms; like peruse, which originally meant “to read closely” but these days more often refers to lightly skimming a text.

But I Hear Them All Moving Inside You

And the flip side is true as well! Words that fit the purpose well seem to succeed and flourish. In fact, you’ll often find – certainly within a given language – that certain sounds become associated with particular values, creating whole clusters of similar-sounding, loosely-related words. Note how unpleasant slug, slimy, slither, slick, slow, slush, and slurry all are, or how light and easy glint, glitter, glisten, glimmer, glamour, glad and glide are.‡ Strong somehow feels strong, with the harsh str- and the solid -ong; weak feels weak, with the soft w- and the attenuated -eak.

There’s even a theory that this is where all language comes from, known as the Onomatopoeic Theory of Language, or just the onomatopoeic theory. The idea is that our earliest ancestors tried to convey ideas to each other with onomatopoeic noises – hissing to say snake, for instance, or whistling to say wind – and language gradually built up around this core, with the most evocative or apposite sounds surviving, becoming stylised and eventually forming the first words.

So is there anything to this theory? Are these sound-values I’ve noticed universal truths, or am I projecting onto them because I happen to know what the words mean?

Well, that turns out to be a bit of a topic of debate.

Go, I’ll be Waiting When You Call

Why is that?

So Ferdinand de Saussure, the father of semiotics, insisted that the link between “signifiers” and “signifieds” – ie. between the sound a word makes (or the shape we draw on the page) and the sense we attach to it – was completely arbitrary; all that was required (or should be allowed, in the linguist’s mind) is that the signifier is unique to the signified and that everyone sharing the use of that sign agrees on its significance. There’s no “natural dogness” to the word dog, he said, and no “treeness” to the word tree.

And certainly that’s the principle that linguistics has largely stuck to since. Similar-sounding words are regarded as sheer coincidence, or proof of etymological relationship, rather than universal human things. The onomatopoeic theory’s largely been dismissed, often referred to (a little unkindly, I’d have said) as the “Bow-Wow Theory,” or “Ding-Dong Theory,” and we’re all structuralists these days.**

But… there seems to be something to the idea that certain sounds – or signs – are more fitting than others. In 2001, neuroscientists Vilayanur S. Ramachandran and Edward Hubbard ran the “Bouba/Kiki” experiment, modifying an older experiment by psychologist Wolfgang Köhler. Subjects were shown two shapes, one spiky and one bulbous, told the shapes’ names were Bouba and Kiki, and asked which name applied to which shape. Between 95% and 98% of respondents identified the spiky shape as Kiki and the rounded shape as Bouba, irrespective of their native language (the experiment was performed in English, French and Tamil). It seems like Kiki just sounds spiky, as Bouba sounds… blobby.

So… maybe we don’t throw out the Bow-Wow with the bathwater? I dunno. Not my place, I guess. Let the linguists argue about it.

And Whenever I Fall at Your Feet

Either way, the upshot for you and me is that the words we use are often the ones that generations of English speakers loved the most and used the most often, or felt fit best; the ones they repurposed to fit better (or just to keep beautiful words in the language, doing new jobs); they pruned the uglier, ill-fitting words, and moulded the best to serve them.

Ultimately, this is why I keep banging on about ignoring the prescriptivists. Because the language was created and shaped by people working on gut feeling, so trying to apply any sort of universal, arbitrary rules is never going to work. Beauty and utility are genuinely the best guides to the language, because beauty and utility are the criteria by which the language has been formed and cultivated, all these centuries…

As always, if you want to argue with me, or to chat about this shit, or to propose a topic for a future blog, let me know! Tweet us; Facebook us; let’s have an argument/chat.

*Yeah, that happens to me more often than you might think.

†Yeah, I got a double-negative post coming for you soon, I promise.

‡Beautifully, Anglo-Saxon poetry largely ignores rhyme schemes in favour of alliteration (where a string of words start with the same letter or letters), precisely to take advantage of this phenomenon, so that the Rood (in the Dream of the same name) was both on lyft lædan, leohte bewunden (“raised in the air, enveloped in light”) and forwunded mid wommum (“sorely wounded in iniquity”). Them Anglo-Saxons, they loved their alliteration.

**Except for Mama and Dada/Papa, which appear to be genuinely pan-human and which linguists absolutely love speculating about, which just goes to show, I guess.


Word Nerd: The only SPAG I want comes with bacon and cream

Hello all!

Today’s Word Nerd might be a bit… political.*

You may recall, a few weeks ago, a bit of a kerfuffle about the national curriculum’s new exclamation marks rule? In case you missed it, the government decided that children’s writing should be marked against eighteenth-century-grammarian rules. A child who already clearly loves reading could, at this stage, either a) become so ground down and upset by shit like this they lose interest and go into something dumb and easy like physics, or b) become so enraged at the stupidity of the rules they were made to learn that they become a writer. Maybe even an editor.

I know what I’m hoping for, but this fucking test is going to choke more people than it ever inspires.

As always, if you want to argue with me, or to chat about this shit, or to propose a topic for a future blog, let me know! Tweet us; Facebook us; let’s have an argument/chat.

*But still ALL NERD, BABY.

†You’ll note, following this link, that they’ve since relaxed the wording, reeeal quiet.

‡Sometimes questions, although not often, since writing both symbols (and no-one can agree on whether it’s !? or ?!) looks crowded and bless them, the interrobang crowd’s never going to win that fight.

**For those unfamiliar with it, William Strunk, Jr. and E. B. White’s The Elements of Style, first published in 1959 – sometimes known as the “wonderful little book” – is a very small, very short, remarkably thorough, unapologetically prescriptive and magnificently dated style manual. I urge you to buy a copy and never use it. If style books were family members, this one would be your slightly racist gran, whose stories you love and who you desperately hope doesn’t say anything out loud until the guests go home.

¶Which I’m sticking with, because while many reputable sources insist English has only three moods – indicative, imperative and interrogative (unless there’s four, including subjunctive, or three and interrogative doesn’t count) – the “exclamatory” certainly seems to fit the bill for a mood, and no-one seems to be able to convincingly classify it as anything other than a mood. I’m taking a goddamn stand.

§Ætstand! Fylst ond hlyst! Hrim gecierrþ mid min niwe frumsceaft! – from “Hrim, Hrim, Baba,” by Hwite Hrim.


Word Nerd: Comedy, Tragedy, History, Cat-Pictures

Hey yo!

So with Monstrous Little Voices coming out next Tuesday, I thought I’d cover some popular euphemisms for the toilet that have come and gone over the– haha, yeah, obviously you know what’s coming.

Let’s do some Shakespeare.

The 1700

So if you ask your typical denizen of our nation’s streets for one random fact about Shakespeare, odds are they’ll say he added a buttload of new words to the English language.* If they’re particularly on form, they’ll even be able to specify “more than 1700 words.”

Put them on the spot to actually name one of these, though, and they may come up with assassin,‡ or maybe zany. Or they’ll just shrug, like surely no-one expects you to know which words, right? Which is kind of a shame, because like there’s seventeen-goddamn-hundred of them, and surely it’s worth knowing some, if only for pub trivia conversations.

So while I was looking for a Shakespeare-themed column for Word Nerd, I sort of started thinking along these lines. Let’s cover some of his words!

Then I thought: let’s cover some of his internet words. Because you may not know it, but without Shakespeare’s contributions to the language four hundred years ago, some of the concepts defining the most transformative communications system the world has ever seen might have very different names.

The Bard of the Bandwidth

What do you mean, Dave?

Well, how about these bad boys?

Import

Thou mayst not coldly set our sovereign process, which imports at full, by letters congruing to that effect, the present death of Hamlet. – Hamlet, Act IV, sc. iii.

Perhaps more of a back-end concern, but importing and exporting files and data…

As always, if you want to argue with me, or to chat about this shit, or to propose a topic for a future blog, let me know! Tweet us; Facebook us; let’s have an argument/chat.

*Okay, if you ask an absolutely typical denizen, they’d probably say “he wrote plays or something.” Let’s say an average nerd.†

†Okay, I just canvassed the office and got “He left his wife his second-best bed,” “He lived over a brothel,” “He didn’t write any of his own plays,” “He probably died of cancer,” and “He couldn’t look up.” So let’s just say – look, fuck off, alright? This is my column.

‡He didn’t. Assassin entered English from French shortly before Shakespeare was born; what he did was coin the verb to assassinate. I told you this’d be good! Huh? Huh?

**It should be noted that he gave us marketable (As You Like It, Act I, sc ii.), and in doing so arguably spent almost all of the good will he earned from everything else he did in his life ever.

¶YEAH I DID.

§Although a few organisations along the jackboot-and-flaming-pyre lines have made a spirited attempt.


Word Nerd: Mary, Mary, Quite Contranym

Hello again!

*mad cheering from packed auditorium*

So you’re probably –

*more cheering and thrown underwear*

*stands there helpless, trying to speak, smiles ruefully, small bow*

*cheering eventually dies down*

So it’s been –

*renewed cheering and screaming*

Okay, so straight up, this column’s probably got like a hundred regular readers at most, so no packed screaming rooms or unsanitary underwear-hurling, but it is back after quite a long silence, and I’d like to thank you for your patience, and apologise, and it’s all just been a hugely mad time, and I am (hopefully) back, baby.

Public-relations-coordinator-extraordinaire Rob Power has plans for the blog, one of which includes Word Nerd being back as a monthly thing, so I’ll be keeping this column to a slightly less manic schedule than last year, but hopefully will be able to keep it up for longer.

At any rate, let’s start with something fun.

YOU KNOW Synonym Is An Antonym Of “Antonym,” RIGHT?

Okay, so you may remember my literally column last summer, covering probably my single biggest grudge (not the supposed misuse of the word literally, but everyone’s obsession with the supposed misuse of the word literally). You may even remember me saying that one of the dumbest things people say about this usage is that it “means the opposite of what it’s supposed to.” It doesn’t, of course, as I covered last year, but…

So what if it did?

Apparently you can’t have a word wandering around in the English language with simultaneous opposing meanings because it would cause “widespread confusion” and make it impossible to use the word unambiguously, but aside from the fact that I have totally never encountered that problem, there are dozens of English words that do exactly that.

SO DID YOU KNOW Poecilonym* Is A Synonym Of “Synonym”?

Presenting contranyms,† a special category of words – nouns, adjectives and verbs – that are their own antonyms, whether due to distortion, convergence, drift over time or just quirks of usage. In almost every instance, the intended sense is completely obvious when used (if it weren’t, you can imagine speakers would eventually reject a usage and it would drift into linguistic history); but in some cases you genuinely need a larger context.

There are languages you can imagine not tolerating this sort of thing – there are any number of languages, in fact, where it never happens – but English, mad poet of tongues (and tongue of mad poets) that it is, revels in them.

OR, HEY! Synecdoche Is A Synecdoche Of “Metonym”!

There are shitloads of these things, but a few of my favourites are below.

Verbing Nouns

The most common contranyms start out as nouns and then become verbs that mean either “to add [thing]” or “to remove [thing].” Whichever way the verb works in its first appearance, sooner or later someone gets the idea of using it the other way, and generally both senses stick.

Thus one can trim a tree with secateurs or tinsel, dust with icing sugar or a damp cloth, clip your dog’s claws or clip on his leash, and seed a lawn or a tomato. There are dozens of these damn things.

Latin Confusions

A few contranyms seem to date all the way back to their Greek and Latin roots, and largely occur because the original had a broader sense that we narrowed down in English – and it turned out it could be narrowed down in different ways.

Consult refers to asking someone for advice and being asked for it. The confusion seems to stem from the fact it originally referred to a collective act; that is, we come together to take counsel, and thus both the advisor and the advisee are consulting.

Sanction is an amazing contranym that refers to granting approval or withdrawing support. Its original sense was of a significant spiritual act (consider sanctify). It came to refer to a formal act, with divine and/or legal weight; a connotation that followed it into English and informs both its modern senses.

Apology, bizarrely, refers to a statement of regret and to a formal self-justification. Both senses rely on the Latin and Greek uses of apologia as “a speech in one’s (legal) defence,” which could certainly be either contrite or self-justifying as needed.

Drifting Parts of Speech

Many of the most interesting contranyms arise from grammatical variation; words start off being used entirely legitimately, but speaking habits change (or people just get lazy), and later generations misunderstand what they’re hearing, and new senses appear. Most examples of these go back centuries, but some are extremely modern…

Left refers both to something that has gone and to something that remains (eg. “there were ten of us at this party, but four of us left, and now there are six of us left”). This one comes from a drift in usage of the past participle of the verb to leave. If I take one apple and leave two, then two apples have been left (obviously); over the years, we’ve come to look at the bowl and say there “are two left,” and left drifted from its origins as a participle to something more like an adjective.

Fast, wonderfully, refers to moving very quickly and being absolutely secure (eg. “stuck fast” or “fasten this rope”). The sense of “firm” or “secure” is the older usage; it seems to have first been used as an adverb for emphasis (much as we now use hard in “work hard and play hard”) and then drifted into use as an adjective.

Peer refers to someone of an equal status to oneself, or to someone of very high status; slightly cheaty, but in context it can certainly be contradictory. Of course, the latter use (generally referring to members of the British House of Lords) is derived from the fact that peers presumably only consider other peers their peers.

Quantum refers to something that’s about as small as anything can get, or about as big as you can imagine. Quite a modern one, this one, down to distortion through figurative use. A quantum leap, in physics, is an instantaneous change between two states (where there is no intermediate state, ie. it is the smallest possible change), usually with an associated absorption or release of energy; as a figure of speech it suggests an abrupt, profound change, but in common usage – thanks in part to the word “leap” – it has also come to suggest a very big change.

Custom brilliantly means both “traditional and universal,” and “unique and specific.” Of course, the confusion disappears when you realise the latter sense comes from something being custom to oneself (eg. “I had the tailor cut the suit according to my custom rather than buy one off the peg”). But the original context has faded over time.

Converging Words

And of course some contranyms were never the same words to begin with! Like cleave, which used to be two Old English words, clēofan (to split or sever) and cleofian (to cling or adhere), whose spellings ended up coming together over the years. Any number of homonyms have formed this way, but in a few special cases they’ve found their antonyms and created inadvertent contranyms like this one.

As always, if you want to argue with me, or to chat about this shit, or to propose a topic for a future blog, let me know! Tweet us; Facebook us; let’s have an argument/chat.

*This is the most beautiful word in all languages.

†Also known as auto-antonyms, which while slightly clunkier, has the virtues of being slightly more fun to say and sounding like the title of an ’eighties TV show about a grammatically correct talking car.