Wednesday, August 18, 2010

The Politics Of Meaning And Usage In English

What logic regulates correct usage? Why are words and expressions that are perfectly acceptable in one era taboo in another? In other words, why do the meanings and usages of words mutate radically over generations?

These and many other questions were the subject of an interesting email exchange I had with a British editor recently. The Brit stumbled across a previous article I wrote titled “10 Most Irritating Errors in American English” and liked it very much. It stoked his British ego. But he also noticed an Americanism (read: a grammatical slip by the standards of British English) in the same write-up.

I wrote: “I have decided to dedicate this and next weeks’ columns to discuss common grammatical errors in American English.” He pointed out that it should be “dedicate… to discussing….” The verb “discuss,” he said, should take the “-ing” form. (Strictly speaking, that “-ing” form is a gerund, since the “to” in “dedicate to” is a preposition, not part of an infinitive.)

The practice of using “dedicate” with the uninflected, bare forms of verbs is peculiarly American, he pointed out to me. I agreed. But I told him that even in modern British English there is a gradual, osmotic, if for now imperceptible, shift in the direction of that horrible Americanism that irks him. He disagreed. “I don’t recall ever seeing this mistake from a British person,” he declared pompously.

I then sent him a link to the British National Corpus, where that usage (that is, where the verb that comes after “dedicate to” is not in the “-ing” form) has appeared a number of times in current British English. (The British National Corpus is a comprehensive compilation of a representative sample of contemporary written and spoken British English.)

I wrote: “Well, I found these examples of the use of ‘dedicated to’ without the ‘-ing’ form of verbs in the British National Corpus. Apparently, it’s not only Americans who alternate between the ‘-ing’ and uninflected forms of a verb after ‘dedicated.’” My British friend ate humble pie.
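Incidentally, the kind of check I ran against the corpus is easy to approximate in code. Here is a short Python sketch (a toy illustration with invented sample sentences, not the BNC’s actual search machinery) that scans text for “dedicate(d) … to” followed by a verb and labels each hit as the “-ing” form or the bare form:

```python
import re

# Toy sentences standing in for corpus hits; a real check would query
# the British National Corpus itself.
sentences = [
    "She has dedicated her life to helping the poor.",
    "The committee is dedicated to improve local schools.",
    "He dedicated the book to explaining his theory.",
    "Funds were dedicated to rebuild the church.",
]

# Crude pattern: "dedicat..." plus up to four intervening words, then
# "to" and the next word. A serious corpus study would use
# part-of-speech tagging to confirm that the next word is a verb.
pattern = re.compile(r"\bdedicat\w*\s+(?:\w+\s+){0,4}?to\s+(\w+)", re.IGNORECASE)

for sentence in sentences:
    match = pattern.search(sentence)
    if match:
        verb = match.group(1)
        form = "-ing form" if verb.lower().endswith("ing") else "bare form"
        print(f"{form}: {sentence}")
```

Run on the toy sentences above, this prints two “-ing form” hits and two “bare form” hits, which is precisely the alternation at issue.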

This prompted a lively email conversation about why there is often a disjunction between what experts prescribe as correct usage and what real, living people actually speak and write, and why grammarians later succumb to popular usages, which they then codify and hold up as inviolable standards, which are in turn violated again by a subsequent generation, ad nauseam.

Consider a few examples. “Meat” used to denote food in general (a sense still preserved in the age-old saying, “One man’s meat is another man’s poison”); “girl” used to mean any young person of either sex; “deer” initially referred to any animal, which is why Shakespeare wrote of “rats and mice and such small deer”; “silly” used to mean fortunate or happy. “Broadcast” used to refer to the act of scattering seeds in all directions, not to the dissemination of information through radio and TV; “holiday” is derived from “holy day,” but the word is now used for any day of freedom from work, however secular; “villain” used to mean a village peasant, but it now means only a wicked or evil person. “Aggressive” used to describe hostile and destructive behavior, but in modern business there is often a tone of approval when someone is called an “aggressive businessman” or when methods are described as “aggressive strategies”; “academic,” an otherwise respectable word, is now also used derisively to mean impractical or pedantic; “rhetoric,” a time-honored study and application of the art of persuasion, is now popularly used to mean loud, confused, and empty talk. And so on.

Similarly, the meanings of words can expand beyond their original senses. For instance, the word “alibi” initially meant only “elsewhere,” and was used strictly in legal defense to mean that someone was elsewhere when a crime was committed and therefore couldn’t be blameworthy. Today, the semantic boundaries of the word have been extended to mean “excuse” or “self-justification” of any kind. Grammarians objected to this semantic extension for a long time. Many have given up now.

It’s the same story with the word “alternative.” It originally meant “other of two,” which meant that it couldn’t correctly be used where the choices exceeded two. In time, however, people began to talk about “hundreds of alternatives.” Grammarians were outraged by this mutilation of the word. There can be “alternatives,” they protested, only where there are two choices. No one listened. They lost the battle.

They also lost the battle over the correct usage of the word “decimate.” It formerly meant “to kill one of every ten.” To the horror of grammatical purists, people extended the semantic boundaries of the word to mean “kill a large number,” to “wipe out,” to “eliminate.” So everyday users of English again decimated the grammatical purists in the battle over the usage of “decimate”!

Most of the fulmination against the above usage patterns derives from a desire to keep words faithful to their etymological origins. But that’s short-sighted. Many common English words have radically diverged from their origins; their contemporary meanings bear not the vaguest resemblance to their etymological roots.

For instance, the word “dilapidated” is derived from “lapis,” the Latin word for stone. It is now used to describe anything in a state of ruin or disrepair. “Alcohol” was originally an Arabic word for a powder that women used to darken and thicken their eyelashes; today it means intoxicating liquor. “Edify” comes from the Latin for “build” (a sense still present in the word “edifice”); today it means to improve through teaching and enlightenment. “Hysteria” is derived from the Greek word for womb; now it means a state of violent mental agitation or extreme emotion. In American English, “hysterical” is even becoming synonymous with “very funny.”

Usage patterns also mutate over time. For instance, “each other” used to be a reciprocal pronoun that referred only to two people, and was understood to be distinct from “one another,” which was reserved for three or more people. That distinction no longer exists. In modern English usage, the two phrases are used interchangeably.

It was also considered bad grammar to end sentences with prepositions. So instead of writing “I don’t remember the name of the drug he was addicted to,” grammarians of the previous generation would insist that the sentence should be rendered as, “I don’t remember the name of the drug to which he was addicted.”

This rule emerged from a conscious, if unimaginative, mimicry of the syntactical structure of Latin, the language of science and scholarship in Europe until the 17th century. But the “no-preposition-at-the-end-of-a-sentence” rule is counter-intuitive, even senseless, and antithetical to the natural rhythm of the English language. It’s no surprise that people had a hard time obeying it. Today most people end sentences with prepositions, and most grammarians don’t seem to be bothered by this any longer.

So meanings and usages are, for the most part, context-specific and historically contingent. If that is the case, why do people fuss over “bad” usage? I think changes grate on us because we judge usage against the grammatical rules established in our own minds, and those rules, of course, reflect the ones generally accepted in our environment.

However, it seems that the rules of grammar are changing less quickly in other parts of the English-speaking world than they are in the United States. As my British friend said, “From where I'm standing, people in the US seem to be playing a game of yo-yo to which we Brits have not been invited.”

Note, though, that Americans have done more to extend the semantic and communicative frontiers of the English language in recent times than the Brits have. The Brits should actually be grateful that Americans speak English. Without Americans, English would have receded from the world stage in the same manner that French and other once-powerful European languages have.

The strength of the English language derives from the material and symbolic power of its native speakers, particularly Americans, the flexibility of its grammatical rules, and the rich diversity of the sources of its vocabulary. Almost every language in the world has contributed to the vocabulary of the English language. (Next week, I will examine the contributions of African languages to the vocabulary of the English language.)

Well, the foregoing is essentially the story of the battle between “what ought to be” (i.e., the snooty prescriptions of professional grammarians) and “what is” (i.e., popular usage patterns among everyday folks) in meaning and language usage. But that’s a grotesque simplification. In reality, the “what ought to be” is more often than not aggregated and codified from the “what is.”

So usage rules proceed in dialectical triads: the “what ought to be” is first instituted as the norm, the thesis. The “what is” then emerges as an unorganized, unconscious antithesis. The resolution of this antagonism gives birth to a new set of rules, which constitutes a new thesis that grammarians preserve and hold up as the standard, until it is subverted in turn by a new antithesis, and on and on. Call it grammatical dialectics, if you like.
