
Translation is a funky thing. You can try to be literal and lose all the flavor, or you can try to find equivalent idiomatic expressions, or you can recast the whole thing, as if you were writing an original from a similar inspiration — your own words for a similar thought. 

And unless you are brought up bilingual so that you are completely comfortable in both languages, you will always be working from a disadvantage. You can work from crib notes, or take a literal translation and recast it. Many writers these days do something of the sort. Ezra Pound did not read Chinese, but that didn’t stop him from translating Chinese poetry. Scholars may quibble with the results (or laugh outright), but the versions Pound printed are good poetry, whether or not they are good translations. 

Would I rather read a poet’s regeneration or a scholar’s word-for-word? The answer is both. When it comes to poetry in languages I do not read, I’d rather have multiple versions to absorb and take in all the angles to arrive at something triangulated. 

There are languages I have some familiarity with and so, I can usually read Pablo Neruda straight from the trough. And in French or German, I have some dealings with the originals, although I do not speak the languages with anything like fluency. I can read a French newspaper, but cannot always make out the spoken version. (Luckily, when in France, I have learned you don’t really need the fineries of grammar. You can speak French pretty usefully even with no verbs at all. You go to the patisserie and when it is your turn, you just say, “Deux croissants, s’il vous plait,” and you get what you want. No one before you on line has used a verb, either.)

And so, I have come to translate some poetry for myself, from German, from French or Spanish (even an occasional Latin poem), and mostly in self-defense. 

I say “self-defense” because most of the translations I’ve been subjected to sound like musty old Victorian twaddle. The translators seem to love archaic word forms and odd word orders — as if written by Yoda they were. 

Such things offend my ear. 

It’s not that I want them to be prose, but the secret of poetry is in the metaphor and the clever turn of phrase, not in the conventional language of old poetry forms. Take the first two lines of Nietzsche’s Zarathustra’s Rundgesang. In German:

O Mensch! Gib acht! Was spricht die tiefe Mitternacht?

Which could be translated, word for word, as:

“O men! Give attention! What says the deep midnight?”

Traditional translations usually go something like:

“O Man! Take heed! What saith deep midnight’s voice indeed?”

or:

“O Man! Attend! What does deep midnight’s voice contend?”

And there is the problem with the original: “O Man!” is poetic cliché. It has to go. I suppose you could turn it into idiomatic English as “Hey, y’all, listen up,” but that would be a crime in a different direction. 

If I were to translate this bit, I would just leave off the unnecessary parts and rewrite it as: “It calls to us in the dark. It is deep midnight and the hour speaks:” This sets up a light/dark dichotomy that pays off later in the piece. 

Too many translations, especially of classic Greek or Latin literature, are written in this fusty, worn-out, poeticized and conventional twaddle. It’s amazing anyone waded through the Iliad in the 19th century. Homer’s actual style was immediate and direct. 

Imagine if Robert Frost had written: “Two paths in twain divided were; traverse we may but one.” Who would now bother with it? It is Circe turning men into pigs. 

In other words, I have no issue with completely recasting the originals to make modern, idiomatic sense in a language that I hope remains poetic but without the equipage of outworn convention. 

A stunning example of this approach is Ted Hughes’ Tales from Ovid, beautiful translations of several bits from The Metamorphoses. In Hughes’ style the stories move quickly and smartly and you turn the pages as in a best-seller. One only wishes Hughes had completed the whole thing, instead of mere sniglets. 

In this way, I have translated (or rewritten, if you hesitate to call it translation) a good bit of German lieder. So much of it is hyperventilated Romantic sludge, which spoke to an early-19th-century generation weaned on Young Werther, and undoubtedly expressed the genuine feelings of those who lived through it, but now seems unrealistic and kitschy. 

Yet, there are real things being said and expressed in the poetry of Müller, Hölderlin or Eichendorff. It comes through like a buzz saw in the music of Schubert or Schumann, where the music has an authenticity that the verse sometimes lacks. 

I have tackled whole swaths of lieder verse, including a translation of all of the Winterreise. I found I could be a bit more faithful near the beginning of the cycle, but the deeper in, the more I had to rethink the verse. 

Take the first song, Gute Nacht. The text takes care of itself. A simple translation of the first stanza would be:

But, 24 songs later, the text of Der Leiermann, about a hurdy-gurdy man, is too bland without the devastating music Schubert provides (one of the most desolate and despairing bits of music ever penned), and so I’ve written my variation on it, to stand without the music:

Just this week, I started another project, translating four of the texts that Gustav Mahler set. I have arranged them into a set that belongs together, in four “movements,” rather like a symphony, meant to be taken as a single whole. 

I am offering them here as my apology for the type of translation I most appreciate — at least when others better than I do it. 

The main benefit of doing such work (since I have no plans or hope ever to publish my translations — they are simply for the pleasure and knowledge I get from them) is that it forces me to pay attention to the poetry and to the words. 

We can read through poetry much as we may distractedly hum a favorite tune. But good poetry offers much more, and forcing yourself to go through it word by word can help you uncover much more. Translating forces concentration. 

And so, I read the German for its sound, parse individual words for their various meanings (for no word in any language has but one simple meaning), read various translations to compare how others have understood the words, reassemble them in my own English and then revise, over and over, until I get something that sounds good to me and — more importantly — makes sense. 

I have to admit that I generally like my own translations better than the ones packaged with the CD as the libretti or lyrics. But that is likely because they match my own particular esthetic — they are tailor made for my ear. Your ear may resonate to a different frequency. 

And so, the first “movement” of my Mahler word-symphony comes from the second of Mahler’s Songs of a Wayfarer, words originally written by the composer himself. The main melody of the song became the first theme of his Symphony No. 1. 

The second movement is Mahler’s own crib of Zarathustra’s Rundgesang, or “Zarathustra’s Midnight Song,” as the composer has it. All four of the texts I have translated focus on the twin but opposite facts that life is suffering but also it is joy. 

Third, there is a heartbreaking and rueful song by Friedrich Rückert, Ich bin der Welt abhanden gekommen, set by Mahler first for voice and piano, but later orchestrated as part of his Sieben Lieder aus letzter Zeit (“Seven Songs of Latter Days”). It is surely one of his greatest songs, and can hardly be heard or sung without feeling it was written directly with you in mind. 

Finally, there is Der Abschied (“The Farewell”), the final movement of Mahler’s Das Lied von der Erde (“The Song of the Earth”). In it, Mahler has pieced together two Chinese poems of dubious provenance (themselves translated or rewritten, or perhaps invented, in French and German), purportedly by the Tang Dynasty poets Meng Haoran and Wang Wei, with three lines added at the end, written by Mahler himself. Der Abschied is Mahler’s summa, and at 30 minutes, is as long as the previous five movements combined. And it ends with the quiet reiteration, over and over, in dying voice, “Ewig… ewig…” (“forever… forever…”), fading so that in performance you can never quite tell when it ends, the final “Ewig” as quiet as the silence that follows. 

In the end, I recommend to everyone that they attempt to translate a poem from a different language. Take a Baudelaire, for instance, or a Neruda (avoid Rilke like the plague, unless you wish to end in an asylum), and parse it through, word by word. Read it out loud in the original language to hear the music of it (yes, your French may not be as liquid as the original) and read various translations to see how differently the words are construed. Then arrange a version of your own.

In the end, you will have internalized the poetry and it will never again be a stranger to you. 

Do you enjoy the music of Luigi v. Beethoven? That’s how his name appeared on the scores of his symphonies when they were printed in Italy. In Paris, he was Louis; in England he was Lewis. 

I’m fascinated by the way names morph and squidge as they travel around the globe. In late Classical times, Ludwig was originally Chlodovech in Frankish, which then took two paths. In Latin, it was written as Clovis. Drop the “C,” remember that Latin had no separate letter “V” but wrote that sound as “U,” and you get Louis — and that’s how the Frankish king Clovis became the perpetual King Louis that hit 16 times before the final head was dropped into the basket. 

But the other path is German, where Chlodovech became Ludwig. In Medieval Latin that became Ludovico. Drop the “D” in the middle to get Luovico, turn the “C” to the softer “G,” and you get Luigi. And that is how our van Beethoven becomes all of the people who wrote the same symphony. 

The variants of Ludwig/Louis/Luigi are legion. Other languages favor different sounds and hammer the name into other shapes. And the name gets feminine versions, too. Nabokov’s Lolita is just another version of Beethoven’s name. 

Alphabetically, there are Alois, Aloysius, Lajos, Lew, Lodovico, Louie, Lucho, Luis, and the Portuguese Luiz. Women get Aloysia (Mozart’s first love was Aloysia Weber, but he had to settle for marrying her sister, Constanze); Eloise, Heloise, Lois, Lola, Lou (as in Mary Lou), Lu, Louise, Luisa and Lulu. Many of these names have other spelling variations. 

It is through many standard linguistic changes (the “D” and “T” switching back and forth, for instance, or “G” and “K” sounds) that these variants arise. Languages have their habits, and so, because Italian doesn’t like to end its words or names in consonants, Luigi has a vowel hanging on. Japanese is similar in that respect, and so Ludwig comes out pronounced something like “Aludowiga,” remembering that the “L” needs to be that weird undifferentiated liquid — somewhere between an “L” and an “R.” Perhaps closer to “Awudiwiga.” (The final “A” is really a schwa). 

Several Romance languages habitually put an “E” in front of an initial “S” (as in Spain and España), and so Steven becomes Esteban (the “B” and the “V” being practically the same letter, linguistically speaking). 

The real champion among male names, though, must be John. The variants are endless. You wonder how Ivan and Sean can be the same word. 

The original is the ancient Hebrew Iohannani, which derives from Yahweh (God) and Hanani, “gracious” — although I can’t say I find much gracious about Jehovah (a variant of Yahweh), who seems to like to smite whole populations in pique. In modern Arabic, that becomes Juhanna — as in Bob Dylan’s song, Visions of Johanna (the visions that form the hallucinatory and paranoid basis of the book of Revelation). 

(When Oscar Wilde wrote his scandalous play, Salome, he called John the Baptist Jokanaan, which is closer to the original than our “John.”)

When the Bible was translated into Greek, the name became Ioannis and in Latin, Iohannes. As the name travels east into Slavic lands, it morphs into Iovanness and eventually into the Russian Ivan. (Pronounced “ee-von” in Russian, “eye-vin” in English). 

Because John is a biblical name, it spread through many European cultures. When Latin broke down into the various Romance languages, John rode along with it. Latin Iohannes shortened to Ioan, then, in Spanish to Juan, in French to Jean and in old Breton into Yann. In old Irish, it became Iohain, which evolved several ways — into Ewan, into Ian, and into Iain. Through the influence of French, which had a zh sound in its “J,” Jean also became Sean, or later, Shawn. 

Taking a more Germanic route, the Latin Iohannes became Johannes in German, and Iohannes in Old English, shortened to Johan in Middle English and then lopped to John in Modern English. (Interestingly, the nickname Johnny joined Spanish as Choni, which came from the Canary Islands version of Spanish as a name for any Englishman — “He’s a choni” — and devolved into a word in Spain for a trashy girl, with “chonismo” for trashiness as a fashion choice.)

There’s a whole train of John variants: Evan, Giannis, Giovanni, Hans, Iban, Jan, Janos, João, Johann, Jovan, Juhani, Shane, Yahya, Yannis, Younan, Yonas. And for women: Hannah, Joan, Joanna, Joanne, Jeanne, Jane, Anna, Jo, Juana, Juanita, Sian — I could go on. 

Oddly, John and Jon are not closely related, but come from two different sources. David’s bosom buddy in the Old Testament was, in Hebrew, Yehonatan, from Yahweh (God) and Natan (“has given”), which, in English, is Jonathan. Jon for short, leaving Nathan for another name. 

Most names have these variants. Susan was originally the Hebrew Shoshanna, which also gives us Susanna. The name probably goes back to ancient Egyptian, where the consonants SSN form the hieroglyph for lotus flower. In modern Hungarian, the name is spelled, delightfully, as Zsuzsanna. 

Mary was the Hebrew name Miryam, which may also go back to Egypt, where mry-t-ymn meant “Beloved of Amun.” (Moses’s sister is Miriam, and both her name and his are Egyptian in origin). In the Greek of the New Testament, this becomes Maria, which becomes French Marie, which becomes English Mary. Long ride from the Nile to the Thames. 

The Bible is the source of many names. We’ve already seen John. Considering the peregrinations of that name over the globe and centuries, the other Gospel authors have been comparatively stable. Mark has been remarkably little changed over the eons, having been merely Marco and Marcus, although it gives women both Marcia and Marsha. Luke was originally Lucius in Latin, but has become Lucas, Luca, and for women, Lucy and Lucinda. 

Matthew has more variants, but mostly just spelling changes. Originally Matityahu in Hebrew, meaning “Gift of God,” it became the Mattathias of New Testament Greek and Latinized to Matthaeus, or Matthew in English. In other languages, it is Mateo, Matthieu, Mathis, Matias, Matha, Madis, and Matko. 

The apostle Paul — originally Paulos in Greek — gives us Pal, Paulinus, Bulus, Pavlo, Pau, Paulo, Pablo, Pol, Pavel, Paavo, Podhi, Paolino, Baoro, Pavlis, and the female names Paula, Pauline, Paulette, etc. 

Jesus made a bilingual pun on the name of Peter, calling him “The rock upon which I build my church.” Jesus spoke Aramaic. The Aramaic word for rock is “kefa.” The Greek word is “petra,” turned masculine to name Peter as Petros. Who knew Jesus was a punster? 

Petros has morphed nearly as much as John, becoming Peter, Pierre, Pedro, Pjetros, Piers, Pyotr, Per, Peder, Peep, Pekka, Bitrus, Pathrus, Pesi, Piero, Pietru, Pita, Bierril, Pelle, Pedrush, Piotrek, Padraig, Pero, Pethuru, and a hundred others. 

The influence of Christianity (and Islam to a lesser degree) has meant that variants of biblical (and Quranic) names show up all over the map. Some, like Methuselah, have found little purchase. Others, the Johns, Pauls, Marys, and Peters, are almost universal, but each showing up in the regional costume of its adopting language. 

And so, one name can spawn many children. Perhaps the most prolific name is Elizabeth. Originally the biblical Elisheva, meaning “My God is Abundance,” it became Elizabeth in the King James translation into English. Elizabeth was the wife of Aaron in the Old Testament and the mother of John the Baptist in the New. 

It comes in various spellings, from Elisabeth to Elisabeta to Lisabek. It morphs into Isabelle and Isabella and all the variants of that. These, and the shortened and nicknamed forms make a list several hundred entries long. 

Among the progeny of Elizabeth are: Ella, Ellie, Elsie, Elisa, Alzbieta, Elixabete, Elsbeth, Yelizaveta, Yilishabai (in Chinese), Isabeau, Sibeal, Lettie, Liesbeth, Lisbet, Zabel, Alisa, Elise, Lisette, Lysa, Elka, Lizzy, Liz, Ilsa, Lisa, Yza, Izzy, Lela, Lila, Lili, Liliana, Lisanne, Liselotte, Babette, Libby, Liddy, Bess, Bessie, Bossie, Beth, Betsy, Betty, Bette, Bitsy, Buffy, Zabeth, Bekta and Bettina. That’s about a smidgeon of those I found. 

Each of these names has a branch on a linguistic family tree, a DNA map of sorts. I’ve mentioned only a few names here. There are many more, some with fewer branches, some with whole piles. My own name, Richard, is fairly sparse, with its variants mostly being variant spellings: Rikard, Ricardo, Rigard. Even in Azerbaijani, it’s Riçard. Its origins are in Proto-Germanic “Rik” for ruler or king, and “hardu” which means strong or hardy. So we see how much the name has declined since then. 

So, don’t place too much faith in the etymology of your name, but seeing its family line can be fascinating. Just remember that John and Jon are completely different. 

I started to write about philosophy, but realized I really wanted to talk about pears. Crisp, delicious succulent pears, the kind with small brown spots on the skin and a roly-poly bottom. Given a choice between reading Hegel (insert dry cough here) and slicing wedges off a Bartlett pear, the fruit wins hands down every time. 

I have been thinking about this because of philosophy. The intellectual world seems divided irrevocably between art and philosophy — image and word. One side deals with categories of thought, the other side deals with hubcaps, clouds, tight shoes and the sound of twigs snapping underfoot, to say nothing of pastrami sandwiches and corduroy trousers. 

I’m sorry if I value the one vastly over the other. I am a Dichter not a Denker. I have — this is my ideological burden — a congenital mistrust of language, particularly abstract language and language of categories. The world is too multifarious, indeed, infinite, and language by nature and requirement, simplifies and schematizes, ultimately to the point that language and reality split paths and go in separate directions. When one relies too much on  language, one misses the reality. 

The tragedy is that language is all we have. We are stuck with it. We can try to write better, more clearly, use evocative metaphor when declarative words fail, use imagery rather than abstractions, and do our best — our absolute best — to avoid thinking categorically, and attempt to see freshly, with eye and mind unsullied by the words that have preceded us. It’s hard, but it is essential. To begin with the categories, and to attempt to wedge our experience into them, is to mangle and to mutilate the reality. 

The matter is only made worse by the impenetrable fustian written by so many philosophers — and especially the recent crop of Postmodern and Poststructuralist explainers. 

Take Hegel — please. Georg Wilhelm Friedrich Hegel (1770-1831) is just the kind of philosopher who thinks thoughtful thoughts and writes incomprehensible prose. 

“Knowledge of the Idea of the absolute ethical order depends entirely on the establishment of perfect adequacy between intuition and concept, because the Idea itself is nothing other than the identity of the two. But if this identity is to be actually known, it must be thought as a made adequacy.”

“A made adequacy?” That’s from his System of Ethical Life (1803-4). I’m sure if you spent an hour or two going over it again and again, you might be able to parse something out of it. But, jeez. It’s the kind of prose you get from academia all over the place: 

“As histories of excluded bodies, the bodies that made national Englishness possible, this counterpastoral challenged the politics of visibility that made the very modern English models of nature, society, and the individual visible through the invisibility of bodies that did not matter.”

That’s from Kathleen Biddick’s The Shock of Medievalism (1998). She is also the author of The Typological Imaginary: Circumcision, Technology, History.

In such writing, individual abstract words are made to stand in as shorthand for long complex ideas, not always adequately explained. And the words are then categories, and the categories allow blanket statements that cover the world like Sherwin-Williams paint. 

The basic problem is that words are always about words. When Plato talks about “the Good,” he is talking about how we define the word “good.” Plato is about language. Language has its own grammar, its own rules, its own logic, and these soon supersede what the philosophers call “the case.” 

There is a book out there now titled Why Fish Don’t Exist, by Lulu Miller. And taxonomists now largely agree that what we used to call the class of animals Pisces (fish), are really a bunch of increasingly unrelated classes or clades, in fact at least 12 of them, not counting subclasses. For example, a salmon is more closely related to a camel than to a hagfish. 

But, back in the 18th century, both whales and sea urchins were also classified as “fish.” That we distinguish them separately now has made no difference to either whales or urchins, but only to dictionaries. That a whale is not a fish but a mammal is a shift in language, not biology. Fish still swim in the sea, even if we hesitate to call them fish. 

And in the same way, the parsing of philosophers is mostly a shift of wordplay. The philosopher Ludwig Wittgenstein made this the central issue of his later work. 

Meanwhile, as the philosophers mince language in their mental blenders, Gloucester fishermen keep pulling fish out of the oceans. When it comes to trust, I take the fishermen over the philosophers. The world is filled with sensible, seeable, feelable, hearable things. Things that give us pleasure and make up the world we find ourselves cast into, like poached salmon. 

Our lives are filled with the things of this world and their shapes, colors, sounds, textures, smells and tastes. And so is our art, which makes images, poems, dances, music and theater from those shapes, colors, sounds, etc., and which is thereby a direct connection with the things of this world — the “case,” as it were. 

And I think of pears in art — those buttery layers of paint by Paul Cezanne — and the other still life art that singles out this bit or that of the physical presences of the world and shows them to us so we may notice them and appreciate them. 

Most of our art tends to be divided between people and things — “things” being mostly landscapes and still life. In our art, we privilege people over things and that is only fitting. I’m sure squirrels are most interested in other squirrels, too. 

But the non-human and non-living things are so much a part of our lives, and a certain percentage of our art has been made about things. Like pears. 

I step outside into the sun and I hear distant traffic, the breeze hissing in the tree leaves, and, from several blocks away, the intermittent rattle of a chainsaw. In the morning, there are birds — mockingbirds and chickadees. There is the feel of the air and the sun on my skin. There is the smell of the grass, new mown, or maybe the oily resonance of diesel fumes. I stand and feel the temperature. I live in the welter of the world. 

And so, I am in love with the things of this world. I am mad for them to be in contact with me, to absorb them, to notice and appreciate them. To pay attention. To be alive. 

And I slice a pear. The insides are both pulpy and wet; the skin keeps the flesh from drying out. The stem at the top curves off. The nub at the bottom shows where the white flower had been. 

I take pears instead of apples here, because apples have too many words stuck to them, making them gummy with ideas, from Eve’s fruit of temptation to the computer on which I am writing these words. 

But a pear can be seen with less baggage. It bruises more easily than an apple, yet its pulp is firmer, stiffer, unless overripe, when it can go mushy. Nor is it as sweet as an apple, although we must point out that there are hundreds of different varieties of apple and that a red delicious is sweeter than a granny smith. (Yet the granny smith makes a better pie). 

There are varieties of pear, also, and they are perhaps more distinct than the apples. The lanky brown Bosc, the squat green Anjou, the nearly round Le Conte, the very sweet Seckel. In Japan, there is the ruddy, round Kosui, or russet apple pear. The Comice is great with ripe cheese. Yellow Huffcap for making perry — a cider made from pears. 

I believe the central fact of existence is variety, in infinite forms, which in contrast makes the categories of philosophers seem puerile and simplistic. And dry. Pears have juice. Derrida, none. 

These are smart people. I don’t begrudge them that. And perhaps we need people thinking such thoughts. But if we leave these words to the philosophers, I will have more time for myself with all the plants, rocks, fruit, animals, clouds, stars, cheeses and oceans. 

Ultimately, to experience things is more important — more rewarding — than explaining them. 

When it comes time to leave this planet and join oblivion, in those last moments left to my life, mostly, I will be thinking about the people I have loved and who have loved me. But beyond that, will I be thinking about Hegel or will I be remembering pears? My money is on the palpable. There is love there, too.


Is there anything left to say? After 5,000 years of putting it all down on clay, stone, parchment and paper, is there anything that hasn’t been said? It is something every writer faces when putting pen to paper, or fingertip to keyboard. Or even thumbs to smartphone.

And it is something I face, after having written more than four million words in my professional lifetime. Where will the new words come from?

It is also something newlyweds often fear: Will they have anything to say to each other after 20 years of marriage? Forty years? Surely they will have talked each other out.

What we write comes from a deep well, a well of experience and emotion and sometimes we have drawn so much water so quickly, it dries, but give it time and it will recharge. If no new experience enters our lives, our wells remain dry.

One friend has offered this: “That each generation thinks they know more than anybody else who has ever lived.  In a way, that’s a good thing because it allows for new ideas.”

But how new are those ideas? “I guess we have to live with a certain amount of repetition under that system,” she says. “Relying on what previous generations wrote would be so boring. Our ego demands that we pick and choose from past works if we heed them at all.”

I have a different interpretation. We never quite hit the target of what we mean; words are imprecise, concepts are misunderstood. One generation values family, the next understands family in a different way and builds its family from scratch with friends.

As T.S. Eliot says in East Coker:

Every time I put word to word, I come up short, leave things out, use phrases sure to be misinterpreted, have my motives doubted, and — as I learned many times from my readers — they read what they think I wrote and not always what I actually wrote.

And so, there is the possibility of endless clarification, endless rewriting, endless apologizing. And new words to be written.

As someone once said, all philosophy is but a footnote to Plato (who, by the way, is a footnote to the pre-Socratics), and all writing is an attempt to get right what was inartfully expressed in the past. It is a great churn.

All writing is an attempt to express the wordless. The words are never sufficient; we are all wider, broader, deeper, fuzzier, more puzzling and more contradictory than any words, sentences or paragraphs can encompass.

Heck, even the words are fuzzier. Consider “dog.” It seems simple enough, but includes Great Danes and chihuahuas, Scotties and Dobermans. As a family, it includes wolves and foxes. It also describes our feet when we’ve walked too much; the iron rack that holds up fire logs; the woman that male chauvinist pigs consider unattractive; a worthless and contemptible person. You can “put on the dog,” and show off; you can “dog it,” by being half-assed; you can call a bad movie a “dog;” at the ballpark, you can buy a couple of “dogs” with mustard; if you only partly speak a language, you are said to speak “dog French,” or “dog German;” past failures can “dog” you; if you are suspicious, you can “dog” his every move. “Dog” can be an anagram of “God.”

Imagine, then, how loose are the bounds of “good” or “bad,” or “conservative,” or when someone tries to tar a candidate as a “socialist.” Sometimes, a word loses meaning altogether. What, exactly do we mean when we talk of morality or memory, or nationality or the cosmos?

And so, every time we pick up pen to write, we are trying our hardest to scrape up a liquid into a bundle.

And so we rework those words, from Gilgamesh through James Joyce and into Toni Morrison. We rework them on the New York Times editorial page and in the high school history textbook. We rehash them even in such mundane things as our shopping lists or our Facebook entries.

We will never run out of things to write or say, because we have never yet gotten it quite right.

An earlier version of this essay originally appeared on the Spirit of the Senses webpage on  Aug. 2, 2020. 

In 1956, psychologist Benjamin Bloom published his Taxonomy of Educational Objectives, a hierarchical ranking of thought processes, commonly known as “Bloom’s Taxonomy.” It has been revised and recast many times, but most often, simple tasks such as memorizing sit at the bottom and creativity comes at the top. 

My late wife, who was at least as smart as Bloom, had her own version of this taxonomy, and for her, the lowest level was “naming.” She taught school for more than 30 years and saw brain-burn at the individual level. Being able to say, “Horsie” or “Duckie” is naming. This is simple rote. Learn the name and repeat it when appropriate. 

Naming also shades into the second level — the level most people get stuck in — that of sorting. Finding categories and shunting the names into silos to contain them. As if that explained anything. 

The greater part of what we do with our brains is to sort things out. To put cats over here and dogs over there. When we learn, most of what we mean by that is to understand that Claude Monet was an Impressionist and that Luis Buñuel was a Surrealist. These are mere sortings. Important for a file clerk, perhaps, but more a form of busy work than of actual thinking. 

We learn a whale is not a fish, and that a spider is not an insect. We have separate categories for them, and when we recognize the categories, we believe we have actually said something meaningful about our whale or spider, when really, all we have done is play with words. 

Categories are, after all, quite fugitive, quite fungible — squishy. When zoologists first tried to classify lions, for instance, they placed them in the genus “Felis,” for they are some kind of cat. But later, it was decided they were big cats, not small ones, and so they became “Panthera.” Oh, but that wasn’t good enough, and so a new genus was established, dividing them from tigers and leopards, making them “Leo.” New category, new silo. 

For a brief time, I worked at a zoo, and had the opportunity to walk behind the cages and get up close to many of the animals. I can tell you that, standing with the zookeeper two feet from a male lion at feeding time (separated from Leo by the cage bars), the lion’s head seemed to be the biggest thing I had ever seen, shaggy and furry, with a very particular smell, and a sense that this beast could swallow my head as if it were an M&M. And then it “purred”: a low, guttural roar expressing satisfaction at the afternoon meal that made the ground rumble under my feet. It was one of the most impressive things I have ever witnessed, and it mattered not a whit whether I was seeing a Felis or a Panthera or a Leo. The name was rather beside the point. The experience had a physical existence and it didn’t need a name. 

Language is not reality. And the experience — the feel of it in the palm of your hand, or in your nostrils, or under your feet — is worth all the words in the world. Words can be a barrier keeping us from what is real. 

And yet, we spend so much of our time arguing over these categories, as if they mean anything. As if they were a reality. Is Joe Biden a Socialist? Did Elon Musk actually reach outer space? Is a tomato a fruit or a vegetable? So much thought and energy to such meaningless ends. Think of all the dark money spent in political campaigns to paint the opposition into a category-corner that makes the opponent a one-dimensional boogeyman. The world and its things are infinite. 

My late wife took animals to class with her so her pupils would have actual experiences — the twitching nose of a bunny, the blank stare of a hen, the brittle carapace of a hermit crab — and then gave the kids paper and paints and let them express what they had experienced. If names were mentioned, they were the names the kids gave the animals — a rabbit named Tiffany Evelyn or a crab named Eloise. What mattered was the physical reality of the experience. Anything else is just language. Names. Categories. 

Historians like to take big chunks of time and give them names: Classical, Postclassical, Late Medieval, Romantic, and so on. Then they argue over it all, because these categories are misleading and constantly changing — being redefined. But, as they say, whatcha gonna do?

Take the Middle Ages. Middle of what? Homo sapiens developed something like — in a common low-end estimate — 300,000 years ago, putting the start of the Middle Ages somewhere approximately in the last 15/3000ths of human history. Not exactly the middle.
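The arithmetic, for anyone who cares to check that fraction, is simple enough (taking the conventional start of the Middle Ages as roughly AD 500, some 1,500 years ago, set against that 300,000-year figure):

\[
\frac{1{,}500}{300{,}000} = \frac{15}{3{,}000} = 0.005,
\]

or about half of one percent of our history.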

But the dates we give the Middle Ages vary widely. It came after the Roman Empire. When did the Roman Empire fall? Well, you can say that the final collapse came in 1453 with the fall of Constantinople. For some people, that is already the Renaissance, squeezing out the Middle Ages entirely. But no one really believes the Byzantine Empire was genuinely Roman. They spoke Greek, for god’s sake. They were Christian.

Usually, when we talk of the fall of Rome, we mean the Western Roman Empire and the sad reign of Romulus Augustulus, which came to an end in AD 476. But really, the Western Roman empire at the time consisted only of most of Italy and Dalmatia (later aka Yugoslavia) and a tiny bit of southern France.

And you could easily argue that Rome ceased to be Roman after Constantine converted to Christianity and legalized it in AD 313. After that, the slow slide from Roman imperialism into Medieval feudalism began its ambiguous transubstantiation.

It is the great paradox of scholarship: The more you read, the more your ignorance grows; the more you learn about something, the more you discover how little you know.

Are Picasso’s paintings Modern art? His first big Cubist painting, Les Demoiselles d’Avignon, was painted in 1907. That is closer in time to the reign of Catherine the Great in Russia than it is to us. Closer to George Washington’s Farewell Address. To the Louisiana Purchase. 

So, what do we mean by “modern”? And when did modernity take over? It is a slippery question. And really it is simply an issue of definition — words, not experience. We let the words stand in for reality and then let the debates begin. Reality flows uninterrupted and continuous. Categories are discrete and they start and stop. 

The more you attempt to define the categories, the more they slip away. The history of academic scholarship is often the history of proving the categories wrong. It is historians who argue over the dates of the Renaissance. Or the fall of Rome, or the birth of Modernism. 

Categories are a convenience only. They are a name for the nameless.

I am reminded of the time, some 40 years ago, when I first drove west from North Carolina with my genius wife. We had never seen the great American West and eagerly anticipated finding it. It must be so different, we thought, so distinct. The West is a category. 

We were living in Boone, N.C., named for Daniel, who trod those mountains in the 1700s, when the Blue Ridge was the West. When George Washington surveyed the Northwest Territory in the late 1740s, he was measuring out what became Ohio.

So, when I was driving, I knew I had already pushed my own frontier past such things, and knew in my heart that the West began on the other side of the Mississippi River. But, when I crossed the river into Arkansas, it hardly seemed western. It didn’t look much different from the Tennessee in my rear-view mirror. Yet Arkansas was home to the “Hanging Judge,” Isaac Parker, and was where Jesse James robbed trains. Surely that must be the West. But no, James looked more like a hillbilly than a cowboy. 

Then came Texas, which was the real West, but driving through flat, bland Amarillo on I-40 was as exciting as oatmeal. The first time we felt as if we had hit the West was at the New Mexico line, when we first saw a landscape of buttes and mesas. Surely this was the West.

Maybe, but we hadn’t yet crossed the Continental Divide. All the waters of all the rivers we crossed emptied into the Atlantic Ocean. Crossing the Divide near Thoreau, N.M., we felt we had finally made it.

Yet, even when we made it to Arizona, we knew that for most of the pioneers who crossed this country a century and a half ago, the desert was just one more obstacle on the way to California. In some sense it still wasn’t the West.

When we got as far as we could in a Chevy, and stared out at the Pacific Ocean, we knew that there was still something farther: Hawaii, Japan, China, India, Africa — and eventually back to North Carolina.

So, the West wasn’t a place you could ever really reach, but a destination beyond the horizon: Every point on the planet is the West to somewhere else.

When we look to find the beginnings of Modernity, the horizon recedes from us the same way. Perhaps it began with World War I, when we entered a non-heroic world and faced a more sober reality.

Modern Art began before that, however, perhaps with Stravinsky’s Rite of Spring in 1913, perhaps with Debussy’s Afternoon of a Faun in 1894. Some begin with the first Impressionist exhibition in 1874.

Politically, maybe it begins with Bismarck and the establishment of a new order of nations and the rise of the “balance of power.”

You can make a case that Modernism begins with the Enlightenment in the 18th century, when a rising Middle Class began to fill concert halls and Mozart became an entrepreneur instead of an employee of the aristocracy.

Or before that, in 1648, with the Treaty of Westphalia, and the first recognition of national boundaries as something more than real estate owned by the crown.

You can set your marker down with Luther, with Gutenberg, with Thomas Browne, Montaigne, Caravaggio — or Giotto.

For many, Modernism began with the Renaissance, but when did the Renaissance begin? The 15th century? The Trecento? Or did it begin further north with the Gothic, which is really the first sparking of a modern way of thinking?

Perhaps, though, the Roman republic divides modern political organization from more tribal eras before. Or you could vote for the democracy and philosophy of ancient Greece. Surely the time before that and the time after are distinctly different. We recognize the near side of each of these divides as more familiar than the distant side.

You might as well put the starting line with the discovery of agriculture in the steppes of Anatolia and the river plains of Iraq. An argument can be made for any of these points on the timeline — and arguments could be made for many I haven’t room to mention.

Perhaps the horizon should be recognized for what it is: an ever-moving phantasm. For those peasants digging in the manorial dirt in the Ninth Century, the times they were living in were modern. The first person recorded to use the term “modern” for his own age was the Roman writer Cassiodorus in the 6th Century. Each moment is the new modern.

These are all just categories, and spending our time sorting things into their file folders should not be mistaken for actual knowledge. It is words about the knowledge. 

Now, I will concede that the words help us discuss the real things, and that it is probably useful to know the difference between cats and dogs, or butterflies and moths. But categories and sorting are just a second level of thinking. After these baby steps, there is so much more that the human brain can begin working on, much more grist to be ground. And a good deal of thought that outreaches the ability of words to capture. 

The level I have been most thinking about recently is that of observing, of paying attention. Not deciding anything, or sorting anything, but just noticing. The world opens up like a day lily; so much that was invisible is made visible — things that the rush of daily life, moving things from in-box to out-box, have made too inconsequential to waste time with. There is a richness to the world that becomes a glowing glory when attention is paid. 

In the days before the transcontinental railroad, a Cheyenne father would take his 10- or 11-year-old son out into the prairie and have him lie down on his belly. “Just look,” he would say. “Don’t talk, don’t decide, don’t name, just look.” And he would leave his son there for the day, not moving a whit. And when he came back to retrieve the boy he would not ask, “What did you see?” He would say nothing. He would not need to. 

So much of value is beyond words, beyond category.

This is the 600th blog entry I’ve written since retiring eight years ago from the writing job I held for 25 years. But as I’ve said many times, a real writer never retires, he just stops getting paid for it. 

During my career, I wrote over 2.5 million words. Since then, I’ve added another million. If you are born a writer, you simply can’t help it. 

(In addition, since 2015, I’ve written a monthly essay for the website of The Spirit of the Senses salon group in Phoenix, Ariz., a continuation of the many salon lectures I gave there for years.)

And even when I write an e-mail to friends or family — the kind of note that for most people contains a short sentence, a quick “LOL” and an emoji — I am more likely to write what looks like an old-fashioned missive, the kind that used to come in a stamped envelope, delivered by a paid government worker. An e-mail from me will take a while to read through. It is sent not merely to convey information, but to be read. It has been written, not just jotted down. 

Over the eight years of blogifying, I’ve covered a great many topics. Many on art and art history — I was an art critic, after all — many on history and geography, a trove of travel pieces, a few frustrated political musings and a hesitant offering of oddball short stories (if you can call them by that name.) 

People say, “Write what you know,” but most real writers, myself included, write to find out what we know. The writing is, itself, the thinking. Any missteps get fished out in the rewriting. 

Ah, words. I love words. I love sentences, paragraphs, chapters. Although I wrote for a newspaper, where short, simple sentences are preferred, I often tested the patience of my editors as I proved my affection for words by using obscure and forgotten words and by using them often in long congregations. 

“I love long sentences. I’m tired of all the short ones. Hemingway can keep them. Newspapers can urge them. Twitter can mandate them. To hell with them.

“My ideal can be found in the long serpentine railways of words shunted hither and thither over dependent clauses, parenthetical remarks, explanatory discursions and descriptive ambiguities; sentences such as those found in the word-rich 18th century publishing world of Fielding, Sterne, Addison, Steele, or Boswell, and perhaps most gratifyingly in the grand, gravid, orotund sentences of Edward Gibbon, whose work I turn to not so much for information about the grandeur that was Rome, but for the pure sensuous pleasure to be had from those accretive tunes built from the pile of ideas and imagery (to say nothing of ironic asides), and peppered liberally with the notations of colons, semicolons, dashes and inverted commas.”

The love of words fuels a fascination with paronomasia. I make up words, play with them, coin spoonerisms and mondegreens and pepper my everyday speech with them. As music critic, I reviewed sympathy orchestras. Sometimes I have trouble trying to mirimba a name. On my shopping list I may need dishlicking washwood. 

I often give my culinary creations names such as Chicken Motocross, Mentil Soup, Ratatootattie, or  — one I borrowed from my brother — Mock Hawaiian Chile. 

When my wife came home from work, I usually asked “How did your Italian?” (“How did your day go?”)

When asked for my astrological sign, I say, “I’m a Copernicus.” My late wife was a Virago. And I’m pretty sure our Orange Bunker Boy was born under the sign of Feces. I call him a would-be Moose-a-loony.

I try to keep unfashionable words in currency. On long car trips with granddaughters, we didn’t count cows, we counted kine. I tend to refer to the girls as the wee bairns, or the kidlings. 

I have no truck with simplifying the language; I will not brook dumbification. The more words we use, the better, and the better inflected those words will be. As we lose words, the slight difference in emphasis and meaning is lost, and a simple word then has to do extra duty to encompass ideas and things that are better understood as different. 

Every word has a dictionary definition, but that definition is little but the skeleton on which the meat and muscle are hung. Each word has a nimbus of meaning and affect around it, which is learned by its speakers and readers through long acquaintance. You can always tell when someone has snuffled through a thesaurus, because the fancy word they choose has been stripped of its nimbus, or has an aura that is the wrong color for the spot in which it is placed. In other words, such a writer doesn’t really know the word that has been chosen. The Webster version is only a fuzzy black-and-white photo, not the real thing. 

I have written before how sometimes, instead of doing a crossword puzzle or rearranging my sock drawer, I will make lists of words. Each has a flavor and reading such lists is like perusing a restaurant menu and imagining the aroma and flavor of each offering. It is a physical pleasure, like the major or minor chords of a symphony. Here is a brassy word, there the pungency of an oboe, and over there, the sweet melancholy of a solo cello. 

I think all writers must have something of the same feel for the roundness, spikiness, warmth, dryness or wetness of words. And the way they connect to make new roundnesses, coolnesses, stinks or arousals in sentences. 

Yes, there are some writers — and I can’t pooh-pooh them — who use words in a blandly utilitarian way. Stephen King, for instance, is a great storyteller. He can force you by a kind of sorcery to turn pages. But on a word-by-word level, his writing is flavorless, almost journalistic. I suspect this is a quality he actually aspires to — to make the language so transparent as to be unobservable. I have to admit there are virtues in this, also. But not for me. 

I want a five-course meal of my words. 

Language can take either of two paths: prose or poetry. The first invests its faith in language as a descriptor of systems. It reaches its nadir in philosophy. It makes little difference if it is Plato or Foucault; philosophy — especially the modern sort — is essentially a branch of philology. It seeks to deconstruct the language, as if understanding the words we use will tell us anything about the world we live in. It tells us only about the language we use. Language is a parallel universe to the one we inhabit, with its own rules and grammar, different from the rules and grammar of the real world. 

This has been a constant theme in my own writing. When we say, “A whale is not a fish,” or “A tomato is a fruit, not a vegetable,” we are talking about language only, not about whales or tomatoes. But beyond the language we use to communicate our understanding of the world, no matter how vast our vocabulary, the world itself is infinitely larger, more complex, diverse, chaotic and unsystematic, not to be comprehensively understood by mere mortals. 

And I should clarify, by language, I mean any organized system of thought or communication. Math is just language by other means. When I use the term “language” here, I mean what the Greeks called “logos” — not simply words, or grammar, syntax or semantics, but any humanly communicated sense of the order of the cosmos. Not one system can encompass it all. 

Consider Zeno’s paradox: that in a race between Achilles and a tortoise, if you give the tortoise a head start, no matter how little, Achilles can never catch up. Before he can overtake it, he must first reach the spot where the tortoise started; by then the tortoise has crept a little farther on, and by the time Achilles covers that distance, it has moved again, and so on. Thus he can never catch up. The paradox is purely in the forms of logic, not in the reality. We all know Achilles will catch up in only a few strides. But the system — the logic, or the words — tells us he cannot. Do not trust the words, at least not by themselves, without empirical evidence to back them up. 
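For the record, the arithmetic dissolves the puzzle. Take some made-up numbers of my own (Zeno supplies none): Achilles runs 10 meters a second, the tortoise 1, and the head start is 9 meters. The infinitely many catch-up intervals of the logic then add up to a perfectly finite time:

\[
t = 0.9 + 0.09 + 0.009 + \cdots = \sum_{n=1}^{\infty} 0.9\,(0.1)^{\,n-1} = \frac{0.9}{1-0.1} = 1 \text{ second.}
\]

An infinity of steps in the description; one second on the stopwatch.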

All systems of thought, whether religious, political or scientific, ultimately break down when faced with the weedy complexity of existence.

And so, a good deal of what we all argue about is simply the words we choose to use, not the reality. We argue over terminology. Conservative, liberal? Is abortion murder? These depend entirely on your definitions. 

Poetry, on the other hand — and I’m using the word in its broadest and metaphorical sense — is interested in the things of this world. Yes, it may use words, and use them quite inventively, but its goal is to reconnect us with our own lives. It lives, not in a world of isms, but in one of mud, tofu, children, bunions, clouds and red wheelbarrows. This is the nimbus of which I speak. 

It is ultimately our connection with our own lives, with the things of this world, with the people in our lives, that matters and that should concern us. It is what provides that nimbus of inexactitude that gives resonance to the words. 

At various times in my career as someone who got paid for writing, I have been asked to speak to groups of students or the curious about my craft. It hasn’t always gone well. 

I remember one time I managed to annoy a community college teacher no end by telling her students to ignore everything she had been hammering into their heads. I didn’t know I was doing that; I was just talking about what I knew through experience. But she had been filling their minds with ugly formulae and what to my mind are tired old saws: Make an outline; use a topic sentence; the rule of threes. As if you could interest readers by rote. 

Part of the problem is that I believe that writers are born, not made. Of course, you can improve anyone’s ability to put down comprehensible sentences, but good spelling and decent grammar do not make a writer. Just as anyone can be taught to draw and sketch, but that won’t make them an artist, anyone can be instructed how to fashion a paragraph or two without embarrassing themselves, but that don’t make’em into Roger Angell. 

One of the things that caused the teacher no end of bother was my insistence that the single most important and defining part of writing was “having something to say.” Without it, no rhetorical device, no repetition of authoritative quotations, no using active rather than passive voice, would suffice. And the truth is, few people have anything to say. 

Of course, everyone thinks they do, but what passes for thought is most often merely the forms of thought, the words that have previously been used to frame the ideas, and hence, someone else’s thoughts. Having something to say is genuinely a rare gift. 

This hardly serves to help the composition-class student or the teacher hoping to form them into perfect little Ciceros. Having something to say requires having had a living experience to draw upon, something original to the writer — a back yard with skunk cabbage, or a two-month deployment with a platoon, or the betrayal of a spouse — and an idiosyncratic reaction to it, something personal and distinct. Instead, most people are just not used to finding words to describe such things and fall back on words they have heard before. Easily understood words and phrases and therefore the mere ghosts of real expression. 

When you use someone else’s words, to that extent you don’t know what you are talking about. 

Being born a writer means being consciously or unconsciously unwilling to accept approximation, to be unsatisfied with the easily understood, to search for the word that more exactly matches the experience. 

One of the consequences is that to be a writer means to re-write. As you read back over what you have just put on paper — or on the computer screen — you slap your forehead over this bit or that. How could I have let that through? And you find something more exact, more telling, more memorable. It is only the third or fourth go-round that feels acceptable. (Each time I come back to a piece I’m working on, I begin again from the beginning and work my way through what I’ve already finished and change things as I go to make myself clearer or my expression more vivid. This means that the top of any piece is usually better written than the end. Sorry.) 

Having something to say and sweating over saying it in a way that doesn’t falsify it — this is what writing is all about. 

But is there anything I can say to those who just want to be a little bit better when turning in a school paper, or writing a letter to the editor, or publishing a novel about your life so far? Here are a few suggestions.

First and most important: Read. Read, read, read. Not so much to imitate what you have found, but to absorb what it is to use language. Just as you don’t “learn” English as a youngster; rather, you absorb it. When you are grown, you may have to learn a second language, but as an infant, you simply soak up what you hear and gradually figure it out. And likewise, reading lots of good writing isn’t to give you tricks to follow, but to immerse you in the medium so that it becomes your mother tongue. 

Second: Write. Write, write, write. In his book, Outliers, Malcolm Gladwell famously made the claim that it took 10,000 hours of practice to master a skill. He later explained he only meant that as an average, but the issue remains: You can’t become a writer without writing. Over and over, until it becomes second nature and all the amateur’s kinks are driven out. Write letters, journals, blogs — it doesn’t matter what, but writing and doing it constantly makes you a better writer. 

Third: Fill the well you draw from. Nothing will come of nothing. Everything you see, feel and do is who you are and is the substance of your writing. If you know nothing, feel nothing deeply, do nothing interesting, then you have nothing to bring to the sentences you write. Good writing is not about writing, despite all the reflexive gibberish of Postmodern philosophers. 

Even when you want to write about abstract ideas, you had better do it through touch, feeling, color, smell, sound. Nothing is worse than reading academic prose, because it is upholstered with “isms” and “ologies.” 

“The work of the text is to literalize the signifiers of the first encounter, dismantling the ideal as an idol. In this literalization, the idolatrous deception of the first moment becomes readable. The ideal will reveal itself to be an idol.”

Thank you. I no longer need to count sheep. 

Through the Middle Ages, all educated people communicated in Latin. In a strange way, that doesn’t seem to have changed. Words of Latin origin predominate in academic prose. Sometimes reading a peer-reviewed paper is like translating Virgil. 

Language and experience are parallel universes. We try to get language closer to the life we live, but it is always at least slightly apart. When we speak or write in abstractions, we are manipulating language without reference to the world of things we live in. Language about language. Good writing is the attempt to bring these two streams closer to each other, so that one may refresh the other. We do that primarily through image and metaphor. An idea is clearer if we can see it or feel it. Flushing it through Latin only obscures it. 

“Show, don’t tell” works best even when you are “telling,” i.e., writing. 

For those who don’t have to think about such things, a word is a fixed rock in the moving stream, set there by the dictionary. But for a writer, each idea and each word is a cloud of meaning, a network of inter-reference. To narrow down those possibilities, a picture helps — a metaphor. Not added on at the end, but born with the idea, co-nascent. 

Take almost any line of Shakespeare and you find image piled on image. “Our little life is rounded with a sleep,” says Prospero. Donalbain fears “the daggers in men’s smiles.” “If music be the food of love, play on.” “Now is the winter of our discontent made glorious summer by this sun of York.” Shakespeare is nothing if not a cataract of sense imagery. 

How different if Prospero had simply said, “Life is short and then you die.” 

There are a whole host of injunctions and directives that are given to wanna-be writers, and all of them are worthy. Don’t use passive voice; always have antecedents to your pronouns; avoid pleonasm; edit and revise; dump adverbs and, damn it, learn how to use a semicolon. 

But none of them is as important as the primary directive: Have something to say. 

Oh, and yes, it’s always fun to annoy community college teachers. 

I recently wrote a piece about grammar and vocabulary peeves. And I mean “peeves.” It’s too common to take such language infractions as if federal law had been broken. For me, such things are merely irritants. Others may take such examples as I gave as bad grammar, or mistaken grammar, but I meant to show the personal reaction some of us get when the way we were trained to use language gets trampled on by those not similarly trained. 

Sometimes there is truly a misuse of language that creates misunderstanding or even gobbledegook, but at other times it is merely a failure to recognize how language changes and grows through time, or a refusal to understand idiom or regionalism. 

The war between descriptivists and prescriptivists is never-ending. As for me, I have matured from being a mild prescriptivist to a rather forgiving descriptivist, with some few hard rules added. I feel that to be either all one way or all the other is a kind of blind stupidity. 

For instance, I would never use the word “irregardless.” It is unnecessary. But neither will I claim it is not a word. Maybe it didn’t use to be, but it is now, even if it is an ugly word. If someone wants to sound coarse and unlettered, he or she is free to use “irregardless,” regardless of its gaucheness. 

There was a notepad full of examples that I did not fit into the previous blog post, and some newer ones sent me by friends or readers. So, I thought a followup might be due. Some of these are clearly mistakes and misusage, but others are just rules I or we learned at an early age and now flinch at whenever we hear or read them flouted (the confusion of “flouted” and “flaunted” being one of the mistakes that make us flinch). 

I am at a particular disadvantage because I was horsewhipped into shape by the Associated Press Stylebook. I never use an abbreviation for “road” when writing an address, while I have no problem with “St.” for “street.” Why the AP chose this path, I have no clue, but they did and now I am stuck with it. It was driven into me by a rap on the knuckles during my first week working on the copy desk. I am also stuck with “baby sitter” as two words, while “babysitting” is one. 

(Sometimes the stylebook is brutally ignorant. When I began as a copy editor, it told us to spell the little hot pepper as a “chili” and the dinner made with it and meat and/or beans as “chilli,” but we were in Arizona, where Spanish and Spanglish are common, and would have looked like idiots to our readers if we had followed that rule, so we were allowed to transgress and spell the word for both as “chile.” I believe that the Associated Press has finally caught up. I am retired now, and no longer have the most recent copy of the book.)

Of course, the AP Stylebook wasn’t designed to decide once and for all what is correct usage, but rather only to standardize usage in the newspaper, so different reporters didn’t spell “gray” in one story and “grey” in another. But this standardization carries the implication that what’s in that book is “right and true.” As a result, I almost always avoid saying “last year,” or “the last time so-and-so did this,” but rather contort the sentence so I can use “past” instead of “last,” the logic being that last year wasn’t the last one — at least not yet. Yes, I know that is stupid and that everyone says “last year” and no one is confused, but the AP has rewired my neurons through constant brainwashing. 

It also keeps me aware of the distinction between jail and prison. People are held in jail awaiting trial; after conviction, they serve their sentence in prison. (Yes, some convicts serve their time in jails, but that doesn’t change things. Jails tend to be run by counties; prisons by state or federal governments.)

And so, here is my list of additional words and phrases that get under my skin when used or misused. 

For me, the worst is the common use of “enormity” to describe anything large. I twitch each time it sails past me. An enormity is a moral evil of immense proportions. The Shoah was an enormity; the vastness of the ocean is not. 

Then, there is the confusion between “imply” and “infer.” To imply is to slip a clue into the flow; to infer is to pick up on the clue. 

One constantly hears “literally” used instead of “figuratively.” Ouch. It debases the strength of the literal. 

There are rhetorical figures that are misapplied over and over. Something isn’t ironic simply by being coincidental, nor is oxymoron the same as paradox — the latter is possible through reinterpretation; the former must be linguistically impossible. To be uninterested is not the same as being disinterested. It causes me minor physical pain each time I hear some bored SOB called “disinterested.” 

I have other peeves, lesser ones. “My oldest brother,” when there is only one other brother. “Between” three people rather than “among.” Using “that” instead of “who” when referring to a person: “He was the person that sent me the letter.” Pfui. 

There is a special place on my personal proscription list for anyone who uses “which” instead of “that” in a sentence with a defining adjectival phrase, as in: “It was the dog on the left which bit me.” It’s OK in: “It was the dog on the left, which bit me, that I came to despise.” 

Some of us still make a distinction between “anxious” and “eager.” The virus makes me anxious. I am eager to get past the threat. There are other pairs that get confused. I try to ensure that I never use “insure” when I’m not talking about an insurance policy; the wrong use of “effect” can affect the meaning of a sentence; further, I never confuse “farther” with something other than physical distance. “Floundered” and “foundered” mean different things, please. 

From other people and from comments to the blog, I have heard complaints about “bringing something with me when I go” or “taking something home with me.” “Bring” comes home; “take” goes away. 

Another hates seeing “a lot” as one word, unless, of course, it has two “Ls” and means to portion something out. Yet another yells at the TV screen every time someone says “nucular” for “nuclear.” I share that complaint, although I remember, many decades ago, Walter Cronkite making a reasoned case for pronouncing “February” without the first “R.” “It is an acceptable pronunciation,” he said. “It is listed as a secondary pronunciation in Webster’s Dictionary.” I’m afraid “nucular” has become so widespread that it is in the process of becoming, like “Febuary,” an accepted alternate. But it hurts my ear. 

Trump gave “free rein” to his son-in-law, but perhaps it really is “free reign.” Confusion abounds. 

All this can reek of pedantry. I’m sorry; I don’t mean it to. There are many times you might very well subvert any of these grammatical conventions. I have heard sentences that start off “I and Matilda took a vacation” condemned as ugly and wrong (really, the grammatically worse “Me and Matilda” is idiomatically better, like “Me and Bobby McGee”), but I remember with literary fondness the opening of Herman Melville’s “I and My Chimney”:

I and my chimney, two gray-headed old smokers, reside in the country. … Though I always say, I and my chimney, as Cardinal Wolsey used to say, I and my King, yet this egotistic way of speaking, wherein I take precedence of my chimney, is hardly borne out by the facts; in everything, except the above phrase, my chimney taking precedence of me.

And there are presidential precedents. “Normalcy” wasn’t a word until Warren G. Harding used it to describe a vision of life after World War I (there are examples from earlier, but he popularized its use and was ridiculed for it — “normality” being the normal word). 

I would hate to have to do without George W. Bush’s word: “misunderestimate.” If that hasn’t made it into Webster’s, it should. I think it’s a perfectly good word. Language sometimes goes awry. We don’t always hear right and sometimes new words and phrases emerge. I knew someone who planned to cook dinner for a friend. “Is there anything I should know about your diet? Anything you don’t eat?” “I don’t eat sentient beans,” she said. He had never heard of that sort of bean. It was only much later that he smiled at his own misunderstanding. Since then, I have always kept a bin of dried sentient beans to make “chilli” with. At least, that’s how I label the tub. 

Language shifts like tides. Words come and words go; rules pop up and dissipate; ugly constructions are normalized and no longer noticed, even by grammarians. I have listed here some of the formulations that still rankle me, but I am old and wear the bottoms of my trousers rolled. I’m curious, though, what bothers you? Let me know in the comments. 

I began life as a copy editor, which means I had to know my commas and em-dashes. My spelling had to be impeccable, and I memorized the Associated Press Stylebook, which taught me that the color was spelled g-r-a-y, not g-r-e-y. Except in “greyhound,” that is. 

It is a line of work I fell into quite naturally, because from the second grade on, I have had a talent for words. I diagrammed sentences on the blackboard with the visual complexity of the physics formulae Sheldon Cooper scrawls on his whiteboard. Tasked with writing a sentence for each of my 10 weekly vocabulary words, I managed to write a single sentence using all ten. I did OK with math, but it never much interested me; words were another thing. I ate them up like chocolate cake. 

The upshot is that I am a prime candidate for the position of “Grammar Cop,” bugging those around me for making mistakes in spelling, punctuation and usage. And, admittedly, in the past, I have been guilty. But as age softens me, I have largely given up correcting the mistaken world. And I have a different, more complex relationship with language, less strict and more forgiving. 

The causes of this growing laxness are multiple. Certainly age and exhaustion are part of it. But there is also the awareness that language is a living, growing, changing thing and that any attempt to capture it in amber is a futile endeavor. 

But although I have come to accept many changes in speech that I once cringed at — I can now take “their” in the singular (“Everyone should wash their hands”) and have long given up on “hopefully” — there are still a handful of tics that I cannot get over. I try, but when I hear them uttered by a news anchor or starlet on a talk show, I jump a little, as if a sharp electric shock were applied to my ear. 

The first is “I” used in the objective case. It gives me the shivers. “He gave the award to Joan and I.” It gets caught in my throat like a cat’s fur ball. 

The second is using “less” for “fewer.” I know that the usage has largely changed, but it still assaults my ear when I hear, “There will be less pumpkins this Halloween, due to the drought.” Ugh. 

A third is the qualified “unique,” as in, “His hairstyle is very unique.” It’s either unique or it isn’t. 

Then there are common mispronunciations. “Ek-setera” is just awful. Although, I did once know someone who always gave it its original Latin sounding: “Et Caetera” or “Et Kye-ter-a.” Yes, that was annoying, too. 

The last I’ll mention here is the locution, “centered around.” Gets my goat every time. Something may be “centered on” a focus point, or “situated around” something, but “centered around” is geometrically obtuse unless you’re discussing Nicholas of Cusa’s definition of the deity, whose center is everywhere and circumference is nowhere. 

Others have their own bugaboos. One friend cannot get past the confusion between “lay” and “lie.” She also jumps every time someone uses “begging the question,” which is misused 100 times for every once it is understood. 

Of course, she may be more strict than I am. “There have been errors so egregious that I’ve stopped reading a book,” she says. “I just stomp my foot and throw the book away.” 

Her excuse: “I was an English major.” 

There are many other issues that bother me, but not quite so instantly. I notice when the subjunctive is misused, or rather, not used when it should be. If I were still an editor, I would have fixed that every time. 

I am not sure I will ever get used to “like” used for “said.” And I’m like, whoever started that linguistic monstrosity? I also notice split infinitives as they sail past, but I recognize that the prohibition against them is a relic of Victorian grammarians. It is too easy to lazily give in to those ancient strictures. 

So far, I’ve only been talking about catches in speech, although they show up in print just as often (you can actually come across “ect.” for “etc.”). But reading a book, or a newspaper, or a road sign and seeing the common errors there can be even more annoying. There is probably nothing worse, or more common, than the apostrophe plural. You don’t make something plural by adding an apostrophe and an “S.” “Nail’s” is not the plural of “nail.” This is encountered endlessly on shop signs. 

And digital communication is fraught with homophone confusion. “They’re,” “there,” and “their,” for instance, or “you’re” and “your.” I admit that occasionally this is just a mental hiccup as you are typing. We all make mistakes. I have sometimes put a double “O” after a “T” when I meant the preposition. That’s just a typo. But there are people who genuinely don’t seem to notice the difference among “to,” “too,” and “two.” (I have great tolerance, however, for the ideogrammatic use of “2” for “to” in electronic messaging. I find it kind of amusing to see the innovation in space-saving for Twitter and e-mail. I may even have been guilty myself of such things. Indeed, there is a long history of this sort in handwritten letters of the 15th to 18th centuries, when “William” was often “Wm,” and “through” was often “thro.” Paper was expensive and abbreviations saved space.)

Some frequent typographical absurdities make me twitch each time. I really hate seeing a single open-quote used instead of an apostrophe when a word is abbreviated from the front end. You almost never see “rock ’n’ roll” done correctly. 

There are lesser offenses, too, that I usually let pass. “Impact” as a verb, for instance. It bothers me, but the only people who use it tend to write such boring text that I couldn’t wade through it anyway. (I wrote about the management class mangling of the language in what I call “Manglish.”) “Different than” has become so normalized for “different from” that I’m afraid it has become standard English. Of course, the English themselves tend to say “different to.” So there. 

There are distinctions that have been mostly lost in usage. “Can I” now means the same as “May I” in most circumstances, and almost no one still makes a distinction between “shall” and “will.” 

Many of us have idiosyncratic complaints. I knew someone who complained that “laundermat” was not a word. We saw such a one on Vancouver Island when visiting. “It should be ‘laundromat,’” she said, arguing the parallel with “automat.” 

Another cringes at “would of,” “should of,” and “could of” in place of “would have,” “should have,” and “could have.” But this is merely a mishearing of the contractions “would’ve,” “should’ve,” and “could’ve” and turning them into print. Yes, it should be corrected, but it doesn’t get under my skin when I hear it. 

And there are regionalisms that bother some, although I glory in the variety of language. One person I know complains about such phrases as “had went,” but that is a long-standing Southernism and gets a pass, as far as I’m concerned. 

And much else is merely idiom. If you get too exercised about “I could care less,” please relax. It means the same thing as “I couldn’t care less.” Merely idiomatic. Lots of grammatical nonsense is now just idiomatic English. Like when the doorbell rings and you ask “Who’s there?” and the answer comes back, “It’s me.” If you hear “It is I,” you probably don’t want to open the door. No one talks like that. It could be a spy whose first language is not English. Better ask if they know who plays second base for the Brooklyn Dodgers (old movie reference). 

And the Associated Press hammered into me the habit of writing “past week” instead of “last week,” on the principle that the previous seven days had not, indeed, been terminal. You can take these things too far — but I am too thoroughly brainwashed to give in. “Past week” it will always be. 

I may have become lax on certain spelling and grammar guidelines, but one should still try one’s best to be clear, make sense, include antecedents for one’s pronouns and stay alert to certain common confusions. “Discreet” and “discrete” are discrete words. And someone I know who used to transcribe her boss’s dictated letters once corrected him when he said a client should be “appraised” of the situation and typed instead, “apprised.” He brought her the letter back and complained that she had misspelled “appraised.” Being a man and being in management, he could not be persuaded he was wrong. She had to retype the letter with the wrong word. There’s just nothing you can do with some people. 

Language is just usage at the moment. It shifts like the sands at the beach: what was “eke” to Chaucer is “also” to us. What was “conscience” to Shakespeare is “consciousness” to us. Thus does conscience make grammar cops of us all. We don’t learn our mother tongue, we acquire it, and what we hear as babes becomes normal usage. Ain’t it the truth?

The world is not black and white, but until fairly recently, photography was. For most of its history, the art was an art of silver on paper, spread from inky blacks through velvety grays into pristine whites. 

There had been attempts to add color, either by painting on top of the monochrome image, or by various experimental techniques to capture the color directly. But even after the commercially successful introduction of Kodachrome in 1935, photography as a museum-approved art continued to be primarily in black and white. 

(In cinema, Technicolor predated Kodachrome by about a decade, but that process was essentially three different black and white negatives overlapped through color filters to create the effect. It was an expensive and difficult process, and relatively few films were made with it until after the commercial success of Gone With The Wind and The Wizard of Oz in 1939.)

I have been a photographer for at least 50 years. I have had shows and my work has been published. But for most of that time, I worked in monochrome. I “saw” in black and white. My photographic heroes worked in B&W, the techniques I mastered were silver techniques. I became an excellent printer. But I seldom used color film. It seemed an unnecessary noise to bring to the purity of the medium. 

I was hardly alone in this. When I was younger, even museums shied away from color photography. It was seen as not “permanent.” Its images faded over time (I’m sure you all have old family snapshots turned rather magenta with age). The real artist-photographers used silver or platinum and made glorious images. 

Back then, art in general was seen with more precious eyes. We thought of “archival processing,” and even paintings were carefully preserved and curators looked down on some artists — such as Jackson Pollock or Mark Rothko — who used non-archival pigments or unprepared canvases and whose works, therefore, had begun to deteriorate. 

In current times, few artists or galleries worry much about such things. Art can be made on newsprint, or can even purposely self-destruct. Concern for the permanence of an artwork is seen as elitist. After all, no matter how careful you are, the art is going to be gone eventually, even if it lasts till the sun explodes. 

And besides, color is now no more or less permanent than black and white: Now they are both nothing but ones and zeros. Silver is dead; long live digital. 

Yet there is still a difference between color photography and black and white. It is a difference not simply of technique, but of thought. Thinking in color is different from thinking in black and white. 

The part of vision that deals in color is processed in a different area of the brain than the part that concerns itself with darks and lights. (Vision is ridiculously more complicated neurologically than you might think — the information on the retina is broken down into many separate components, processed by differing regions of the brain and then re-coordinated as a gestalt.)

And so, some people pay closer attention to the hue, others to the forms they see. 

The fact is, black-and-white photography and color photography are two different art forms. To be successful in both requires a kind of bilingualism. Most of us have brains that function best either in seeing forms and shades, or in seeing hues. The two photographies emphasize those different talents.

One has only to consider the work of Stephen Shore or William Eggleston. Most of their meaning comes through the color. Take one of Eggleston’s best-known images and suck the color out. What have you got?

He made this photo of a ceiling and light bulb. The red is overwhelming. But imagine it as a black and white image.

He also made a similar image of a brothel ceiling painted blue. Also overwhelming. The two are nearly the same image, but with very different emotional and sensuous meanings.

But if we make them both black and white, they very nearly merge into the same thing. 

Color can by itself separate forms. Here are four squares in four colors, as distinct as can be. But the same image, unaltered except for the draining of all color from it, leaves a confused mess, with barely a separation between grays. 
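The effect is easy to check with a little arithmetic. Below is a minimal sketch in Python (my own illustration, not part of the original post), using the common Rec. 601 luma weights that many grayscale conversions rely on, with four made-up RGB swatches chosen to be visually distinct but nearly equal in brightness.

```python
# A rough sketch, not a reproduction of the squares described above:
# four visually distinct hues that collapse to nearly the same gray
# under a standard luma conversion (Rec. 601 weights).

def luma(r, g, b):
    """Approximate perceived brightness of an RGB color on a 0-255 scale."""
    return 0.299 * r + 0.587 * g + 0.114 * b

# Illustrative swatches, chosen for this example rather than taken from any image
swatches = {
    "red":    (255, 64, 64),
    "green":  (0, 200, 80),
    "blue":   (80, 100, 255),
    "orange": (200, 110, 0),
}

for name, (r, g, b) in swatches.items():
    print(f"{name:7s} -> gray value {luma(r, g, b):5.1f}")

# All four land within roughly 15 gray levels of one another (about 112 to 127),
# so a grayscale conversion barely separates them even though the hues are
# unmistakably different.
```

In color, the swatches are unmistakable; in gray, their values bunch together, which is exactly the confused mess described above.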

Black and white photography requires the separation of parts not by hue, but by contrast: Lights against darks. It’s what makes great silver prints sing. Where color photographs separate forms primarily by hue, black and white does it with contrast.

I am not saying a color photograph has to be garish. Far from it. But the color will carry a good deal of the meaning and emotional resonance of the image. Even in a color photo that has hardly any color in it.

Many years ago, I tried an experiment. Like so many others, I loved the waterlily paintings of Claude Monet. But I wondered if they would make as much sense in black and white. Is there a structure holding the pictures together, a design or composition, that doesn’t depend solely on the rich color? 

So, I began making photographs in black and white of water lilies. 

The most successful of them clearly relied on bright highlights and strong shadows. The shapes made the picture.

If I tried an overall design, like Monet’s, the picture lost its strength. 

I did the same experiment with one of Monet’s paintings, rephotographing it in black and white. 

Did it hold up? It is certainly a very different beast. 

Then I went back to one of my own color photographs of his waterlilies in Giverny, a photograph that imitated Monet’s paintings, with color, sky, reflection, shadow and lily. I set the color version and a black and white version side by side. 

What I discovered shouldn’t be a surprise: Monet was much more effective in color. But I also noticed that because my photos were well-focused rather than impressionistically fuzzy, they translated better into black and white: Black and white is meant to clarify shapes. Color identifies “areas” rather than discrete textures. 

And so, while I have spent the majority of my photographic career making monochrome images, along with many others now working in digital media, I switch back and forth between color and B&W. They do, however, require different vocabularies. They are different languages. 

While I have always made visual art, I made my career in writing about art. 

As an art critic, I had the unusual need to be bilingual in an odd sort of way. As a journalist, I needed to be good with words, but in writing about art, especially visual art, I needed to know how to use my eyes.

I discovered very early on how these two talents were seldom granted to the same person. All around me were reporters who knew a gerund from a copulative, but who often seemed almost infantile when discussing pictures. They could name the subject of the image, but not go much further than that. 


A photo editor of my acquaintance once explained photojournalism this way: “I need to know it’s a house; don’t trick it up with ‘art.’” This was image as ID photo. 

But on the other side, so many artists I knew couldn’t explain themselves out of a paper bag. They effused in vague buzzwords, words that changed currency every year or so. I once taught a graduate course in writing about art for art students who needed to prepare so-called “artist statements” for their exhibits. Most of what they wrote before the course was utter blather, obscure and important-sounding without actually meaning anything. 

Words and images: Worlds seldom interpenetrable. I call the talent for riding both sides a form of bilingualism. 

I do not know if the ability to deal in multiple “languages” is something you are born with, or something you learn early on, the way you acquire language before that window closes in adolescence. But somehow, I managed to do it, at least well enough to write about it without embarrassing myself.

The mental juice necessary to process each seems walled off from the other, except in rare cases. One either runs a literary program, based on sentence and paragraph structure, linear words building a whole out of alphabetic parts; or one comprehends shapes, lines, color, size, texture, and frame as carrying the information required to convey meaning. 

This doesn’t mean that visual people are illiterate, nor that literary people can’t enjoy an art gallery, but that their primary modes of understanding vary. The squishiness of an artist’s gallery talk can drive a writer bonkers; the flatness of a word-person’s understanding of a painting can leave an artsy type scratching her head: “Can’t you see?” 

Nor does it mean that either side can’t learn, although it will remain a second language, without native understanding of idiom and customary usage. A word person can be trained to see shape and form, but it will always be the way Spanish is for me: No one will ever confuse me with a native speaker.

This split between word and image, though, is only one of the bisections. Musicians can think in tone the way painters can think in pigment. Yes, there is a language that can describe the music, but for non-musicians, that language is usually impressionistic and often visual — what the music “makes you think of,” or the “pictures in your mind.” 

For the musically trained, there is also language, but it is completely opaque to the civilian: Dominant-seventh, voice-leading, timbre, reed trimming, tenor clef, Dorian mode, ritornello, da capo, circle of fifths. But even these are merely words to describe the non-verbal reality of the music itself, which can convey meaning through sound alone. The words are not the music. 

The ability to think in the terms of each mode is essential to create well in that form, and a mighty help in understanding it for the audience. If you are not in love with words, the rich cream of Gibbon or the organ tones of Milton can leave you cold. If you have no eyes for color, the nuance of Turner or the pears of Cezanne can zip past without notice. If you think of pop tunes as music, the shifting tonal centers of Schubert are inaudible, the orchestration of Mahler merely noise. 

We each have a frequency our sensibilities are tuned to, and can receive it loud and clear; we may think we understand the rest, but too often we are only fooling ourselves. Do you really inhale the contrapuntal movement of a Balanchine chorus? Do you notice the rhythm of editing in a Spielberg film? Each is a language that its practitioners and connoisseurs understand profoundly, but that zips past the mass of those sitting in the cheap seats. 

It’s a different language. 
