Tag Archives: language

At various times in my career as someone who got paid for writing, I have been asked to speak to groups of students or the curious about my craft. It hasn’t always gone well. 

I remember one time I managed to annoy a community college teacher no end by telling her students to ignore everything she had been hammering into their heads. I didn’t know I was doing that; I was just talking about what I knew through experience. But she had been filling their minds with ugly formulae and what to my mind are tired old saws: Make an outline; use a topic sentence; the rule of threes. As if you could interest readers by rote. 

Part of the problem is that I believe that writers are born, not made. Of course, you can improve anyone’s ability to put down comprehensible sentences, but good spelling and decent grammar do not make a writer. Just as anyone can be taught to draw and sketch without becoming an artist, anyone can be instructed how to fashion a paragraph or two without embarrassing themselves, but that don’t make ’em into Roger Angell. 

One of the things that caused the teacher no end of bother was my insistence that the single most important and defining part of writing was “having something to say.” Without it, no rhetorical device, no repetition of authoritative quotations, no using active rather than passive voice, would suffice. And the truth is, few people have anything to say. 

Of course, everyone thinks they do, but what passes for thought is most often merely the forms of thought, the words that have previously been used to frame the ideas, and hence, someone else’s thoughts. Having something to say is genuinely a rare gift. 

This hardly serves to help the composition-class student or the teacher hoping to form them into perfect little Ciceros. Having something to say requires having had a living experience to draw upon, something original to the writer — a back yard with skunk cabbage, or a two-month deployment with a platoon, or the betrayal of a spouse — and an idiosyncratic reaction to it, something personal and distinct. Instead, most people are just not used to finding words to describe such things and fall back on words they have heard before. Easily understood words and phrases and therefore the mere ghosts of real expression. 

When you use someone else’s words, to that extent you don’t know what you are talking about. 

Being born a writer means being consciously or unconsciously unwilling to accept approximation, to be unsatisfied with the easily understood, to search for the word that more exactly matches the experience. 

One of the consequences is that to be a writer means to re-write. As you read back over what you have just put on paper — or on the computer screen — you slap your forehead over this bit or that. How could I have let that through? And you find something more exact, more telling, more memorable. It is only the third or fourth go-round that feels acceptable. (Each time I come back to a piece I’m working on, I begin again from the beginning and work my way through what I’ve already finished and change things as I go to make myself clearer or my expression more vivid. This means that the top of any piece is usually better written than the end. Sorry.) 

Having something to say and sweating over saying it in a way that doesn’t falsify it — this is what writing is all about. 

But is there anything I can say to those who just want to be a little bit better when turning in a school paper, or writing a letter to the editor, or publishing a novel about their life so far? Here are a few suggestions.

First and most important: Read. Read, read, read. Not so much to imitate what you have found, but to absorb what it is to use language. Just as one doesn’t “learn” English as a youngster but rather absorbs it. When you are grown, you may have to learn a second language, but as an infant, you simply soak up what you hear and gradually figure it out. And likewise, reading lots of good writing isn’t to give you tricks to follow, but to immerse you in the medium so that it becomes your mother tongue. 

Second: Write. Write, write, write. In his book, Outliers, Malcolm Gladwell famously made the claim that it took 10,000 hours of practice to master a skill. He later explained he only meant that as an average, but the issue remains: You can’t become a writer without writing. Over and over, until it becomes second nature and all the amateur’s kinks are driven out. Write letters, journals, blogs — it doesn’t matter what, but writing and doing it constantly makes you a better writer. 

Third: Fill the well you draw from. Nothing will come of nothing. Everything you see, feel and do is who you are and is the substance of your writing. If you know nothing, feel nothing deeply, do nothing interesting, then you have nothing to bring to the sentences you write. Good writing is not about writing, despite all the reflexive gibberish of Postmodern philosophers. 

Even when you want to write about abstract ideas, you had better do it through touch, feeling, color, smell, sound. Nothing is worse than reading academic prose, because it is upholstered with “isms” and “ologies.” 

“The work of the text is to literalize the signifiers of the first encounter, dismantling the ideal as an idol. In this literalization, the idolatrous deception of the first moment becomes readable. The ideal will reveal itself to be an idol.”

Thank you. I no longer need to count sheep. 

Through the Middle Ages, all educated people communicated in Latin. In a strange way, that doesn’t seem to have changed. Words of Latin origin predominate in academic prose. Sometimes reading a peer-reviewed paper is like translating Virgil. 

Language and experience are parallel universes. We try to get language closer to the life we live, but it is always at least slightly apart. When we speak or write in abstractions, we are manipulating language without reference to the world of things we live in. Language about language. Good writing is the attempt to bring these two streams closer to each other, so that one may refresh the other. We do that primarily through image and metaphor. An idea is clearer if we can see it or feel it. Flushing it through Latin only obscures it. 

“Show, don’t tell” works best even when you are “telling,” i.e., writing. 

For those who don’t have to think about such things, a word is a fixed rock in the moving stream, set there by the dictionary. But for a writer, each idea and each word is a cloud of meaning, a network of inter-reference. To narrow down those possibilities, a picture helps — a metaphor. Not added on at the end, but born with the idea, co-nascent. 

Take almost any line of Shakespeare and you find image piled on image. “Our little life is rounded with a sleep,” says Prospero. Donalbain fears “the daggers in men’s smiles.” “If music be the food of love, play on.” “Now is the winter of our discontent made glorious summer by this sun of York.” Shakespeare is nothing if not a cataract of sense imagery. 

How different if Prospero had simply said, “Life is short and then you die.” 

There are a whole host of injunctions and directives that are given to wanna-be writers, and all of them are worthy. Don’t use passive voice; always have antecedents to your pronouns; avoid pleonasm; edit and revise; dump adverbs and, damn it, learn how to use a semicolon. 

But none of them is as important as the primary directive: Have something to say. 

Oh, and yes, it’s always fun to annoy community college teachers. 

I recently wrote a piece about grammar and vocabulary peeves. And I mean “peeves.” It’s too common to treat such language infractions as if federal law had been broken. For me, such things are merely irritants. Others may take the examples I gave as bad grammar, or mistaken grammar, but I meant to show the personal reaction some of us have when the way we were trained to use language gets trampled on by those not similarly trained. 

Sometimes there is truly a misuse of language that creates misunderstanding or even gobbledegook, but at other times, it is merely a failure to recognize how language changes and grows through time, or a refusal to understand idiom or regionalism. 

The war between descriptivists and prescriptivists is never-ending. As for me, I have matured from being a mild prescriptivist to a rather forgiving descriptivist, with some few hard rules added. I feel that to be either all one way or all the other is a kind of blind stupidity. 

For instance, I would never use the word “irregardless.” It is unnecessary. But neither will I claim it is not a word. Maybe it didn’t use to be, but it is now, even if it is an ugly word. If someone wants to sound coarse and unlettered, he or she is free to use “irregardless,” regardless of its gaucheness. 

There was a notepad full of examples that I could not fit into the previous blog post, and some newer ones sent to me by friends or readers. So, I thought a followup might be due. Some of these are clearly mistakes and misusage, but others are just rules I or we learned at an early age and now flinch at whenever we hear or read them flouted (the confusion of “flouted” and “flaunted” being one of the mistakes that make us flinch). 

I am at a particular disadvantage because I was horsewhipped into shape by the Associated Press Stylebook. I never use an abbreviation for “road” when writing an address, while I have no problem with “St.” for “street.” Why the AP chose this path, I have no clue, but they did and now I am stuck with it. It was driven into me by a rap on the knuckles during my first week working on the copy desk. I am also stuck with “baby sitter” as two words, while “babysitting” is one. 

(Sometimes the stylebook is brutally ignorant. When I began as a copy editor, it told us to spell the little hot pepper as a “chili” and the dinner made with it and meat and/or beans as “chilli,” but we were in Arizona, where Spanish and Spanglish are common, and would have looked like idiots to our readers if we had followed that rule, so we were allowed to transgress and spell the word for both as “chile.” I believe that the Associated Press has finally caught up. I am retired now, and no longer have the most recent copy of the book.)

Of course, the AP Stylebook wasn’t designed to decide once and for all what is correct usage, but rather only to standardize usage in the newspaper, so different reporters didn’t spell “gray” in one story and “grey” in another. But the result of this standardization is the implication that what’s in that book is “right and true.” As a result, I almost always avoid saying “last year,” or “the last time so-and-so did this,” but rather contort the sentence so I can use “past” instead of “last,” the logic of which is that last year wasn’t the last one — at least not yet. Yes, I know that is stupid and that everyone says “last year” and no one is confused, but the AP has rewired my neurons through constant brainwashing. 

It also keeps me aware of the distinction between jail and prison. People are held in jail awaiting trial; after conviction, they serve their sentence in prison. (Yes, some convicts serve their time in jails, but that doesn’t change things. Jails tend to be run by counties; prisons by state or federal governments.)

And so, here is my list of additional words and phrases that get under my skin when used or misused. 

For me, the worst is the common use of “enormity” to describe anything large. I twitch each time it sails past me. An enormity is a moral evil of immense proportions. The Shoah was an enormity; the vastness of the ocean is not. 

Then, there is the confusion between “imply” and “infer.” To imply is to slip a clue into the flow; to infer is to pick up on the clue. 

One hears constantly “literally” used instead of “figuratively.” Ouch. It debases the strength of the literal. 

There are rhetorical figures that are misapplied over and over. Something isn’t ironic simply by being coincidental, nor is oxymoron the same as paradox — the latter is possible through reinterpretation, the former must be linguistically impossible. To be uninterested is not the same as being disinterested. It causes me minor physical pain each time I hear some bored SOB called “disinterested.” 

I have other peeves, lesser ones. “My oldest brother,” when there is only one other brother. “Between” three people rather than “among.” Using “that” instead of “who” when referring to a person: “He was the person that sent me the letter.” Pfui. 

There is a particular place on my personal proscription list for anyone who uses “which” instead of “that” in a sentence with a defining adjectival phrase, as in: “It was the dog on the left which bit me.” It’s OK in: “It was the dog on the left, which bit me, that I came to despise.” 

Some of us still make a distinction between “anxious” and “eager.” The virus makes me anxious. I am eager to get past the threat. There are other pairs that get confused. I try to ensure that I never use “insure” when I’m not talking about an insurance policy; the wrong use of “effect” can affect the meaning of a sentence; further, I never confuse “farther” with something other than physical distance. “Floundered” and “foundered” mean different things, please. 

From other people and from comments to the blog, I have heard complaints about “bringing something with me when I go” or “taking something home with me.” “Bring” comes home; “take” goes away. 

Another hates seeing “a lot” as one word, unless, of course, it has two “Ls” and means to portion something out. Yet another yells at the TV screen every time someone says “nucular” for “nuclear.” I share that complaint, although I remember, many decades ago, Walter Cronkite making a reasoned case for pronouncing “February” without the first “R.” “It is an acceptable pronunciation,” he said. “It is listed as a secondary pronunciation in Webster’s Dictionary.” I’m afraid “nucular” has become so widespread that it is in the process of becoming, like “Febuary,” an accepted alternate. But it hurts my ear. 

Trump gave “free rein” to his son-in-law, but perhaps it really is “free reign.” Confusion abounds. 

All this can reek of pedantry. I’m sorry; I don’t mean it to. There are many times you might very well subvert any of these grammatical conventions. I have heard complaints that sentences starting “I and Matilda took a vacation” are ugly and wrong (really, the grammatically worse “Me and Matilda” is idiomatically better, like “Me and Bobby McGee”), but I remember with literary fondness the opening of Herman Melville’s “I and My Chimney”:

I and my chimney, two gray-headed old smokers, reside in the country. … Though I always say, I and my chimney, as Cardinal Wolsey used to say, I and my King, yet this egotistic way of speaking, wherein I take precedence of my chimney, is hardly borne out by the facts; in everything, except the above phrase, my chimney taking precedence of me.

And there are presidential precedents. “Normalcy” wasn’t a word until Warren G. Harding used it to describe a vision of life after World War I (there are examples from earlier, but he popularized its use and was ridiculed for it — “normality” being the normal word). 

I would hate to have to do without George W. Bush’s word: “misunderestimate.” If that hasn’t made it into Webster’s, it should. I think it’s a perfectly good word. Language sometimes goes awry. We don’t always hear right and sometimes new words and phrases emerge. I knew someone who planned to cook dinner for a friend. “Is there anything I should know about your diet? Anything you don’t eat?” “I don’t eat sentient beans,” she said. He had never heard of that sort of bean. It was only much later that he smiled at his own misunderstanding. Since then, I have always kept a bin of dried sentient beans to make “chilli” with. At least, that’s how I label the tub. 

Language shifts like tides. Words come and words go; rules pop up and dissipate; ugly constructions are normalized and no longer noticed, even by grammarians. I have listed here some of the formulations that still rankle me, but I am old and wear the bottoms of my trousers rolled. I’m curious, though, what bothers you? Let me know in the comments. 

I began life as a copy editor, which means I had to know my commas and em-dashes. My spelling had to be impeccable and I memorized the Associated Press Stylebook, which taught me that the color was spelled g-r-a-y, not g-r-e-y. Except in “greyhound,” that is. 

It is a line of work I fell into quite naturally, because from the second grade on, I have had a talent for words. I diagrammed sentences on the blackboard with the visual complexity of the physics formulae Sheldon Cooper writes on his whiteboard. Tasked with writing sentences for each of my 10 weekly vocabulary words, I managed to write one sentence using all ten. I did OK with math, but it never much interested me; words were another thing. I ate them up like chocolate cake. 

The upshot is that I am a prime candidate for the position of “Grammar Cop,” bugging those around me for making mistakes in spelling, punctuation and usage. And, admittedly, in the past, I have been guilty. But as age softens me, I have largely given up correcting the mistaken world. And I have a different, more complex relationship with language, less strict and more forgiving. 

The causes of this growing laxness are multiple. Certainly age and exhaustion are part of it. But there is also the awareness that language is a living, growing, changing thing and that any attempt to capture it in amber is a futile endeavor. 

But although I have come to accept many changes in speech that I once cringed at — I can now take “their” in the singular (“Everyone should wash their hands”) and have long given up on “hopefully” — there are still a handful of tics that I cannot get over. I try, but when I hear them uttered by a news anchor or starlet on a talk show, I jump a little, as if a sharp electric shock were applied to my ear. 

The first is “I” used in the objective case. It gives me the shivers. “He gave the award to Joan and I.” It gets caught in my throat like a cat’s fur ball. 

The second is using “less” for “fewer.” I know that the usage has largely changed, but it still assaults my ear when I hear, “There will be less pumpkins this Halloween, due to the drought.” Ugh. 

A third is the qualified “unique,” as in, “His hairstyle is very unique.” It’s either unique or it isn’t. 

Then there are common mispronunciations. “Ek-setera” is just awful. Although, I did once know someone who always gave it its original Latin sounding: “Et Caetera” or “Et Kye-ter-a.” Yes, that was annoying, too. 

The last I’ll mention here is the locution, “centered around.” Gets my goat every time. Something may be “centered on” a focus point, or “situated around” something, but “centered around” is geometrically obtuse unless you’re discussing Nicholas of Cusa’s definition of the deity, whose center is everywhere and circumference is nowhere. 

Others have their own bugaboos. One friend cannot get past the confusion between “lay” and “lie.” She also jumps every time someone uses “begging the question,” which is misused 100 times for every once it is understood. 

Of course, she may be more strict than I am. “There have been errors so egregious that I’ve stopped reading a book,” she says. “I just stomp my foot and throw the book away.” 

Her excuse: “I was an English major.” 

There are many other issues that bother me, but not quite so instantly. I notice when the subjunctive is misused, or rather, not used when it should be. If I were still an editor, I would have fixed that every time. 

I am not sure I will ever get used to “like” used for “said.” And I’m like, whoever started that linguistic monstrosity? I also notice split infinitives as they sail past, but I recognize that the prohibition against them is a relic of Victorian grammarians. It is too easy to lazily give in to those ancient strictures. 

So far, I’ve only been talking about catches in speech, although they show up in print just as often (you can actually come across “ect.” for “etc.”). But reading a book, or a newspaper or a road sign and seeing the common errors there can be even more annoying. There is probably nothing worse, or more common, than the apostrophe plural. You don’t make something plural by adding an apostrophe and an “S.” “Nail’s” is not the plural of “nail.” This is encountered endlessly on shop signs. 

And digital communication is fraught with homophone confusion. “They’re,” “there,” and “their,” for instance, or “you’re,” and “your.” I admit that occasionally this is just a mental hiccup as you are typing. We all make mistakes. I have sometimes put a double “O” after a “T” when I mean a preposition. That’s just a typo. But there genuinely are people who don’t seem to notice the difference with “to,” “too,” and “two.” (I have great tolerance, however, for the ideogrammatic usage of “2” for “to” in electronic messaging. I find it kind of amusing to see the innovation in space-saving for Twitter and e-mail. I may even have been guilty myself of such things. Indeed, there is a long history of this sort in handwritten letters in the 15th to 18th centuries, when “William” was often “Wm,” and “through” was often “thro.” Paper was expensive and abbreviations saved space.)

Some frequent typographical absurdities make me twitch each time. I really hate seeing a single open-quote used instead of an apostrophe when a word is abbreviated from the front end. You almost never see “rock ’n’ roll” done correctly. 

There are lesser offenses, too, that I usually let pass. “Impact” as a verb, for instance. It bothers me, but the only people who use it tend to write such boring text that I couldn’t wade through it anyway. (I wrote about the management class mangling of the language in what I call “Manglish.”) “Different than” has become so normalized for “different from” that I’m afraid it has become standard English. Of course, the English themselves tend to say “different to.” So there. 

There are distinctions that have been mostly lost in usage. “Can I” now means the same as “May I” in most circumstances, and almost no one still makes a distinction between “shall” and “will.” 

Many of us have idiosyncratic complaints. I knew someone who complained that “laundermat” was not a word. We saw such a one on Vancouver Island when visiting. “It should be ‘laundromat,’” she said, arguing the parallel with “automat.” 

Another cringes at “would of,” “should of,” and “could of” in place of “would have,” “should have,” and “could have.” But this is merely a mishearing of the contractions “would’ve,” “should’ve,” and “could’ve” and turning them into print. Yes, it should be corrected, but it doesn’t get under my skin when I hear it. 

And there are regionalisms that bother some, although I glory in the variety of language. One person I know complains about such phrases as “had went,” but that is a long-standing Southernism and gets a pass, as far as I’m concerned. 

And much else is merely idiom. If you get too exercised about “I could care less,” please relax. It means the same thing as “I couldn’t care less.” Merely idiomatic. Lots of grammatical nonsense is now just idiomatic English. Like when the doorbell rings and you ask “Who’s there?” and the answer comes back, “It’s me.” If you hear “It is I,” you probably don’t want to open the door. No one talks like that. It could be a spy whose first language is not English. Better ask if they know who plays second base for the Brooklyn Dodgers (old movie reference). 

And the Associated Press hammered into me the habit of writing “past week” instead of “last week,” on the principle that the previous seven days had not, indeed, been terminal. You can take these things too far — but I am so far brainwashed not to have given in. “Past week,” it will always be. 

I may have become lax on certain spelling and grammar guidelines, but one should still try one’s best to be clear, make sense, include antecedents for one’s pronouns and be careful about certain common mistakes. “Discreet” and “discrete” are discrete words. And someone I know who used to transcribe her boss’s dictated letters once corrected him when he said a client should be “appraised” of the situation and typed instead, “apprised.” He brought the letter back and complained that she had misspelled “appraised.” Being a man and being in management, he could not be persuaded he was wrong. She had to retype the letter with the wrong word. There’s just nothing you can do with some people. 

Language is just usage at the moment. It shifts like the sands at the beach: what was “eke” to Chaucer is “also” to us. What was “conscience” to Shakespeare is “consciousness” to us. Thus does conscience make grammar cops of us all. We don’t learn our mother tongue, we acquire it, and what we hear as babes becomes normal usage. Ain’t it the truth?

The world is not black and white, but until fairly recently, photography was. For most of its history, the art was an art of silver on paper, spread from inky blacks through velvety grays into pristine whites. 

There had been attempts to add color, either by painting on top of the monochrome image, or by various experimental techniques to capture the color directly. But even after the commercially successful introduction of Kodachrome in 1935, photography as a museum-approved art continued to be primarily in black and white. 

(In cinema, Technicolor predated Kodachrome by about a decade, but that process was essentially three different black and white negatives overlapped through color filters to create the effect. It was an expensive and difficult process and relatively few films, percentage-wise, were made with the process until after the commercial success of Gone With The Wind and The Wizard of Oz in 1939.)

I have been a photographer for at least 50 years. I have had shows and my work has been published. But for most of that time, I worked in monochrome. I “saw” in black and white. My photographic heroes worked in B&W, the techniques I mastered were silver techniques. I became an excellent printer. But I seldom used color film. It seemed an unnecessary noise to bring to the purity of the medium. 

I was hardly alone in this. When I was younger, even museums shied away from color photography. It was seen as not “permanent.” Its images faded over time (I’m sure you all have old family snapshots turned rather magenta with age). The real artist-photographers used silver or platinum and made glorious images. 

Back then, art in general was seen with more precious eyes. We thought of “archival processing,” and even paintings were carefully preserved and curators looked down on some artists — such as Jackson Pollock or Mark Rothko — who used non-archival pigments or unprepared canvases and whose works, therefore, had begun to deteriorate. 

In current times, few artists or galleries worry much about such things. Art can be made on newsprint, or can even purposely self-destruct. Concern for the permanence of an artwork is seen as elitist. After all, no matter how careful you are, the art is going to be gone eventually, even if it lasts till the sun explodes. 

And besides, color is now no more or less permanent than black and white: Now they are both nothing but ones and zeros. Silver is dead; long live digital. 

Yet there is still a difference between color photography and black and white. It is a difference not simply of technique, but of thought. Thinking in color is different from thinking in black and white. 

The part of vision that deals in color is processed in a different area of the brain than the part that concerns itself with darks and lights. (Vision is ridiculously more complicated neurologically than you might think — the information on the retina is broken down into many separate components, processed by differing regions of the brain and then re-coordinated as a gestalt.)

And so, some people pay closer attention to the hue, others to the forms they see. 

The fact is, black-and-white photography and color photography are two different art forms. To be successful in both requires a kind of bilingualism. Most of us have brains that function best either in seeing forms and shades, or in seeing hues. The two photographies emphasize those different talents.

One has only to consider the work of Stephen Shore or William Eggleston. Most of their meaning comes through the color. Take one of Eggleston’s best-known images and suck the color out. What have you got?

He made this photo of a ceiling and light bulb. The red is overwhelming. But imagine it as a black and white image.

He also made a similar image of a brothel ceiling painted blue. Also overwhelming. The two are nearly the same image, but with very different emotional and sensuous meanings.

But if we make them both black and white, they very nearly merge into the same thing. 

Color can by itself separate forms. Here are four squares in four colors, as distinct as can be. But the same image, unaltered except for the draining of all color from it, leaves a confused mess, with barely a separation between grays. 
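To make that concrete, here is a minimal sketch in Python of how the collapse happens. It uses the common Rec. 601 luma weights for converting color to grayscale; the four RGB swatches are hypothetical picks chosen for illustration, not taken from any actual photograph.

```python
def to_gray(r, g, b):
    """Convert an RGB color to its grayscale (luma) value, 0-255."""
    return 0.299 * r + 0.587 * g + 0.114 * b

# Four visually distinct hues chosen so their luma values land close together.
swatches = {
    "red":    (200, 0, 0),
    "green":  (0, 102, 0),
    "blue":   (30, 35, 255),
    "purple": (150, 0, 150),
}

for name, (r, g, b) in swatches.items():
    print(f"{name:>6}: RGB ({r:3d}, {g:3d}, {b:3d}) -> gray {to_gray(r, g, b):5.1f}")
```

Four hues that read as sharply distinct in color come out within a few gray levels of one another once only luminance remains, which is exactly the muddle the grayscale version of the squares shows.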

Black and white photography requires the separation of parts not by hue, but by contrast: lights against darks. It’s what makes great silver prints sing. Where color photographs separate forms primarily by hue, black and white shapes them with contrast.

I am not saying a color photograph has to be garish. Far from it. But the color will carry a good deal of the meaning and emotional resonance of the image. Even in a color photo that has hardly any color in it.

Many years ago, I tried an experiment. Like so many others, I loved the waterlily paintings of Claude Monet. But I wondered if they would make as much sense in black and white. Is there a structure holding the pictures together, a design or composition, that doesn’t depend solely on the rich color? 

So, I began making photographs in black and white of water lilies. 

The most successful of them clearly relied on bright highlights and strong shadows. The shapes made the picture.

If I tried an overall design, like Monet’s, the picture lost its strength. 

I did the same experiment with one of Monet’s paintings, rephotographing it in black and white. 

Did it hold up? It is certainly a very different beast. 

Then I went back to one of my own color photographs of his waterlilies in Giverny, a photograph that imitated Monet’s paintings, with color, sky, reflection, shadow and lily. In color and side-by-side, in black and white. 

What I discovered shouldn’t be a surprise: Monet was much more effective in color. But I also noticed that because my photos were well-focused rather than impressionistically fuzzy, they translated better into black and white: Black and white is meant to clarify shapes. Color identifies “areas” rather than discrete textures. 

And so, while I have spent the majority of my photographic career making monochrome images, along with many others now working in digital media, I switch back and forth between color and B&W. They do, however, require different vocabularies. They are different languages. 

While I have always made visual art, I made my career in writing about art. 

As an art critic, I had the unusual need to be bilingual in an odd sort of way. As a journalist, I needed to be good with words, but in writing about art, especially visual art, I needed to know how to use my eyes.

I discovered very early on how these two talents were seldom granted to the same person. All around me were reporters who knew a gerund from a copulative, but who often seemed almost infantile when discussing pictures. They could name the subject of the image, but not go much further than that. 


A photo editor of my acquaintance once explained photojournalism this way: “I need to know it’s a house; don’t trick it up with ‘art.’” This was image as ID photo. 

But on the other side, so many artists I knew couldn’t explain themselves out of a paper bag. They effused in vague buzzwords, words that changed currency every year or so. I once taught a graduate course in writing about art for art students who needed to prepare so-called “artist statements” for their exhibits. Most of what they wrote before the course was utter blather, obscure and important-sounding without actually meaning anything. 

Words and images: Worlds seldom interpenetrable. I call the talent for riding both sides a form of bilingualism. 

I do not know if the ability to deal in multiple “languages” is something you are born with, or that you learn early on the way you acquire language before the ability to do so closes off in adolescence. But somehow, I managed to do it, at least well enough to write about it without embarrassing myself.

The mental juice necessary to process each seems walled off from the other, except in rare cases. One either runs a literary program, based on sentence and paragraph structure, linear words building a whole out of alphabetic parts; or one comprehends shapes, lines, color, size, texture, and frame as carrying the information required to convey meaning. 

This doesn’t mean that visual people are illiterate, nor that literary people can’t enjoy an art gallery, but that their primary modes of understanding vary. The squishiness of an artist’s gallery talk can drive a writer bonkers; the flatness of a word-person’s understanding of a painting can leave an artsy type scratching her head: “Can’t you see?” 

Nor does it mean that either side can’t learn, although it will remain a second language, without native understanding of idiom and customary usage. A word person can be trained to see shape and form, but it will always remain as I learned Spanish. No one will ever confuse me with a native speaker.

This split between word and image, though, is only one of the bisections. Musicians can think in tone the way painters can think in pigment. Yes, there is a language that can describe the music, but for non-musicians, that language is usually impressionistic and often visual — what the music “makes you think of,” or the “pictures in your mind.” 

For the musically trained, there is also language, but it is completely opaque to the civilian: Dominant-seventh, voice-leading, timbre, reed trimming, tenor clef, Dorian mode, ritornello, da capo, circle of fifths. But even these are merely words to describe the non-verbal reality of the music itself, which can convey meaning through sound alone. The words are not the music. 

The ability to think in the terms of each mode is essential to create well in that form, and a mighty help in understanding it for the audience. If you are not in love with words, the rich cream of Gibbons or the organ tones of Milton can leave you cold. If you have no eyes for color, the nuance of Turner or the pears of Cezanne can zip past without notice. If you think of pop tunes as music, the shifting tonal centers of Schubert are inaudible, the orchestration of Mahler merely noise. 

We each have a frequency our sensibilities are tuned to, and can receive it loud and clear; we may think we understand the rest, but too often we are only fooling ourselves. Do you really inhale the contrapuntal movement of a Balanchine chorus? Do you notice the rhythm of editing in a Spielberg film? Each is a language that its practitioners and connoisseurs understand profoundly, but that zips past the mass of those sitting in the cheap seats. 

It’s a different language


Is murder a real thing? That is, does it exist in the world, separate from the language that describes it?

This is an important question, because it illustrates one of the central issues hindering our politics. Has always hindered politics. 

Certainly, there are humans who have caused the death of other humans, but at what point do we draw the line and call it murder? It is a line that shifts over time and culture. When we kill someone during war, we generally do not call it murder, even if it is a civilian who is dead. It might be “collateral damage.” When we execute a convicted killer, we do not call it murder — or at least most people don’t. And if we accidentally run over a pedestrian who steps in front of our car, we don’t call it murder, either. 

The results are the same in all cases: Someone stops being alive. 

But we make legal distinctions between murder and manslaughter. There are shades and subsets of homicide. First- or second-degree murder, felony homicide, unlawful death, voluntary and involuntary manslaughter, justifiable homicide, parricide, suicide, infanticide, fratricide, assassination, euthanasia, regicide, honor killing, revenge killing, human sacrifice, self immolation, suicide by cop, extrajudicial killing, genocide. 

The words used and the lines drawn are different, not only in different countries and cultures, but in different states in the U.S. Some states recognize third-degree murder. A few have legalized voluntary assisted suicide. There is no uniform, worldwide, universal definition of what constitutes “murder.” 

So, again, is murder a real thing, outside of language? Or is it just a word? 

So, when we argue that abortion is murder, we are not really talking about anything real, but about language: We are arguing about the dictionary. 

I do not mean here to minimize the moral concerns over abortion, which are quite troubling, and I have no intention of changing anyone’s mind on the issue. People on both sides are intractably dug in. My concern is rather to point out the way we tend to use language as if it were a one-to-one depiction of reality. When we call abortion “murder,” we are using a conditional and contextual term as if it were categorical. 

When we name something, what is the relationship between that tag and the thing itself? Not only is it arbitrary, it is constantly shifting.

Let’s take Jonah and the whale. The King James Bible says the prophet was swallowed by a “great fish.” Does that mean it wasn’t a whale? Well, before the early 19th century, a whale was a fish. It was so categorized in books and dictionaries. In his popular History of the Earth and Animated Nature, from 1774 and reprinted well into the 19th century, Oliver Goldsmith divided the fish into “spinous fishes,” “cartilaginous fishes,” “testaceous and crustaceous fishes” and “cetaceous fishes.” A mackerel, a sand dollar and Moby Dick were all kinds of fish. After Linnaeus rearranged the orders of living things, did any of the actual animals change? Of course not. The change was linguistic, not biological. 

The logic of language and the chaos of experience are sometimes parallel, but never coexistent. Language has, for instance, nouns and verbs. Things and actions. But in experience, all things are always in action and all actions occur in things. They are a single entity; splitting them is part of the logic of language. Language consequently splits into discrete bits what cannot in life be divided. 

Sentences are written in a certain word order. Subject and predicate; modifiers and conjunctives; relative and independent clauses; semicolons and hyphens. None of these things find matches in the real world. Their logic is the logic of language. Life is other. 

And we too often (in fact, almost always) come to believe that our words match our lives. They don’t. 

I say this with some perturbation, having made my living with words. I love words. I love language. But the older I get, the more obvious it becomes that language is the “other.” It is a simulacrum of reality, but far removed. 

Take Zeno’s paradox. Here is a prime example. For millennia, logicians have argued over it. Give a tortoise a head start in a race and Achilles can never catch it. Logic proves it. Before catching the tortoise, Achilles has to go halfway to catching it. But before he goes halfway, he has to go a quarter of the way. You keep fractioning it out, and it becomes obvious that there will always be a fraction that Achilles has not yet covered. 

But, try it empirically, and it takes Achilles only a single stride to pass the tortoise. The structure of the proposition has a self-referential reality that does not mirror the reality of experience. Two completely different things. 
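(For what it’s worth, the arithmetic sides with the footrace rather than with the verbal fractioning: the halved remainders form a geometric series, 1/2 + 1/4 + 1/8 + 1/16 + …, which adds up to exactly 1. Infinitely many fractions, but a finite distance, which is why the single stride suffices.)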

This has been my beef with Plato. His idealism is only possible in language. His bed is a definition of bed. His good is a definition of good. He is writing a dictionary. If the Greeks had a fault, it is their hubris over their language. They never understood the difference between word and fact; they believed that if they had a word for it, everything was covered. 

The voluntary or unwitting confusion of language and reality has been used by political factions for as long as there have been records of language. It is how Mesopotamian kings explained their reigns, how Spartans and Athenians justified killing each other, how secessionists recruited soldiers in 1861, how the Cold War was sustained. And it is how Donald Trump herds his believers. 

I hope you noticed how I just used language to characterize what, in fact, is a heterogeneous accumulation of voters who probably each had his or her own reason for picking the Great Orange Pumpkin. Some of those reasons were poltroonish, some ignorant, some hopeful, some rebellious, some genuinely patriotic. Probably as many reasons as there were reasoners. But with language, I can imply they were both bovine and religiously zealous. Thus language can be dismissive. 

Trump uses language this way constantly, setting up dichotomies that don’t exist in reality, creating categories that only function linguistically, using insults to stick labels into opponents with a pin. “Crooked Hillary,” “Lyin’ Ted Cruz,” “Little Marco.” This is language shrinking reality. Reality is vast, multifarious, undefinable; language is a door slammed in the face of possibility.  

The very model of the world that Trump lives by — us vs. them and the sense of everything being a zero-sum game — is linguistic in origin, not reality-driven. It does not match experience. 

I clearly have my own political preferences, but I am not here trying to change votes, but to persuade that our understanding of the world is constricted by our faith in language. Language is not the only means of engaging the world. There is sound, sight, spatial reasoning, mathematics. Each with its own structure and meaning.

Consider how you decide whether to pass a car on the road. You do neither arithmetical calculation, nor verbal argument, but rather, you have a spatial sense of objects moving in time and space and you can judge quite accurately if there is time and space to get around the geezer driving 25 in a 45 zone. This is not verbal, but it is thought nonetheless. 

The life we experience is continuous and contiguous; it is not parceled into tiny bits, each distinct and definable. It is one huge swirl and swathe. Language cannot ever encompass it. Beware.

It is the literary equivalent of “Da-da-da-Dum” from Beethoven’s Fifth Symphony. “2B or not 2B.” Everyone knows it, whether they have seen Hamlet or not. It would be hard to find another phrase as often quoted or as immediately recognized by a wide public. “Call me Ishmael.” “It was the best of times; it was the worst of times.” “In the beginning was the word.” Even these lag behind the opening of Hamlet’s soliloquy as cultural roughage. 

Because it is so deeply buried in the culture, it is hard to even hear it anymore. It glides by not as information, but as a kind of tune, hummed thoughtlessly while sanding a table top or cutting carrots in the kitchen. 

But that soliloquy, like the play it sits in the middle of, can be performed many different ways, with very different meanings. There are Hamlets that are Oedipal, Hamlets that are schizophrenic, Hamlets that are hot-blooded, those that are indecisive, those that are crafty — and at least one Hamlet played as a stand-up comedian. Take the words the playwright wrote and you can construe them myriad ways. In Ulysses, James Joyce has his character Stephen Dedalus prove that Hamlet is his own father. Sort of. 

Likewise, the “to be or not to be” speech can be spoken theatrically, like Master Thespian — this is too often the case — or emotionally, or enunciated with clinical precision. It can be spoken to the audience, breaking the fourth wall, or whispered under the breath. It can be done as a voice-over, as if we are hearing Hamlet’s thoughts. 

Benedict Cumberbatch; Mel Gibson; Tom Hiddleston

(The one thing that seldom changes is Hamlet holding up poor Yorick’s skull in Act 5. Everyone has to do it, and what is more, be photographed doing it. Even publicity photos for provincial productions have to feature the Dane and his moldy jester.)

Hamlet is perhaps Shakespeare’s greatest play. It certainly has his wittiest hero: Hamlet, the Dane, is in fact too smart for his own good. In part, that’s what the play is about. 

In it, Claudius has killed his brother, the king — Hamlet’s father — and usurped the throne and queen. 

When the dead king’s ghost tells Hamlet to revenge him, Hamlet enters a storm of uncertainty: How, when, why and if to kill Claudius. In the process, Hamlet alienates most of the people he knows, even killing several. 

When Claudius contrives to murder Hamlet before the young prince can kill him, the whole Danish court is thrown into violence and death. 

You can just keep turning this play around and the light will keep catching a new facet. The more you look at it, the more you see. An actor has to decide: At any moment, what is driving the character? 

Hamlet is the single most complex, multilayered and confusing character in any play. Is he insane? Is he pretending to be insane? Is he sane at some moments and mad at others? Is he obsessed with his mother? Is his inability to act caused by fearfulness, thoughtfulness, indecision or a desire to kill Claudius only when murder will do the most harm to Claudius’ eternal soul? 

None of these versions is ruled out by the text, but none is sufficient of itself. 

“As an actor,” one Hamlet said, “I’m going to try to illuminate as many facets as I can. But you can’t do it all, or you’ll lose focus. I feel sometimes I’m trying to cover myself with too little blanket: If I cover my head and shoulders, my feet stick out.” 

Critics have argued for 400 years about Hamlet’s inaction. But the reason the character refuses to go away is that he is at least as complex as we are in the audience: Hamlet is real. 

Hamlet has a line, when he’s talking to Rosencrantz and Guildenstern, “You would pluck out the heart of my mystery,” and that is what most scholars and critics try to do.

Not only actors, but whole ages have their takes. In the 19th century, Hamlet was often played as effeminate, or at least as one easily in touch with his feminine side. 

Edwin Booth, brother of Lincoln’s assassin and considered the greatest American actor of the 19th century, wrote in 1882, “I have always endeavored to make prominent the femininity of Hamlet’s character and therein lies the secret of my success — I think. I doubt if ever a robust and masculine treatment of the character will be accepted so generally as the more womanly and refined interpretation. I know that frequently I fall into effeminacy, but we can’t always hit the proper keynote.”

Edwin Booth; Sarah Bernhardt; Asta Nielsen; John Barrymore

In fact, there were many notable actresses who took on the role then, most famously, Sarah Bernhardt, who said, “I cannot see Hamlet as a man. The things he says, his impulses, his actions, entirely indicate to me that he was a woman.”

The practice actually goes back further. In 1775, Hamlet was played by the young Sarah Siddons to great acclaim (she continued to play the role until she was 47). Two decades later, the role went to Elizabeth Powell in London’s Drury Lane theater. 

These women achieved great praise. The stuffy Dr. Samuel Johnson saw Kitty Clive in the play and compared her performance with that of the famous actor David Garrick. “Mrs. Clive was the best player I ever saw,” he noted. “What Clive did best, she did better than Garrick.” 

Ruth Mitchell; Frances de la Tour; Lisa Wolpe

In 1822, Julia Glover played Hamlet in London and fellow actor Walter Donaldson said, “Her noble figure, handsome and expressive face, rich and powerful voice, all contributed to rivet the attention of the elite assembled on this occasion; while continued bursts of applause greeted her finished elocution.” The greatest actor of his age, Edmund Kean, came backstage to congratulate her: “Excellent. Excellent,” he said. 

In 1820, the first American female Hamlet was Sarah Bartley, in New York. At mid-century, Charlotte Cushman took on the role in New York and Boston, wearing the costume Edwin Booth had lent her. 

The sentiment was not unanimous, however. The New York Mirror disapproved of Nellie Holbrook’s Hamlet in 1880. “This absolutely masculine character is not capable of proper presentation by a woman, however great or talented,” the reviewer wrote. “We are, however, free to say that Miss Holbrook’s Hamlet is eminently respectable.”

That is better than the patronizing review of critic William Winter in 1911. “It is difficult to understand why Hamlet should be considered feminine, seeing that he is supereminently distinguished by a characteristic rarely, if ever, discerned in women: namely that of considering consequences, of thinking too precisely on the event.” 

Christopher Eccleston

In the 20th century, Hamlet took a decidedly macho turn (say it like the British: “Match-oh”). He becomes a swashbuckler or a sadist, by turns. Olivier, Mel Gibson, Christopher Eccleston, who makes him look like a soccer hoodlum. 

Yet, there have been actresses who took the role. Maxine Peake’s Hamlet is available on DVD. Frances de la Tour, Ruth Mitchell and Lisa Wolpe played the Dane. In 1982, Joseph Papp produced a Hamlet with Diane Venora. 

“There are men who have played Hamlet very effeminate and there are those who played it macho; the male spectrum goes from the very tough to the effete and very delicate,” Papp said. “Most English Hamlets from the 19th century on were quite delicate, while American Hamlets were much tougher — like Barrymore. Diane is a strong Hamlet, but not a macho Hamlet; vulnerable, but not hysterical.

“For years I have wanted to do a female Hamlet,” Papp said. “I have always felt that there is a strong female side to Hamlet — not feminine so much as female. To me that has to do with an easier capacity to express emotion. The person playing Hamlet should be able to weep unabashedly and unashamedly. There are men who can do that, but they should be young; Hamlet is a very young person, an adolescent, a student.”

In 1937, it was Eva Le Gallienne, who said, “I think psychologically one feels Hamlet was a youth … He’s still going to Wittenberg, to college, you know. He can’t be a mature man. The whole thing points to a very young youth, and therefore because a boy of that age might not be technically equipped to play the role, this is why many women in their thirties who can look like a youth, and had the technical skills to play this great role, have played it.”

Top row: Campbell Scott; Alan Mahon; Danforth Comins; Jonathan Douglas; Bottom row: Nathan Darrow; Rory Kinnear; Tobias Fonsmark; Holder Bulow; Michael Benz

But, of course, Hamlet can be played all of these ways. The part is supremely plastic — you can stretch it this way and that and it still makes theatrical sense. 

But this divagation has gone on too long. Back to the soliloquy. To be or not. To be? That is the question. Nothing can stale its infinite variety. Let’s take a few different versions. Olivier, in his 1948 film, does it mostly as a voice-over, set on a precipice overlooking roiling surf. It is Hamlet on the edge of a breakdown. (Link here).

Gielgud was an enunciator. The clarity of his delivery overtakes the overt emotionalism that Olivier brought. (Link here).

Kevin Kline gives it the Master Thespian touch, emphasizing every word as if it were the most important. It becomes monotonous. But, soft, he doth drop a tear. (Link here). 

In the entirely opposite direction, Benedict Cumberbatch speaks the lines as if they were spoken off the cuff. This is the way real people speak. I especially love the way he makes sense of the line: “to sleep. No more.” He makes it into “death is to sleep, no more than that.” His is my current favorite version. (Link here). 

One last version. John Barrymore was the great Hamlet of the early part of the 20th century. The bulk of his career was before sound film, so it was only in his decline that he filmed the speech — or part of it — in a silly comedy starring Kay Kyser as a hick bandleader attempting to learn to be an actor. He hires Barrymore, playing a parody version of himself, to be his mentor. At one point, the comedy stops and Barrymore gives his bit of the soliloquy and you can see the majesty of his talent peek through the alcoholic puffiness. The take is almost ruined by his uncontrollable eyebrows, looking like two marmots fighting over a cheese. But the words, the words, the words. (Link here). 

Paapa Essiedu, Simon Russell Beale, Paul Giamatti, Grantham Coleman

As for the words, they can be difficult for modern listeners. What the hell is a fardel? Would you bear fardels with a bare bodkin? Sometimes you wonder what Shakespeare meant, although the problem isn’t as apparent when the words are spoken on stage, as when you read them in text. An actor can make the meaning clear in context. When Hamlet says, “with a bare bodkin,” he draws his dagger and the audience understands. 

But language has changed in the past 400 years and even words that are still in current usage often had different meanings then. A careful reading of Shakespeare’s work demands an attention to lexicographical detail, if we are to avoid confusion. 

And even when we know what the words mean, we are still faced with the fact that the Bard often uses the words metaphorically, as when he has Hamlet talk of “taking his quietus,” which doesn’t literally mean to kill himself, but rather means, having finished an enterprise, or having paid off a longstanding debt. Such is life, he implies.

The most famous soliloquy in Hamlet is a profound meditation on death and suicide — the question Albert Camus said is the only philosophical question that really matters. But what do the words mean?

To be, or not to be: That is the question:/ Whether ’tis nobler in the mind to suffer/ The slings and arrows of outrageous fortune,/ Or to take arms against a sea of troubles,/ And by opposing end them? To die: to sleep;/ No more; and by a sleep to say we end/ The heartache and the thousand natural shocks/ That flesh is heir to, ’tis a consummation/ Devoutly to be wish’d. To die, to sleep:/ To sleep: perchance to dream: ay, there’s the rub;/ For in that sleep of death what dreams may come/ When we have shuffled off this mortal coil,/ Must give us pause: There’s the respect/ That makes calamity of so long life;/ For who would bear the whips and scorns of time,/ The Oppressor’s wrong, the proud man’s contumely,/ The pangs of despised love, the law’s delay,/ The insolence of office and the spurns/ That patient merit of the unworthy takes,/ When he himself might his quietus make/ With a bare bodkin? Who would fardels bear,/ To grunt and sweat under a weary life,/ But that the dread of something after death,/ The undiscover’d country from whose bourn/ No traveler returns, puzzles the will/ And makes us rather bear those ills we have/ Than fly to others that we know not of?/ Thus conscience does make cowards of us all;/ And thus the native hue of resolution/ Is sicklied o’er with the pale cast of thought,/ And enterprises of great pitch and moment/ With this regard their currents turn awry,/ And lose the name of action.

Alec Guinness, Peter O’Toole, Derek Jacobi, Jonathan Pryce

A quick glossary: 

Rub – Actually, an obstacle on a lawn bowling green.

Shuffled – Cast off, like a snakeskin.

Coil – Turmoil.

Respect – Consideration or regard.

Of so long life – So long-lived.

Time – The world as we know it.

Contumely – Contemptuous insults.

Despised – Rejected.

Office – Office-holders; bureaucrats.

Spurns – Insults.

Quietus – The paying off of a debt; the resolution of an enterprise.

Bare – Here, probably “mere.”

Bodkin – A sharp object, sometimes a hatpin, but here a dagger.

Fardels – Burdens, as a bindle or an army’s dunnage.

Bourn – Region; boundary.

Conscience – Used in an older sense of consciousness; thought.

Native hue – Natural color.

Cast – Shade of color.

Pitch – The height of a soaring falcon’s flight before it falls on its prey.

Moment – Importance.

Regard – Consideration.

It is poetry, in iambic pentameter, with rhythm and melody. But we can translate the whole into modern American tapwater. If we take the poetry out of this soliloquy, what we are left with is the bare-bones meaning:

The only question that counts is suicide: Should one put up with the suffering of life or do something about it and end it all? Death is like sleep: and if, as in sleep, the troubles go away, that would be wonderful. But when we sleep, we also dream. And if we dream after death, the way we do in sleep, well, that’d make you stop and think, wouldn’t it? That’s why this disaster we call life goes on: for who would put up with life’s crap if he could end it all through suicide? Who would bear the burdens of life but that the threat of something much worse after death makes us hesitate, and makes us put up with the troubles we have rather than fly to others we don’t know anything about? And so, thinking makes us cowards; and the will to action is weakened by thinking, and what mighty deeds we would perform come to exactly zip.

And that is why Shakespeare is Shakespeare. 

The Arnold, Buster Keaton, David Bowie, “Weird Al” Yankovic

Photo at top: Top row, L-R — Laurence Olivier, John Gielgud, Richard Burton, Nicol Williamson; bottom row — Kenneth Branagh, David Tennant, Ethan Hawke


 

When I was in second or third grade, we had weekly lists of vocabulary words to learn, lists of ten or a dozen new words. And we were assigned to write sentences using these words. And me, being a smartass even back then, I worked hard each week to write a single sentence using all ten words. Even now I’m not sure if I did it to be clever or because I was lazy and didn’t want to write ten sentences.

But when I look back on it, I realize it was a dead giveaway that I would later earn my crust by becoming a writer. I loved words, and I loved using words.

Other kidlings might groan when the teacher picked up the chalk to diagram sentences, but I loved those underlines and slants, those networks of adjectives and conjunctions. It was fun, like doing a crossword puzzle or connecting the dots.

When I was young enough, before the cutoff date for it, I didn’t learn words so much as acquire them. But even when it later took the effort, I still did my best to expand my word trove.

And as I grew into adolescence and I read constantly — everything from Lew Wallace to the backs of cereal boxes — I continued to absorb words. I would sometimes pore over a dictionary, picking out new and intriguing words. They were not merely signifiers of semantic meaning, but entities in and of themselves. Others might go “ooh” and “aww” over a puddle of newborn kittens; I did the same over bits of verbal amber and gleam.

It did not seem at all odd when the ailing pulp writer Philip Marlow in The Singing Detective asked his nurse, “What’s the loveliest word in the English language? In the sound it makes in the mouth? In the shape it makes on the page?” His answer was “elbow.” That would not have been mine, but I’m not sure I could have chosen. Words have a taste in the mouth, and however much one might like foie gras, one cannot do without ripe peaches or buttered asparagus. I loved all words, fair and foul. And I loved the mouth-feel of them, like a perfect custard.

British polymath Stephen Fry often tells the story (perhaps too often) of how when he was a wee bairn, he saw on the small black-and-white TV in his home the 1952 film version of The Importance of Being Earnest. He was struck by a line spoken by Algernon: “I hope, Cecily, I shall not offend you if I state quite frankly and openly that you seem to me to be in every way the visible personification of absolute perfection.”

“How unbelievably beautiful,” Fry says. “The swing, balance and rhythm. I’d known you could use language to say, ‘May I please be excused to go to the washroom,’ or ‘I want some more,’ but the idea that it could be used to dance, to delight, to enthrall — it was new to me.”

And Fry became what he called “a celebrant and worshipper at the altar of language.”

For me, it wasn’t Wilde, but James Joyce, first reading A Portrait of the Artist as a Young Man when I was in high school and being swept along in a tidal current of language. “Once upon a time and a very good time it was there was a moocow coming down along the road and this moocow that was coming down along the road met a nicens little boy named baby tuckoo …”

We had been taught in grade school to speed read, with a dreadful little machine that mechanically drew a rod down the page, forcing the eye to move line by line in a forced march through the text; we would then be tested on our comprehension. Day by day, the guide rod was moved more and more speedily down the page, making us read faster and faster, until we could skim and recall very well, thank you.

But that wasn’t the kind of reading that gave me physical, bodily pleasure. And when I came across books like Joyce’s, I slowed down. I could not read them without hearing the words in my head. Without feeling them on my tongue and teeth.

A sentence such as our introduction to our hero in Ulysses cannot be read merely for sense. It has to be understood for its music, almost ecstatic, like Handel’s Zadok the Priest or Beethoven’s Great Fugue: “Mr. Leopold Bloom ate with relish the inner organs of beasts and fowls.” Your tongue creates phonic choreography in your mouth as you form those words.

I remember when I was perhaps 24 or 25, reading Lawrence Durrell’s Alexandria Quartet and stumbling on so many odd and eccentric words that I kept a notepad next to my desk to write down such words as I underlined in my copies of the books (yes, I write in my books. If you don’t write in the margins or underline passages, you haven’t really read the book). “Pegamoid,” “ululation,” “usufruct,” “exiguous,” “chthonic,” “etiolation,” “boustrophedon,” “tenebrous,” “crepitating,” “cachinnation,” “comminatory,” and, apropos our current resident of the White House, “troglodyte.” (Another great word to remember in this regard is the title of a satiric philippic by Seneca the Younger — the “Apocolocyntosis,” or “Pumpkinification” — in the original, of the emperor Claudius; in our case, of the Great Orange Boor.)

You probably have to be young to read Durrell, when you still hold idealistic and romantic expectations, and to put up with the prose pourpre, but my word-hoard grew. It became something of a joke when I wrote for my newspaper, where I’m sure the copy editors were laughing at me for using six-dollar words like chocolate sprinkles on a donut. I used them because I loved them, and because they were precise: when you develop a ripe vocabulary, you learn there are no synonyms in the English language. Each word carries with it a nimbus of connotation, a flavoring or a shade that makes it the right or wrong word for the context. No matter how close their dictionary definitions, words are not simply interchangeable.

Anyway, I had my little joke back on the copy editors. For a period of about six months back in the 1990s, every story I wrote had in it a word I plain made up. My game was to see if I could sneak them past the copy desk. Some were onomatopoeic, some were Latinate or Hellenic portmanteaus, some were little more than dripping streams of morphemes. And, to my utter delight, every one of them made it through the editors. A few were questioned, but when I explained them, they were permitted. Looking back, I regret this persistent joke, because it was aimed at that little-praised but admirable set of forgotten heroes, who have many times saved my butt when I wrote something stupid. Let me express my gratitude for them; everyone needs a copy editor.

Occasionally, when I have an empty moment, and I don’t have access to a crossword puzzle, I will sit and write lists of words as they come to my brain. Each word has its own cosmos of meaning, an electron-cloud of ambiguity and precision, its emotional scent, its sound and its fury. As I write them down, I savor each one, like an hors d’oeuvre. Such lists, in their way, are my billets doux to my native tongue, which has fed me both spiritually and financially over many decades.

I have a book I love greatly. In august buckram, of a deep navy blue, with gold embossed letters on the spine, it is the Oxford Dictionary of Nursery Rhymes, compiled in 1951 by Iona and Peter Opie. It is more than an anthology; it is a deeply researched tome of scholarship, as one would expect from the Universitatis Oxoniensis.

Each rhyme is compiled with variorum versions and usually several pages of history, interpretation and arcana. Humpty Dumpty covers four pages, with footnotes. We learn that versions exist in Sweden (“Thille Lille”); in Switzerland (“Annebadadeli”); Germany (“Rüntzelkien-Püntzelken”); France (“Boule Boule”) and elsewhere. That Humpty-Dumpty is the name of a boiled ale-and-brandy drink; that there is a little girls’ game by the same name; that the name was also given to a siege engine in the English Civil War.

And we learn that there is a commonly-held belief that the rhyme (I can’t really call it a poem) is really about the fall of “My kingdom for a horse” Richard III. Not, apparently, true.

If there is a common theme in the book, it is that although so many people believe there is a “secret” meaning to so many of these nonsensical nursery rhymes, and seek out who in history is really being referenced, such belief is almost always unfounded. The rhymes are either attested much earlier than the historical figure, or we know by internal evidence that the connection could not be.

How many people believe “Ring around the rosey” is about the Black Death or the Great Plague of 1665? This folk etymology doesn’t appear until after World War II, but now seems universally accepted, despite all evidence to the contrary. The symptoms in the verse are simply not the symptoms of the disease.

Or take “Sing a song of sixpence, A pocket full of rye; Four and twenty blackbirds, Baked in a pie.” The Opies relate several “interpretations” of the rhyme: “Theories upon which too much ink has been expended are (1) that the twenty-four blackbirds are the hours of the day; the king, the sun; the queen, the moon; (2) that the blackbirds are the choirs of the about-to-be dissolved monasteries making a dainty pie for Henry; the queen, Katherine; the maid, Anne Boleyn; (3) that the king, again, is Henry VIII; the rye, tribute in kind; the birds, twenty-four manorial title deeds presented under a crust; (4) that the maid is a sinner; the blackbird, the demon snapping off the maid’s nose to reach her soul; (5) that the printing of the English Bible is celebrated, blackbirds being the letters of the alphabet which were ‘baked in a pie’ when set up by the printers in pica form. … If any particular explanation is required of the rhyme, the straightforward one that it is a description of a familiar entertainment is the most probable.”

Occam’s razor, once again.

I grew up in suburban New Jersey, largely destitute of what Bruno Bettelheim called the “enchantment of childhood.” I never read any fairy tales until college. And the child rhymes I had about me were not usually the ancyent classiques, but rather, the newer comic ones.

Fuzzy Wuzzy was a bear

Fuzzy Wuzzy had no hair

Fuzzy Wuzzy wasn’t fuzzy

Wuz he?

or:

Oo-ee Goo-ee was a worm

A mighty worm was he

He sat upon the railroad track

The train he did not see

Oo-ee goo-ee!

Then there were the spelling rhymes:

Chicken in the car

The car won’t go

That’s how you spell

Chicago.

or

A knife and a fork

A bottle and a cork

That’s the way to spell

New York.

There were those set to familiar tunes, like the “Great green gobs of gooey grimy gopher guts,” or:

Be kind to your webfooted friends

For a duck may be somebody’s mother.

Be kind to your friends in the swamp,

where the weather is very, very damp.

Now you may think that this is the end —

Well, it is!

That abrupt ending was a theme, as in “Oo-ee Goo-ee” and in

There was an old crow 

Sat upon a clod; 

That’s the end of my song. 

—That’s odd.

When I was a kid, I thought that kind of deconstruction of the scansion was hilarious.

Later, I learned such eternal classics as:

O I had a little chicken and she wouldn’t lay an egg

So I ran hot water up and down her leg

O the little chickie cried and the little chickie begged

And the little chickie laid me a hard boiled egg.

Which we rounded off with the modern rewrite of “Shave and a haircut, Five cents:”

Match in the gas tank:

Boom-boom.

Also hilarious:

On top of spaghetti,

All covered with cheese,

I lost my poor meatball

When somebody sneezed.

It rolled off the table

And onto the floor,

And then my poor meatball

Rolled right out the door.

“Rolled right out the door,” had me rolling on the floor.

Almost as much as:

I see London, I see France;

I see someone’s underpants.

Underwear being, of course, in grade school second in delirious comedy only to farts.

Such rhymes may refer to real personages, of course, as:

Lizzie Borden took an ax

And gave her mother forty whacks

And when she saw what she had done,

She gave her father forty-one.

(Although court records tell us Lizzie’s stepmother received 18 blows and her father, 11. Still, we don’t go to children’s doggerel for historical research.)

The fact is, this stuff is just nonsense verse, and we loved it, not only because we were immature little brats who found bodily functions risible, but because rhyme and meter delight the mind and ear. The children’s rhymes we recited when we were bairns were one of the ways we acquired language. (It has often been pointed out that we don’t “learn” our native tongue, but rather “acquire” it, picking it up by example, and examples that are memorable are easier to remember, QED.)

I don’t mean to imply these versicles were understood to be, or designed to be pedagogical, but that their effect was to make language magical and something we didn’t simply use, but delighted in.

Of course, sometimes the stupid rhymes were meant to teach, like “In Fourteen-hundred and Ninety-two, Columbus sailed the ocean blue.” Or, in even more egregious form, causing lifelong damage to those required to memorize them in music-appreciation classes, those mnemonics that taught classical music:

This is the symphony

That Schubert wrote

And never finished.

Or:

In the hall of the Mountain King

Mountain King

Mountain king

In the hall of the Mountain King

Was written by Edvard Grieg.

Can’t unhear what you’ve heard. Such things led to parodies, also, sung to the opening of Mozart’s Symphony No. 40:

It’s a bird, it’s a plane, it’s a Mozart

Shoot him down, shoot him down, shoot him down…

So, as we grew up, we still loved the silliness that we first encountered with our nursery rhymes and nonsense verse. It is why Walt Kelly’s Christmas carols are sung even by people who don’t know where they come from:

Deck us all with Boston Charlie

Walla-Walla, Wash., and Kalamazoo

Nora’s freezing on the trolley

Swaller dollar cauliflower alley-garoo!

It is why we love Shel Silverstein’s ditties:

The Slithergadee has crawled out of the sea.

He may catch all the others, but he

won’t catch me.

No you won’t catch me, old slithergadee,

you may catch all the others, but you wo— 

My brother says he doesn’t even remember writing this one, but I wrote it down many, many years ago:

Watch your scotch

Or it’ll get brittle.

And I was once asked to be a Cyrano for a college roommate I detested and to write a poem that he could pretend he wrote for a girl he fancied. Her name?

If you have a yen,

Don’t ask if, ask Gwen.

I don’t remember how that romance turned out, but, you know, “Match in the gas tank; Boom-boom.”

1948-1949-1953

Who are you?

I don’t mean your name or your job or your nationality or ethnicity. But who and what are you? I should like you to think about that for a moment.

Many people believe in a heaven after death where they will meet their loved ones again. But what will they look like? For that matter, in heaven, what will you look like? If you have an internal sense of who you are, what does that person look like? This is not a random question, but a way of considering one of the fundamental issues of existence and of our way of understanding that existence. If you had an entry in the dictionary, what would the picture look like next to your name? Is there even a single image that captures the totality of your existence? When Alfred Stieglitz proposed to create a portrait of Georgia O’Keeffe, he took a bookload of photographs, since no single one could ever be enough.

The problem is in thinking of existence as a noun.

For most of us, the cosmos is made up of things; indeed it is the sum total of things. This is a misunderstanding of the reality we live in. It is also a misunderstanding caused by our reliance on language as a way of dealing with that reality. Language leads us astray.

1956-1959-1962

When most people consider what the world is made of, they expect to encounter nouns — that is, things. When they consider themselves, they think either of how they look in the mirror now or, more often, of an idealized version of themselves at the peak of their existence, perhaps when they were 25 or 30 years old. It’s how we will appear in heaven. We have this peculiar idea that nouns are a static identity, that a horse is a horse, a flower is a flower and a bed is a bed. Webster’s dictionary is a catalog of reality.

I bring up beds because of Plato and his damnable idealism. He posited that all earthly beds are but a misbegotten imitation of the “ideal” bed, which does not exist in this world, but in some idealized non-material realm. There is an ideal bed, he says, and an ideal chair, ideal tortoise, ideal apple pie, ideal human, compared to which the earthly item is a knock-off. These ideals are perfect and unchanging, whereas the world we know is sublunary and corrupt.

1966-1969-1977

I’ve written before about this blindness in the ancient Greeks, that they conflated language with reality, that they truly believed that the words they knew were a perfect and complete representation of the reality they lived in, and further, that the logic of language replicated identically the order of the universe. Language and reality had a one-to-one relationship. It was a naive belief, of course, but one that led them to believe that nouns were a real thing, not merely a linguistic marker. We still suffer from vestiges of this superstition.

(There was at least one Greek who demurred. Heraclitus recognized that all existence was movement. “Panta rhei,” he said. “Everything flows.” It is why, he said, you cannot ever put your foot into the same river twice. Heraclitus is my hero.)

1984-1996-2015

For in the real world, there are no nouns, there are only verbs. It is all process. A noun is just a snapshot of a verb, freezing it in a particular time and place. But the one ineluctable thing is the verb — the process, the motion, the growth, the dissolution and re-formation. A flower is not a thing, but a motion. It begins as a seed, sprouts beneath the soil and breaks its surface, grows upward, pushing out leaves, swelling into a bud at its apex, popping open the bud to a blossom, which, fertilized by a moving bee, dries and drops, leaving a fruit encapsulating a new seed, which falls into the soil once more. The whole is a process, not a thing.

It is the same for you or me. When we were conceived, we were a zygote turning into a fetus, then an infant, a toddler, a boy or girl, an adolescent, a young adult, a grown-up (when we set the seed once more for the next birth), and then on through middle age and senescence, old age and death. We are not any of the snapshots we have in our albums, but the motion forward in time, always pushing up and outward.

“I am inclined to speak of things changed into other things,” writes Ovid at the beginning of his Metamorphoses. Indeed, Plato’s bed began as a seed, a tree turned into lumber, the lumber into a bedframe. Eventually the bed will rot away into the soil once more. Just because the movement is slow doesn’t mean it isn’t happening and isn’t constant. Fie on Plato. (Plato that proto-fascist — I despise the man).

 
This brings us to the recent election. I never intended to write about it, but I cannot avoid it. Plato is a fascist not merely because of the deplorable blueprint for totalitarianism in his Republic, but because that very belief in a noun-world leads to a belief that there is a stasis, a final solution, a political order that will finally and forever settle all the problems we face. Current American conservatives have this sense that if we would only do things their way, we would finally solve the problem of crime, achieve a stable economy and a balanced budget, and create a smooth-running order. Oddly, they share this teleological view with Marxists. They do not see politics as the constant give-and-take of contending interests, but rather as a kind of machine that could remain static and ever-functioning. They see a noun, not a verb, but politics is a verb. Panta rhei.

Just as every flower leads to a seed, so every solution leads to a new problem. There is no ultimate order, no final stasis. It is perpetual churn. Contending interests constantly change, upsetting the received order, and anyone who believes that if we only did this, or did that, everything would be hunky-peachy — well, good luck with that. But there is no end to labor; we keep working, moving, changing until we are no longer aware of the changes that will take over when we die.

I see this clearly looking at the series of pictures of myself from when I was an infant to now, when I am an old man. In between come the student, the husband, the ex-, the career, the exhaustion, the grayed hairs, the grandfather. Which is me? Instead, what I see are frames from a continuous movie and the only reality that counts is the movement, the constant flux from one being into another, no boundaries, no scene changes, no new chapter headings, but one continuous wipe, from beginning to an end now approaching close enough almost to touch.

Further, I can look backward to my parents and their parents, and forward to my daughter and her children and can easily imagine their offspring and those following — all one continuous sweep. My wife had her DNA tested, and that allowed her to see her ancestral past sweep from North Carolina back through Ireland, the Mediterranean, the Levant and into Africa, mutation by slow mutation. If there were tests sophisticated enough, I’m sure we could peer through microscopes at that same DNA and trace it back to lemurs, crocodiles, placoderm fish, hydrae, algae, and various spirochetes.

And then the planet back through the accretive dust, into the exploding novae, back through the plasmic hydrogen to the Big Bang. From then on, it is always moving forward in a cosmic rush, skating through space-time — the long verb.

A noun is just a snapshot of a verb.

The Ratification of the Peace of Münster

Over several years in the 1870s, composer Bedrich Smetana wrote a series of six tone poems for orchestra that he titled Ma Vlast, or “My Country.” Although the patriotism explicit in Smetana’s music is genuine, the fact is Smetana was a citizen of the Habsburg Empire and grew up speaking German. His most popular piece of music is “Vltava,” a glorification of the river that runs through what is now the Czech Republic, but which is almost universally known by its German name, “Die Moldau.”

It is one of the stranger and unremarked oddnesses of history that most of those Nationalist composers of the 19th century had no nation to call home. Dvorak had no Czechoslovakia, Liszt had no Hungary, Edvard Grieg’s Norway was ruled by a Swedish king, and despite all the mazurkas and polonaises that Chopin wrote, there was no Poland on the face of the earth. Even the Germany extolled in Wagner’s “Die Meistersinger” was only a gleam in the eye of Otto von Bismarck.

In truth, they were not so much “nationalist” composers as composers of ethnic awareness. Which brings up an important point. What we mean by a “nation” is a fairly recent concoction, and although we tend nowadays to assume that a map divided into colors bounded by border lines is a natural and inevitable reality, history tells us otherwise.

We hear politicians and demagogues harangue us about national sovereignty and the threat of immigrants diluting our national character, and we tend to regard our country — regardless of whether it is the United States, Germany, China or Iraq — as a fixed and permanent “thing” consecrated by history and natural law. But a closer look tells us otherwise. Our idea of a nation-state is rather new in history, and may have been merely a temporary thing. To take it as unchanging and unchangeable is a serious miscalculation.

Going back before reliable history, kingdoms were just areas successfully defended by military leaders who demanded taxes in a kind of protection racket. No one fretted much over what languages the subjugated people spoke, or what their ethnic descent might be.

Through the Middle Ages, when we speak of Henry V at Agincourt, what we are talking about is real estate. Henry ruled land, not people. He owned much of the British Isles and a good chunk of the Continent. The people living on his land owed him taxes and fealty — meaning a term in the army when needed. There was no legal construct known as England or France or Germany, only feudal cross-relations and family ties securing deeds of title to chunks of real estate. The idea of a nation as we know it didn’t exist.

1492

It wasn’t until 1648 and the Peace of Westphalia that the concept of the nation-state emerged, and we developed a sense that France exists whether or not a Bourbon sat on the throne, and that national borders were somehow permanentized — although, of course, they weren’t. Wars — now between nations instead of between kings — kept those boundary lines in flux.

Later ideas gave us different concepts of nationhood, often in conflict with the Westphalian ideal. Ethnicity gave many people a different sense of identity, even though ethnicity itself is a slippery thing, and can swell and shrink through time, including and excluding various groups and subgroups. Are you European? Are you Polish? Are you a Slav or a German?

Ethnicity sometimes falters in face of language identity. We talk of “Russian speakers” in Ukraine. Are they Ukrainian or Russian? Certainly they are Slavs. Where do we draw the line?

The historical result of all these shifting ambiguities can be seen in the unstable borders seen on maps. Let’s take Poland as an example. If we think of the country as it exists currently, stuck between Germany and Ukraine, we might assume this was somehow the true and ultimately proper place for Poland. But the country has rolled around the map of Europe like a bead of mercury on a plate. At times it reached the Black Sea, at times it vanished from the face of the earth. You can see this in a clever YouTube video at: https://www.youtube.com/watch?v=66y49BnxLfQ

Poland pre-war outlined in blue; postwar outlined in red.


At times Poland expanded; at times it joined with the kingdom of Lithuania. After it was split into pieces and annexed by Prussia, Russia and Austria in 1795, it ceased to exist as a nation, until it was reconstituted in 1918 at the end of the First World War. It was invaded in 1939 by both Germany and the Soviet Union and essentially disappeared again. After World War II, because Stalin refused to give back his half, the entire country lifted up its skirts and moved some 200 miles to the west, where it set itself down again and became the Poland we have now — although that is no guarantee that it won’t move again sometime in the future. The eastern half of prewar Poland became part of the Soviet Union until it split off and became Ukraine, while the eastern third of Germany, having lost the war, turned into the western half of Poland, and millions of German-speaking inhabitants were politely asked to relocate to East Germany — which later reunited with West Germany to become the Germany we have today.

1918

You might consider Yugoslavia, which is now several different sovereign nations, or the “sovereignty” of Czechoslovakia, which finally gave Smetana and Dvorak their own nation, only to dissolve into the Czech Republic and Slovakia.

These constantly unsteady borders should not be seen as anomalies, but rather the norm. You can find another entertaining video displaying the bubbling ferment of national borders from roughly AD 1100 to now at: https://www.youtube.com/watch?v=Iha3OS8ShYs

(It should be noted that the dates in the animation are not terribly accurate, and should be taken as a general indication of the era demonstrated by the time-lapse maps rather than a precise year-by-year definition.)

1982

We have talked primarily about Europe, but the same sense of unstable borders and the comings and goings of nations can be seen worldwide. Another video worth watching: https://www.youtube.com/watch?v=-6Wu0Q7x5D0

So when some knucklehead politician tells you that the U.S. should defend its “natural” borders, consider the phantom nature of nationhood and its outlines. The United States itself began as a group of 13 separate nation-states joining together for the common good and soon spread outward and westward, eating up other nations, evicting other peoples and other national authorities, stealing most of northern Mexico and reconstituting that nation’s “natural” borders.

1992

All across the world, there are people corralled inside those lines screaming to get out: Basques and Catalans in Spain, Kurds in Iraq and Turkey, Chechens in the Russian Federation, Russians in Ukraine, Scots from Great Britain, Quebecois from Canada, Tamils in Sri Lanka, the Flemish and Walloons in Belgium, Uighurs in China. Driving around southern France and the Camargue, you will come across angry graffiti demanding Occitan separatism.

Nationhood is a dynamic; it is not permanent. Russia is altering the map around the Black Sea and globalization is destabilizing the Westphalian arrangement. Corporations are now transnational, the European Union is subverting ancient sovereignties (with considerable pushback from rising nationalisms) and the post-World War I national borders in the Middle East seem ever more tenuous and artificial. Can the Kurds create their own ethnic state? Can Shia and Sunni ever coexist in a multi-sectarian state?

Instead of assuming that the world cannot change and the Rand-McNally maps we grew up with are the way things should be from now into posterity, we should recognize nations as transient entities momentarily agreed to by whoever is powerful enough to maintain a stalemate.