
I want to put in a good word for TV sitcoms. They don’t get much respect. And it is true that many of them are routine, uninspired and forgettable. “Chewing gum for the eyes.” But the genre as a whole has both a long history (longer than you may suspect), and a significant role to play in the arts. Yes, the arts.

What we call art is a lot of things, and serves many purposes, but one thing all art, whether painting, music, theater or literature, is asked to do is entertain. There are different levels of entertainment, but even Joyce’s Ulysses or Berg’s Lulu offer an underlying level of amusement. 

Comedy players, Mosaic from Pompeii

Some offer much more, but the baseline of keeping us interested has been there from the earliest times we have record of. And much of it even fills university courses. We study Plautus and Terence — among the earliest sitcom writers (Rome, third and second centuries BC) — whose plays are full of dirty old men, unfaithful wives, clever slaves, mistaken identities and love-struck young men.

There are few actual characters in such plays, and a great panoply of stock figures. These kinds of figures, stuck in difficult and comic situations, populate the works of Italian commedia dell’arte, the comedies of Molière, and the plays of Shakespeare (who would sometimes borrow from Plautus and Terence). Victorian novels — by Dickens, Trollope, Thackeray — now treated as literature, were at the time serialized in popular magazines and consumed much the way we now consume TV shows. And all are now deemed worthy of academic study and even reverence.

So, why not the same for All in the Family or The Honeymooners? Are they any “lower” an art form than The Twin Menaechmi? Or The Braggart Soldier?

Remember, Shakespeare’s audience included the uneducated groundlings; he wrote also for them. And he was not above the traditional fart joke. It ain’t all Seneca and Henry James.

I am roughly the same age as television, and have watched the sitcom from its earliest TV days. I was one year old when The Goldbergs switched from radio to television (“Yoo-hoo, Mrs. Bloom…”). There was The Aldrich Family from 1949 to 1953 (“Henry! Henry Aldrich!” “Coming, Mother.”), and the first season of The Life of Riley, with Jackie Gleason originally taking over the title role from William Bendix, who had played the part on radio (“What a revoltin’ development this is”). Bendix took back the role for the rest of the series’ run. I don’t know how old I might have been when I first started watching these series. Probably in my playpen, watching the images wiggle on the 12-inch screen of a DuMont television.

The 1950s brought the onslaught, and the sitcom became a staple of the boob tube. These series I remember quite well: Beulah; The Bob Cummings Show; The George Burns and Gracie Allen Show (the first Postmodern show, where George could watch what Gracie was planning on his own TV screen and comment to the audience); December Bride; I Married Joan; Private Secretary; Mister Peepers.

I haven’t mentioned the three most important shows of the time. The Honeymooners emerged as a recurring skit on Cavalcade of Stars, the Jackie Gleason variety show on the DuMont network, sometimes taking up most of the run time. But in 1955, the skit was spun off into a half-hour sitcom for 39 episodes, still run in syndication on various cable channels. (“To the moon, Alice!”)

I Love Lucy ran from 1951 to 1957 and pioneered the three-camera filmed sitcom with live audience and laugh track. For its entire run, it ranked No. 1, No. 2, or No. 3 in the ratings. (I have to confess, contrary to the majority opinion, I never found Lucy very funny. Watching reruns, I still don’t). Those reruns can still be found in syndication on cable. 

Alvin Childress, Tim Moore, Spencer Williams

But you won’t find Amos ’n’ Andy. It was enormously popular from 1951 to 1953, but reaction to its racial stereotypes changed markedly during the rise of the Civil Rights movement. It would be hard to complain about the series’ cancellation; in the context of the times, it was deserved. The show can be hard to watch nowadays. But I have seen all 78 episodes on bootleg DVDs and must admit we have lost some brilliant comic performances, especially by ex-vaudevillian Tim Moore as the Kingfish. Yes, there are some awful stereotypes, but not everyone was shufflin’ and grifting. Amos was an upright citizen and family man, and the series showed quite a few Black doctors and judges, all horrified at the shenanigans of the series’ stars.

And it should be pointed out that most sitcoms, Black, white or otherwise, focus on less-than-admirable characters. Let’s face it, bland Ward Cleaver does not support a TV series. You need Archie Bunker, Ralph Kramden, or Larry David — something out of the norm, and exaggerated. Getting past the particulars of Amos ’n’ Andy, basically the same stereotypes come back later as George Jefferson, or J.J. in Good Times (“Dyn-O-Mite”), or Redd Foxx in Sanford and Son. Same caricatures, different generation.

I’m not suggesting we forgive Amos ’n’ Andy, but rather that we see it in context and recognize the talent that went into it.

The fact that even Millennials know who Lucy Ricardo was, or Ralph Kramden or Rob and Laura Petrie, means that some of the hundreds of sitcoms that have aired, from the last century and this, have a cultural staying power, very like the classics we read at university. 

The foundational stereotypes — or archetypes — have persisted, too. How many sitcoms feature bumbling husbands, from Chester A. Riley and Ozzie Nelson to Curb Your Enthusiasm and The King of Queens? And how many feature the ditzy wife, from Gracie Allen to Married … With Children to The Middle? Mothers-in-law are a perennial butt of jokes, as are clueless bosses and gay best friends. Each provides a predictable set of familiar and comfortable jokes. (Although the limits of comfort can change, and have, over time: blonde and Polish jokes haven’t worn as well.)

And most of these are just modern changes rung on the characters of the commedia dell’arte. Harlequin, Colombina, Pantalone, Pulcinella, Zanni and the lot. We aren’t looking for fully rounded characters so much as familiar types to build plots and gags around — the “situations” in situation comedies. 

So, the sitcom has a long history, and I have a long history with it. I have divided sitcoms into four roughly defined groups. The borders of these groups may be squishy — you may parse them differently — but the categories are defensible.

First, there are those that have had an effect on culture broadly. They tend to be the best written and acted, and they have wormed their way into the general consciousness. Class A includes I Love Lucy (my qualms notwithstanding), The Honeymooners, All In the Family, The Mary Tyler Moore Show, M*A*S*H, Murphy Brown, The Office (American version), Seinfeld, Roseanne, Curb Your Enthusiasm, and The Cosby Show (which now is hard to watch — both hard to find and hard to endure, knowing what we now know). And I would include both The Big Bang Theory and Young Sheldon. Class acts all the way.

But I would also include: Taxi, Barney Miller, The Dick Van Dyke Show, Cheers, The Bob Newhart Show (the original one), and a few that I never warmed to but that still have a cultural significance, like Friends and Married … With Children. All are or have been in the national conversation.

I should also include a few British series that have had an impact, mainly Fawlty Towers, the British Office, and Absolutely Fabulous.  

Class B includes all the quality shows that came and went, with funny characters and solid jokes, but never burrowed into the Zeitgeist in quite the same way. All solid entries. You can add quite a few to this list, and it will depend on your taste and funny bone. I would include: 3rd Rock From the Sun; Black-ish; Brooklyn Nine-Nine; Frasier; The Golden Girls; The Good Place (should be in Class A, but not enough people watched); Happy Days; Malcolm in the Middle; The Middle; Mike & Molly; Modern Family; The New Adventures of Old Christine; Parks and Recreation; Scrubs; Two and a Half Men; Veep; WKRP in Cincinnati; and your choice of others. Among my favorites are Mom, Night Court, and Reno 911. Individual taste may vary. (I have not included many of the old shows from the ’50s and ’60s that few people have had a chance to see: My Little Margie; Private Secretary; Topper.)

The next rung down, Class C, holds the workaday shows — sometimes OK time-wasters, but full of clichéd characters and tired jokes, the kind that have the familiar form of jokes but seldom the wit or laughs. Writing on autopilot. This is the vast majority of TV sitcom bulk. The roughage and fiber of the viewing diet.

When we watch these, it is often more out of habit than desire. The forms are familiar and the laugh track tells us when a joke has passed by. Did anyone ever think The Munsters was prime comedy? Or Gilligan’s Island? McHale’s Navy? Saved by the Bell? Mediocrity incarnate. Hogan’s Heroes? I could name a hundred, propelled by laugh tracks and the need of writers to fill air time. Networks toss them on the screen, hoping they’ll stick. Some do, but only because they are gluey.

Wikipedia lists hundreds of sitcom titles and I would guess some 75 percent of them fall into Class C. At least half of those are gone in a single season, un-renewed, or cancelled after a few goes. The rest stick around because they are not overly offensive. They may feature actors we like, even if they have to spout insipid dialog. 

Bewitched; The Brady Bunch; Chico and the Man; Community; Ellen; F Troop; The Facts of Life; The Flying Nun; I Dream of Jeannie; Last Man Standing; The Monkees; Perfect Strangers; That ’70s Show; Who’s the Boss? Go ahead: Make a case for any of them. Tube fodder. 

Three’s Company is the epitome of Class C, although my son, deeply knowledgeable in the ways of film and media, assures me it is a classic. He loves it. De gustibus.

Then, there is the bottom-feeding Class D, those shows so bad they have become legend. My Mother the Car is the type specimen for this class — a series only a studio executive high on cocaine and bourbon, distracted by an expensive divorce and maybe a teenage son in jail, could have green-lighted. Quite a few of these were meant to be vehicles for aging film stars given their own sitcom series: The Doris Day Show, The Debbie Reynolds Show, The Tammy Grimes Show, Mickey (with Mickey Rooney), The Paul Lynde Show (in which he is an attorney and family man), Wendy and Me (with George Burns and Connie Stevens), Shirley’s World (Shirley MacLaine as a photojournalist), and The Bing Crosby Show. Most of these didn’t make it past the first season.

Also at the dismal bottom: Hello Larry, New Monkees, She’s the Sheriff, The Trouble with Larry (“not just not funny, but actively depressing”), Cavemen, Homeboys in Outer Space, The Ropers. Most cancelled after one season. 

In England, Heil Honey I’m Home!, with Adolf and Eva, never made it past the first episode. (Currently unavailable on streaming or disc. Too soon?)

Among the abject failures are most of the American remakes of popular British comedies. Many of them never made it past the pilot stage.

And so, you have four general classes of television sitcoms. The best worthy of saving for future generations, the worst best left for whatever is the digital version of the bottom of the canary cage. 

The past wasn’t so different. What we remember of classical Roman comedy is what is extant. Much isn’t. A good deal of it was probably just as banal as most bad TV. We don’t know: It didn’t survive. The Victorian novel was largely a serial enterprise, like seasons of a sitcom, published in weekly chapters. But for each Dickens or Trollope, there were dozens, maybe hundreds, of lesser works now mostly forgotten. In time, we will no doubt continue winnowing the TV past, saving the Norman Lears and perhaps the Chuck Lorres and ranking them as our Plautus and Terence. Perhaps.

The low arts can still be art.

We’ve all heard of Alexander the Great, William the Conqueror, or Suleiman the Magnificent. Who wouldn’t want to be known to history by such flattering names? But among the many kings, princes, barons and otherwise leaders, there are a few names a bit less splendiferous.

Alexander the Great; William the Conqueror; Suleiman the Magnificent

Every literate English speaker has probably heard of Ethelred the Unready, but what about Ivar the Boneless? We think of the third Roman emperor as Caligula, although his actual name was Gaius Caesar Augustus Germanicus. “Caligula” was a less-than-laudatory nickname which actually means “little boots”: when he was a boy, he tried wearing a real soldier’s footwear and the troopers made fun of him by calling him Little Boots, or “Bootsie.”

But the list of unflattering cognomens, sobriquets and nicknames is really quite long. Not everyone was the Sun King. There was also Sebastian the Asleep of Portugal, who reigned from 1557 to 1578. And Barefoot Magnus III of Norway (1073-1103). Or Johann Georg the Beer Jug, Elector of Saxony from 1611 to 1656. As you might have surmised, he loved to bend his elbow.

Alfonso the Leper; Ivaylo the Cabbage; Piero the Gouty

None of these men seem to have had press agents spinning their bosses’ reputations. Richard I may have been Lion-Hearted, but the Third was Richard Crookback. Ivaylo the Cabbage ruled Bulgaria in the 13th century. The grandson of Barefoot Magnus ruled Norway from 1130 to 1135 as Magnus the Blind. Vasili Kosoi ruled Moscow in the 15th century as Vasili the Cross-Eyed. And, of course, there was Charles the Bald, or Charles II of France, who apparently had a full head of hair.

Physical debility seems to have been popular. William the One-Eyed of Meissen; Peter the Stutterer of Portugal; Sverker the Clubfoot of Sweden; Piero the Gouty of Florence; Alfonso the Leper of Portugal. Eric XI of Sweden (1216-1250) was called Eric the Lisp and Lame.

The most famous was probably Timur the Lame, who conquered much of the world and is best known as Tamerlane. And don’t forget Uros the Weak of Serbia. Or Wilfred the Hairy of Catalonia.

Wilfred the Hairy; Ethelred the Unready; Halfdan the Bad Entertainer

The 1100s featured Bolesłav the Curly in Poland, Conan the Fat in Brittany, and Ragnvald Roundhead in Sweden. 

John the Posthumous was born in France in 1316, after his father’s death, and so was born already a king. He lived only a few days, and so was king from birth to death.

Constantine V, who became Byzantine emperor in 741, is said to have defecated in the baptismal font as an infant at his baptism, and so became known ever after as Constantine the Dung-Named.

Sancho the Populator was king of Portugal in the 13th century (that nation seems to get a lot of these names). He had 20 children, both legitimate and otherwise. But he is beaten out by John the Babymaker of the Holy Roman Empire in the 16th century, who had 63 illegitimate kids. Something for Elon Musk to aim for, I guess.

On the other hand, there was Henry the Impotent of Castile in the 15th century, and nasty old King John of England was called John Soft-Sword.

Louis Do-Nothing; Louis the Unavoidable; James the Shit

England has had its share of names, some glorious, like Elizabeth I as Gloriana. But otherwise, there was George IV, known as the Prince of Whales because of his obesity, and George III, known derisively as Farmer George for his less-than-regal interest in agriculture. And James II was known in Ireland as Séamus an Chaca, or “James the Shit,” for his treatment of that country.

I love encountering these historical names. Constantius the Pale was Roman emperor. Stupid Willy was Wilhelm I of Germany. Louis Do-Nothing was Louis V of France. Germany had Wenceslaus the Drunkard. And Portugal (again) had Manuel I, the Grocer-King. 

Coloman the Bookish ruled Hungary in the 12th century, and Ivan I of Russia was Ivan Moneybags. And Ludwig II of Bavaria was Mad King Ludwig. And while Vlad Țepeș is remembered to history as “Vlad the Impaler” and the model for Dracula, Basarab IV gained the throne of Wallachia with Vlad’s help and was then known as Basarab Țepeluș, or “the Little Impaler.”

Then, there’s Alfonso the Slobberer, king of Galicia from 1188 to 1230, who foamed at the mouth when angered. And Eystein Halfdansson, an 8th-century Norwegian king known as Eystein the Fart. Eystein’s son was known as Halfdan the Bad Entertainer — he couldn’t throw a decent party. Another Norwegian king, from the 13th century, was Haakon the Crazy. Harald I of Norway, who ruled from about 872 to 930, had several names. He was Harald Fairhair, but that may have been meant ironically, since he was also Harald Tanglehair, Harald Shockhead and Harald the Lousy.

Harald Tanglehair, aka Harald the Lousy

Finally, Eric II of Denmark was Eric the Memorable but doesn’t seem to have done anything of note in his short four-year reign, at least not that anyone can remember.

These are just a few epithets and sobriquets. Wikipedia lists more than 200 historical figures once named “the Great,” from Abbas the Great of Iran (1587-1629) to Zayn al-Abadin the Great, Sultan of Kashmir (1418-1470). Alexander the Great wasn’t called that until the Roman playwright Plautus named him so in a play, Mostellaria, in the third century BCE. He was also known in Persia as Iskander the Accursed.

A list of monarchs by nickname in Wikipedia contains a thousand entries, some quite familiar, like Ivan the Terrible, some more obscure, such as Piero the Gouty of Florence, Italy. 

Constantine Dung-Named; Childeric the Idiot; Ferdinand the Bomb

Some have more than one alternate identity. Napoleon Bonaparte had at least 21, including L’Aiglon (The Eagle), Le Petit Caporal (The Little Corporal), The Corsican, The Gunner of Toulon, Little Boney, and more. Most of them authored by his enemies, who seemed hesitant to pronounce his actual name. And so: The Nightmare of Europe, the Corsican Ogre, The Devil’s Favorite, The Fiend of Europe (or just, The Fiend). The British seemed to hate and fear Nappy the most and never seemed to run out of insulting names for the man. 

Superstition about saying certain names out loud has given us many of these. Avoiding the name of the Devil has given us a bunch of folktale cognomens: Old Scratch, Old Nick, the Evil One, Split-Foot, Father of Lies, Green-Horned Monster, Jimmy Square-Foot, Old Adam, Tail-N-Horns, the Wicked One, Rule of Demons.

Which brings us to Donald John Trump. No one recently has accrued so many alternate cognomens, epithets or sobriquets. One single website lists 409 of them. Nixon might have been Tricky Dick and Clinton was Slick Willy, but no one before has had them delivered by the truckload. I stopped looking after about 800 of them. (I’m not going to list them all — I haven’t the heart). Everyone has their favorites. 

They fall into several vast categories: His lying; his heft; his thin skin; his greed and self-dealing; his tweeting; his name-calling; his business failures; his sexual predation; his fascism; his failing mental powers; his word salad — the list goes on. And let’s not forget his orange hue or his hair, or his too-long ties, his golf cheating, or his bragging or the garish bad taste of a tinpot dictator. He makes it easy. 

In his first term I began calling him Moose-a-Loony. It’s almost a party game to make up new ones to call him. It’s fun; try it. Orange Foolius. (Although younger readers might not know where that one comes from.) Large-mouth Ass. Keep it going. Your turn.

Among my favorites: Trumplethinskin; Trumpster Fire; Mango Mussolini; Cheat-O; Tangerine Palpatine; Mar-a-Lardo; Captain Bonespur; Hair Furor; Prima Donald; Assaulter-in-Chief; Boss Tweet; Deadbeat Donald; the Lyin’ King; The Man of Steal; Forrest Trump; Donny Dementia. Use these and no one needs to be told who you are referring to. The Creature from the Orange Lagoon.

Everyone who wants to is free to invent more of them. Late-night talk show hosts seem to come up with new ones each night for a good laugh.

Stephen Colbert called him the Orange Manatee; John Oliver said he was Rome Burning in Man Form; Seth Meyers dubbed him Creep Throat; Jon Stewart said he was a Decomposing Jack-O-Lantern. Samantha Bee called him a Screaming Carrot Demon, and also America’s Burst Appendix. Trevor Noah said he was a Pile of Old Garbage Covered in Vodka Sauce (although I would have thought “ketchup” more apt). 

The characterization that seems to have gotten under the skin of the Tangerine-Tinted Trashcan Fire (S. Bee) more than any other was delivered almost 40 years ago, in 1988, when Spy magazine editor Graydon Carter — later editor of Vanity Fair — called him a “short-fingered vulgarian.”

Carter said that he made the comment “just to drive him a little bit crazy.” And according to Carter, it still does.

“To this day, I receive the occasional envelope from Trump. There is always a photo of him — generally a tear sheet from a magazine. On all of them he has circled his hand in gold Sharpie in a valiant effort to highlight the length of his fingers,” Carter said. “I almost feel sorry for the poor fellow because, to me, the fingers still look abnormally stubby.”

But whether his fingers are abnormally short or not, there is no question he is a vulgarian. Every time DJT opens his pie-hole he demonstrates how little class he possesses. “Quiet, Piggy!” 

And so, we keep renaming the Yam-colored Yammerer, as if we don’t want to have to say his name. 

A linguist named Jenny Lederer said, “people feel like not repeating his name is [a way of] not speaking to the brand and the value system that goes along with his political ideology.”

Even a mention of his name is a problem, with a kind of folk-magic power, causing many of us to avoid it. Tweets will spell the name “Tr*mp,” like it’s a four-letter profanity. That doesn’t really hide the name, but it does make the tweet unsearchable by the keyword “Trump,” and so limits its spread. And the asterisk implies that his name is vulgar, like the dirty words censored in old books.

One is left to wonder what posterity (if there is posterity) will finally settle on as the epithet for “He-Who-Must-Not-Be-Named.” 

We have Old Hickory, Honest Abe, Silent Cal, the Gipper, Bubba and Dubya — but how will we ever choose from the superfluity of imprecations swirling around Trump as he circles the drain?

I was an English major and I’m married to an English major and it’s hard being an English major and only getting harder. An English major feels genuine pain hearing the language abused, mal-used, corrupted and perverted. 

I’m not talking here about outdated grammar rules and a fussy sort of prisspot pedantry. As a matter of fact, one of the most persistent pains a true English major suffers is such misplaced censure, especially when someone interrupts a casual speaker to let them know that “them” is plural and “speaker” is singular. In my book, this is merely rude. A caboose preposition is nothing to lose sleep over.

No, what I’m writing about here are things that are genuinely ugly or purposely unclear, or overly trendy to the point of losing all meaning. You know — politics. That, and corporate language or management speech — which I call “Manglish” — are great corrupters of speech.

“We have assessed the unprecedented market shifts and have decided to pivot this company’s new normal to a deep-dive into a robust holistic approach to our core competency, circling back to a recalibrated synergy amid human capital in solidarity with unprecedented times.”

Committees try to make language sound profound and wind up writing piffle. I’m sure it was a committee that agreed in Iredell County, N.C., to make the county slogan, “Crossroads for the future.” I do not think that word means what you think it means. 

I wrote for a daily newspaper and language was my bread and butter, but the higher you go up the corporate ladder, the worse your words become. I remember the day they posted a new “mission statement” on every other column in the office, filled with buzz-word verbiage that didn’t actually mean anything. I looked at my colleague and said, “If I wrote like that, I’d be out of a job.”

But it isn’t just Manglish. Ordinary people are becoming quite lax about words and meanings. Too often, if it sounds vaguely right, it must be so. And you get “For all intensive purposes,” “It’s a doggy-dog world,” being on “tender hooks,” or “no need to get your dandruff up.”

“I’ve seen ‘viscous attack’ too many times recently,” my wife says. “It gives me an interior pain like a gall bladder attack.” 

And the online world is full of shortcuts, some of which are quite clever, but most of which are just barbarous. An essay about digital usage is a whole nother thing. No room here to dive in.

But, there is a world of alternative usage that is not standard English, that any real English major will welcome as adding richness to the mother tongue. Regionalisms, for instance. Appalachian dialect: “I’m fixin’ to go to the store;”— actually, that is “stoe,” rhymes with “toe” —  “I belong to have a duck;” “I have drank my share of Co-Cola.”

And Southern English has served to solve the historical problem of having lost the distinction between the singular “thee” and the plural “you,” with the plural “you all,” or “y’all.” Although, more and more, “y’all” is now being used for the singular as well, and so the plural is being replaced with “all y’all.” Keep up, folks.

Then, there’s African-American English, which has enriched the American tongue immensely, as has Yiddish: “Shtick,” “chutzpah,” “klutz,” “schmooze,” “tchotchke.”

Regionalisms and borrowings are like idioms. Sometimes they don’t really make sense, but they fall comfortably on the tongue. “Who’s there?” “It’s me.” Grammatically, it would be proper to say “It is I,” but no one not a pedant would ever say such a thing. The ear is a better arbiter than a rulebook. 

George Orwell ended his list of rules for writing with the most important: “Break any of these rules sooner than say anything outright barbarous.”

English is a happily promiscuous tongue and so much of the richness of the language comes from borrowings. But there are still other problems: bad usage, misuse of homonyms, loss of distinctions. The English major’s ears sting with each onslaught. 

What causes our ears to burn falls into three broad categories. Things that are just wrong; things that are changing; and things that offend taste. Each is likely to set off an alarm ping in our sensitive English major brains. 

Every time I go to the grocery store I am hit with “10 items or less.” Few people even notice, but the English major notices; we don’t like it. “Comprised of” is a barbarism. “Comprises” includes all of a certain class, not some items in a list of choices. I feel slapped on the cheek every time I come across it mal-used. 

And homonyms: a king’s “rein,” or a book “sited” by an article, or a school “principle” being fired. It happens all the time. It is endemic online. “Their,” “they’re,” “there” — do you know the difference? “You’re,” “your?” Most young’uns IM-ing on their iPhones don’t seem to care. 

And fine distinctions of usage: I am always bothered when I see a murderer get “hung” in an old Western, when we know he was “hanged.” 

Again, most Americans hardly even notice such things. They all get by just fine not caring about the distinction between “e.g.” and “i.e.” In fact, they can get all sniffy about it. 

I know a former medical transcriptionist who typed up a doctor’s notes each day, and would correct his grammar and vocabulary. She corrected the man’s “We will keep you appraised of the outcome,” with “apprised,” and each time, he would “correct” that back to “appraised.” Eventually, she gave up on that one. 

English majors know the difference between “imply” and “infer,” and it causes a hiccup when they are mixed up. “Disinterested” means having no stake in the outcome; “uninterested” means you cannot be arsed. They are not interchangeable.

We EMs cringe when we hear “enormity” being used to mean “big,” when it actually means a “great evil.” And “unique” should remain unique, unqualifiable. Of course, “literally” is used figuratively literally all the time.  Something is not “ironic” simply because it is coincidental. 

I hear the phrase “begging the question” almost every day, and always misused. It refers to circular reasoning, not to “raising the question.” I hold my ears and yell “Nya-nya-nya” until it is over. 

Some things, which used to be wrong, are slowly being folded into the language quietly, and our EM ears may still jump at hearing them. “Can” and “may” used to mean distinct things, but that difference was lost at least 100 years ago. We can give that one up. And “hopefully” used to be a pariah word, but we have to admit, it serves a grammatical function and we have to let it in, however grudgingly. 

My wife is particularly sensitive to the non-word “alot.” She absolutely hates it. I understand, although I recognize a need for it. “A lot” is a noun, and sometimes we use it as an adverb: “I like chocolate a lot.” Perhaps “alot” will eventually become an adverb. Not for Anne.

People are different and have differing talents. We seem to be born with them. Some people grasp mathematics in a way a humanities student will never be able to match. Some have artistic or musical talent. We can all learn to play the piano, but only certain people can squeeze actual music out of the notes. 

And each of us can learn our native tongue, but some of us were born with a part of our brains attuned to linguistic subtlety. We soaked up vocabulary in grade school; we won spelling bees; we wrote better essays; we cringe at the coarseness of political speech. Language for the born English major is a scintillating art, with nuance and emotion, shadings and flavors. We savor it: It is not merely functional. 

(I loved my vocabulary lessons in grade school, and when we were asked to write sentences using that week’s new words, I tried to use them all in a single sentence. People who show off like that often become writers.)

Just as my piano playing, even when I learned whole movements of Beethoven sonatas, was always the equivalent of speaking English as a second language, my best friend, Sandro, could sit at a piano and every key was fluent, natural, and expressive. It was a joy to hear him play; a trial for me just to hit the right notes. 

I am saying that those of us who gravitate to speech, writing, and language in general, have something akin to that sort of talent. It is inherent, and it can be a curse. Bad language has the same effect on our ear as a wrong note on a piano. 

We can feel the clunkiness of poorly expressed thoughts, even if they are grammatical. Graceful language is better. And so, we can rankle at awkward expressions.

I have a particular issue, which I share with most aging journalists, which is that I had AP style drummed into me — that is, the dicta of the Associated Press Stylebook, that coil-bound dictator of spelling, grammar and usage. One understands that AP style was never meant to be an ultimate arbiter of language, but rather a means of maintaining consistency of style in a newspaper, so that, for instance, on Page 1 we didn’t have a “gray” car and on Page 3 one that was “grey.” 

And so, the rules I lived by meant that there was no such thing as 12 p.m. Is that noon or is that midnight? Noon was neither a.m. nor p.m. Same for midnight. They had their own descriptors. “Street” might be abbreviated in an address, but never “road.” Why? I never knew, but in my first week on the copy desk I had it beaten into me when I goofed. 

“Back yard” was two words, but one word as an adjective: “backyard patio.” Always. “Air bag,” two words; “moviegoer,” one word. “Last” and “past” mean different things, so, not “last week,” unless Armageddon is nigh, but “this past week.” Picky, picky.

I had to learn all the entries in the stylebook. The 55th edition of the AP Stylebook is 618 pages. And this current one differs from the one I had back in 1988; some things have changed. Back then, the hot pepper was a “chilli pepper,” and the Southwestern stew was “chili.” I lived and worked in Phoenix, Ariz., and we all understood this would get us laughed off the street, and so exceptions were made: yes, it’s a chile. 

Here I am, nearly 40 years later, and retired for the past 12 years, and I still tend to follow AP style in this blog, with some few exceptions I choose out of rebellion. But I still italicize formal titles of books, music and art, while not italicizing chapter names or symphony numbers, per AP style. I still spell out “r-o-a-d.” It’s a hard habit to break. 

The online world seems to care little for the niceties of English. We are even tending back to hieroglyphs, where emojis or acronyms take the place of words and phrases. LMAO, and as a card-carrying alte kaker, I often have to Google these alphabet agglutinations just to know what my granddaughters are e-mailing me. Their seam to be new 1s each wk. 

And don’t get me started on punctuation.

We look at history all wrong. In school, history seems like a bunch of random dates we have to remember. You know: 1492; 1066; 1929; 1588. Just numbers. But history isn’t like that. History is continuous, all connected. 

It all needs to be looked at a different way. I think of history as a series of grandmas. It makes history more relatable, but also a good deal shorter than you might imagine. 

My grandmother was born in 1900. It is now 2025. That’s 125 years. The way the Romans counted centuries — a grandparent’s birth to a grandchild’s death — was called a “saeculum,” which we tend to translate either as “era” or “century.” But I know my grandmother was born before the Wright brothers flew and didn’t die until after Neil Armstrong walked on the moon. An augenblick in time. 

And so, I try to line up these consecutive eyeblinks to imagine how distant, really, is the past. Now, I understand that not all grandma-to-grandkid timespans are the same. Some are considerably shorter. But I use my own as a milestone to count my way through the past I’ve learned about. In that way, the Declaration of Independence was from my grandmother’s grandmother’s time. Only two grandmas ago, so to speak. 

The Roman Empire, in the east, fell in 1453, which is just 4.5 grandmas in time. My grandmother’s grandmother’s grandmother’s grandmother could have witnessed it. 

We think we have been on this planet so long, that prehistory seems interminable. But I attempted to calculate how long it has been since humans painted aurochs on the cave walls at Lascaux. The date for the original artwork is usually given as about 19,000 years ago. A long time, no? 

Well, in grandmas, that is just 152 grandmas ago. Your standard charter bus carries about 50 people, and so it would take only about three busloads of grandmas to get us back to the cave paintings. It really isn’t all that long ago. 
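The busload arithmetic is simple enough to sketch in code. This is a toy calculation only; the 125-year saeculum and the 50-seat charter bus are the essay’s own assumptions, not standard figures:

```python
# A toy version of the essay's "grandma arithmetic": count how many
# 125-year saecula span a stretch of history, and how many 50-seat
# charter buses that chain of grandmas would fill.
SAECULUM_YEARS = 125  # one grandparent-to-grandchild span, per the essay
BUS_SEATS = 50        # seats on a standard charter bus

def grandmas_ago(years: float) -> float:
    """How many back-to-back saecula are needed to span `years`."""
    return years / SAECULUM_YEARS

def busloads(years: float) -> float:
    """How many charter buses would seat that chain of grandmas."""
    return grandmas_ago(years) / BUS_SEATS

print(grandmas_ago(19_000))  # Lascaux cave art: 152.0 grandmas ago
print(busloads(19_000))      # 3.04, i.e. about three busloads
```

Run backwards from any date: `grandmas_ago(2025 - 1453)` for the fall of Constantinople comes out to roughly four and a half grandmas.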

In sum, I have long believed we should never teach history simply as dates, and shouldn’t teach it from back then to now, but rather from now, counting grandmas — or some other measure of time — backwards through time to see the continuous thread we are connected with. 

Delmarva Peninsula

When I was a kid, most things in the world just were. Everything was normal, even if it wasn’t. People lived in houses, roads were paved, families had a mom and dad. And, for a kid, parents were just parents; I never gave much thought to why they did what they did. It wasn’t even that if they did something, they must have had a reason. They just did things. They were Mom and Dad, and as children, my brothers and I just followed along. To a child, the world is a given. 

It’s hard to recall that state of affairs, when things happened because they happened. That they might have done things for the abstract good of their children never dawned on me. Of course, now that I am old, I look back and realize how much they did for us, things they didn’t have to do, or things they probably would have preferred to do some other way, including some other things they might have wanted to do with their sparse vacation time. 

Gettysburg, Pennsylvania

But one of the things they did for us was travel. When we had our summer school vacations, Mom would pore over various brochures, magazine articles, and roadmaps and carefully plan routes to drive and sites to take in, and then, when it came time, we would load up the car and take off. We went to Washington, D.C., to Niagara Falls, to Fort Ticonderoga, to Gettysburg — and many spots in between. Only later did I come to understand there was a purpose to these vacation trips. They chose these trips to expose their children to history, geography and politics, and to let us see what was possible outside suburban New Jersey. Travel was part of our education. 

Back when they were children, they didn’t get to travel. They both grew up in northern New Jersey and neither had more than a high school education. Dad’s family was heavily religious and didn’t approve of having fun, and Mom had to raise her two younger siblings after her father died when she was still in grade school and her mother had to find work in New York City. And so, the worlds they grew up in were limited.

German gun, Omaha Beach, Normandy, France

But when my father was drafted in 1940, he was sent to Camp Wheeler in segregated Georgia. He didn’t talk much about his army life, but he did express shock at discovering the pervasiveness of racial discrimination. And then he was sent overseas after D-Day to France, Czechoslovakia and Germany and found that other peoples had various ways of making a decent life. His horizons were involuntarily expanded. 

When they got married, shortly after the war, they felt it was important to broaden their children’s horizons, too, and to make sure we got the best educations. There was never any question that their kids would go to college. (I was told that when I entered second grade, I asked if that meant I could go to college “next year?”) Education was Priority One. Travel was a component of that. 

Kristiansand, Norway

Later, when I was in high school, they made it possible for me to go to Europe, accompanying my immigrant grandmother to her birthplace in Norway, and to take a bus tour through France, Germany, the Netherlands, Belgium and Luxembourg. A few years later, my younger brother, Craig, was sent off on a similar trip. 

Mark Twain famously said that “Travel is fatal to prejudice, bigotry, and narrow-mindedness, and many of our people need it sorely on these accounts. Broad, wholesome, charitable views of men and things cannot be acquired by vegetating in one little corner of the earth all one’s lifetime.” 

The vast bulk of the MAGA world, it seems to me, is made up of those who have never ventured far from their birthplace. If they had seen how others live, they would not accept the lies. 

Cairo, Illinois

Travel, then, has pretty much always been an essential part of my life, and having become an adult — at least as quantified in years accrued on the planet — I have continued the habit instilled by my parents. Education wasn’t only in books; I read Huckleberry Finn, of course, but I’ve also been to Mark Twain’s childhood hometown of Hannibal, Mo., and to what remains of Cairo, Ill. I’ve driven along the entire length of the Mississippi River, from Lake Itasca, Minn., to the Venice Marina where the road ends in Louisiana. 

Vancouver Island, British Columbia

It is rare for someone to mention a place in the continental United States or provincial Canada that I haven’t been to. Big Sur; Mt. Katahdin; the Everglades; El Paso; the Union Pacific Bailey Train Yard in North Platte, Neb.; Hudson Bay; Halifax, Nova Scotia; the Tehachapi Loop in California; the Sturgis Rally in South Dakota; even Glacier Bay in Alaska.

Glacier Bay, Alaska

Believe me, I can be quite irritating, when someone mentions some far-off place and I chime in, “Oh, yeah, I’ve been there. Had a great Po Boy sandwich in a little shack along the road in Pecan Island.” Which is a tiny community along the southernmost road in Louisiana. Or “Abiquiú? Yes, we visited Georgia O’Keeffe’s place there.” I am slowly learning to keep my yap shut. 

Red Mesa, Navajo Reservation, Arizona

I’ve been to every state except Hawaii, and most of them many times, and every province in Canada, save only Prince Edward Island and Newfoundland. I’ve lived in four corners of the continental states, growing up in the Northeast, moving to the American South, then to Seattle in the Northwest, and spending 25 years as a writer in the Southwest. Each move gave me a chance to explore all the territory nearby, often in some detail. (There’s hardly a square meter of Arizona that I haven’t been to. Go ahead, name something obscure: Ajo? Red Mesa? Tumacacori? Fredonia? Gadsden? Quartzsite? Check, check, check, check, check and check.)

Cape of Good Hope, South Africa

I’ve been to the Cape of Good Hope in South Africa and to Marylebone Road in London; I’ve been to the Chartres Cathedral, and to the bull ring in Arles, in Provence. My late wife and I went back to France many times, and drove to all six corners of “The Hexagon.” 

And so, yes, I’ve been to the D-Day beaches at Normandy and to the bomb craters still evident in Verdun, to the cave art along the Vézère Valley, to the Neolithic menhirs and dolmens near Locmariaquer. 

Dolmen, Locmariaquer, Brittany, France

I’ve visited the Museum of Questionable Medical Devices, the Museum of Jurassic Technology and the Hammer Museum of Haines, Alaska. 

And it has all been an education. I certainly learned that the omelet I get at Denny’s is a pale mutilation of the rich, creamy offering you can find at any neighborhood cafe in Paris. 

O’Keeffe Country, New Mexico

Of course, I’ve also learned how little of the world I’ve actually seen. I have so much of the non-European inflected planet yet to see, although, at my age, it’s almost certain I will never get to Japan or the Seychelles or Samoa. Just as, no matter how many books I’ve read, there are always a hundred times more books I will never get to read, and not enough time left even to put a dent in that list. I keep trying. 

Windsor Ruins, Mississippi

And so, in addition to bragging about all the places I’ve been, I am shamed by all those places I have not gone to, just as by all the art I haven’t seen, all the books I haven’t read, all the music I have never heard. But I think that it is only because of all I have read, seen, and heard, that I know enough to feel the gaping holes in my education.

My Uncle Stanley had an ambition in life to own a Weimaraner hound. I was only a boy at the time and didn’t quite understand the appeal of such a dog, but for the IBM typewriter technician he was, living in New Jersey in the 1960s, I imagine it had something of the attraction a solid gold toilet had for Elvis Presley. The rest of us had dogs that we lucked into, finding a stray, or getting a mutt from the dog pound. But the Weimaraner was a pricey breed and my uncle wanted one. He finally got one. It was a nice dog, but for me, that’s just what it was — a dog. 

Many, I think, have some similar focus in their lives, some object that signifies arrival, or a sense of completeness in life. Most items hold that position only for as long as they are unachieved. Yet there remains a pride in the achievement, even if the reward is rather less than anticipated. 

I think of those who have yearned to own a Cadillac. They may live in a mobile home and work as a janitor in the local factory, but if they can park a Caddy out front, it will show they aren’t complete failures. 

As in the familiar song, St. James Infirmary: When I die, “Put a twenty-dollar gold piece on my watch chain,/ So the boys’ll know that I died standin’ pat.”

The idea of getting that bit you believe you want or need is common. Perhaps it is a $300 Wüsthof chef’s knife; or a Rolex watch; or a bespoke suit from Hong Kong. Whatever your icon of quality, status or style, it chases you through life until you can finally afford it. I certainly have felt it. When I was young, it was a Nikon camera, then a Leica and then a Hasselblad. I finally got each, and while I wasn’t disappointed — they are all as good as their reputations — they never quite made that great a difference in the photographs I made. 

I imagine that if the People’s Republic of China ever finally gets its hands on Taiwan, it will not prove to be quite so satisfying a triumph as it had imagined. 

I never chased a particular car or watch, but there are books I longed for. I have managed to get some of them; others still elude me. But here are the big three I lusted after for years.

The Times Comprehensive Atlas of the World

Beginning in the third grade, I loved maps. And what I loved more than any others were the big maps in the classroom that were pulled down like a windowshade, and were richly colored in thick inks — not halftone dots: The green was dark green ink, not a mix of yellow and cyan dots. Mountain regions were a rich chestnut brown. Those maps were beautiful. They may have been out of date even in my childhood, but I loved them not for their accuracy, but as art. 

Years later, I found something very similar in older editions of the Goode’s School Atlas, where the maps were created using wood engravings, so there were straight-line cross-hatchings for shadings, and again, multi-colored inks for the printing. I saw them as art books. I found a few in old, musty used book stores and I still treasure them. 

The very first puzzle pieces I remember, as far back as infancy, were map puzzles, where each U.S. state was a single piece. I took apart and redid that states puzzle hundreds of times, even as, in my infant-tongue, the states were Uncle Homer and Miss Thompson. 

Later, as a young man, newly empowered with a car and an income, I began traveling, and to aid that travel, I had a Rand McNally Road Atlas. I have updated them every other year or so, but I also acquired vintage versions from 1935 and 1942, which are things of beauty of their own, in two-color printing, with most roads in dark blue and highways in red. I treasure the old ones, while the newer, full-color maps are merely disposable useful tools. 

But, out there on the horizon, was the Times Comprehensive Atlas of The World, published in constantly updated editions from 1895 through its 16th edition in 2023. By 1959, the Mid-Century Edition of the atlas ran to five elephant-folio volumes measuring 12 by 19 inches. It was the Cadillac of world atlases, and it was way out of my price range when I was young. I did manage to get the single-volume 10th edition, picked up used. 

It was a large, handsome volume. The maps were halftones, so, not as esthetically distinct as the Goode’s, but still, it was by all accounts the best atlas on the market. Unfortunately, when I retired, I had to sell off about 75 percent of my library to make the move across the country from Phoenix to North Carolina, and the Times atlas was one of the casualties. Kept the Goode’s, though. 

The Encyclopedia Britannica

By the time I was in sixth grade, I wanted to learn everything. I was young enough still to think that possible. And where would I find all this knowledge? I’d read the encyclopedia. 

My neighbors had an old Compton’s Picture Encyclopedia, from the 1930s, which they gave us, and I read it over and over, with its streamlined steam trains, autogyros and biplanes. But even as a kid, I knew the books were out of date. There was a wonderful long entry on “The Great War,” but, although I was reading it in 1953 or so, less than 10 years after WWII, there was no mention of that later war at all. 

My mother wanted to help, and so, she began buying the promotional supermarket offering Funk and Wagnalls, one per week, for 99 cents each, until we had the full set, cheaply printed and bound. I used them for years to write theme papers for school. But I always knew that they weren’t the “real thing.” For that I needed the Britannica, which was way outside my family’s budget. 

I continued lusting after my own Encyclopedia Britannica, all through college, the jobs that followed and into my years at the newspaper in Arizona, when I finally got a set at Bookman’s, a used book store in Mesa. But my enthusiasm was tempered by the fact that the set I got was not the traditional Britannica, but the combined “micropædia” and “macropædia,” in which the contents were divided into the more popular entries, in shorter, easier-to-read versions — the micropædia — and the more in-depth entries in the rest of the volumes. It felt like a dumbed-down, even trendy version of what I truly wanted. I wanted the Belmondo Breathless and got the Richard Gere Breathless. 

Years later, I came across the revised 14th edition, in 24 volumes with its leathery maroon covers and thistle logo, and managed to buy it. This was the real thing, at last. The pride of my collection. 

At Bookman’s, I later also found a facsimile version of the original three-volume Encyclopædia Britannica, from 1768. The replica was quite convincing, even including (imitation) foxing on some of the pages. More interesting was evidence that the 18th century project engaged the enthusiasm of its makers early on, and then rather petered out. The first volume covers the letters “A” and “B.” The second includes “C” through “L.” And the third and slimmest volume gets to cover everything else to the letter “Zed.” The facsimile edition was published in 1971. 

Then, of course, the internet came along, with its Wikipedia. The Britannica sat on the shelf as a kind of trophy, but largely unused. And when we moved, it was one of the casualties. So long in the getting, so short in the forgetting. 

The Oxford English Dictionary

But the real prize, the one thing that I lusted for more than any other, was the Oxford English Dictionary, the 20-volume final word on the English language. 

I was a long-time reader of dictionaries. From second grade on, I loved learning vocabulary. From eighth grade on, I loved learning the etymologies of words, and how they could change meaning over time. The OED contained all that information. Entries were long, involved and gave dozens, maybe scores, of citations, each dated and quoted. A simple word with multiple meanings, such as “set,” went on for pages, and required 60,000 words to describe some 580 senses. The whole of the dictionary was 21,730 pages and 59 million words covering more than 300,000 entries. It was heaven. It was also pricey. The set could sell for $1,500 to $2,000, depending on where you bought it. 

The full OED is still my unicorn. I have never found an affordable used set. But, in the 1970s, Oxford University Press put out a two-volume Compact Edition, in which the original OED was photomechanically reduced so that four of its pages were squeezed onto each single page, which required the use of a magnifying glass (included) to read it. The Compact Edition was offered at a rock-bottom price as a promotion through a book club, and I signed on and got my copy.

It is very hard to read, even with the magnifying glass, and the volumes were big and bulky and uncomfortable to use, but at least I owned a version of the OED. This was as close as I got to Nirvana. 

I still have the Compact Edition, occupying the upper shelf of a coat closet. I haven’t dragged it out in years, but I still have it, a reminder of those things I once thought would change my life forever. Perhaps they did. 

So, what did you always want and did or didn’t finally achieve? 


The venerable writer John McPhee wrote a short, episodic memoir for the May 20, 2024 edition of The New Yorker, and in it he discussed proofreading. The piece hit home with me. 

I began my career at The Arizona Republic as a copy editor, which is not exactly the same thing as a proofreader, but many of the duties overlap, and many of the headaches are the same. 

 A proofreader, by and large, works for a book publisher and will double check the galley proofs of a work for typos and grammatical errors. The work has already been typeset and a version has been printed, which is what the proofreader goes over. 

A copy editor works for a magazine or newspaper and is usually one of a team of such editors, who read stories before they are typeset and check not only spelling and grammar, but factual material and legal issues, to say nothing of that great bugbear, the math. English majors are not generally the greatest when dealing with statistics, percentages, fractions — or for that matter, addition or subtraction. 

Arizona Republic staff, ca. 1990

In a newspaper the size of The Republic, a reporter turns in a story (usually assigned by the section editor) and that editor then reads it through to make sure all the necessary parts are included, and that the presentation flows in a sensible manner. Section editors are very busy people, dealing with personnel issues (reporters can be quite prissy); planning issues (what will we write about on July 4 this year?); remembering what has been covered in the past, so the paper doesn’t duplicate what has been done; dealing with upper management (most of whom have never actually worked as reporters), whose “ideas” are often goofy and unworkable; and, god help them, attending meetings. They cannot waste their time over tiny details. Bigger fish to fry. 

Once the section editor has OKed a piece it goes on to the copy editors, those troglodyte minions hunched over their desks, who then nitpick the story, not only for spelling and style — the Associated Press stylebook can be quite idiosyncratic and counterintuitive — but also for missing bits or mis-used vocabulary, and double-checking names and addresses. A copy editor is a rare beast, expected to know not only how to spell “accommodate,” but also who succeeded Charles V in the Holy Roman Empire (Ferdinand I, by the way). 

The copy editor then hands the story over to the Slot. (I love the more arcane features of any specialized vocation). The Slot is the boss of the copy desk. In the old days, before computers, copy editors traditionally sat around the edge of a circular or oblong desk with a “slot” in the center where the head copy editor sat, collecting the stories from the ring of hobbits surrounding him. He gave the stories a final read-through, catching anything the previous readers may have missed. Later the story would be given a headline by a copy editor and that headline given a final OK by the Slot. Only then would the story be typeset. 

That means a typical newspaper story is read at least four times before it is printed. Nevertheless, there will always be mistakes. Consider The New York Times. Every typo that gets through generates angry letters-to-the-editor demanding “Don’t you people have proof-readers?” Well, we have copy editors. And why don’t you try to publish a newspaper every day with more words in it than the Bible and see how “perfect” you are? Typos happen. “The best laid schemes o’ Mice an’ Men Gang aft agley.” 

When I was first hired as a troglodyte minion, I had no experience in journalism (or very little, having spent time in the trenches of a weekly Black newspaper in Greensboro, N.C., which was a very different experience from a big-city daily) and didn’t fully understand what my job entailed. I thought I was supposed to make a reporter’s writing better, and so I habitually re-wrote stories, often shifting paragraphs around wholesale, altering words and word order, and cutting superfluous verbiage. That I wasn’t caught earlier and corrected tells me I must have been making the stories better. 

There was one particular movie critic who had some serious difficulty with her mother tongue and wrote long, run-on sentences, some of which were missing verbs, or were full of unsupported claims easily debunked. (I hear an echo of her style in the speeches of Donald Trump.) I regularly rewrote her movie reviews from top to bottom, attempting to make English out of them. 

One day, I was a bit fed up, and e-messaged the section editor that the critic’s review was gibberish, including the phrase “typewriters of the gods.” Unfortunately, the reviewer was standing over the section editor’s desk, saw my sarcastic description and became outraged. I had to apologize to the movie critic and stop rewriting her work. 

Lucky for me, the fact that I could make stories better brought me to the attention of the section chiefs and I was promoted off the copy desk and into a position as a writer — specifically, I became the art critic (and travel writer, and dance critic and architecture critic, and classical music critic, and anything else I thought of). I’m sure the other copy editors and the Slot were delighted to see the back of me.

That is, until they had to tackle copy editing my stories. I had a few idiosyncrasies of my own. 

Here I must make a distinction between a reporter and a writer. I was never a reporter, and was never very good at that part of the job. Reporters are interested primarily in collecting information and fact. Some of them can write a coherent sentence, but that is definitely subordinate to their ability to ferret out essential facts and relate them to other facts. A reporter who is also a good writer is a wonder to behold. (In the famous team of Carl Bernstein and Bob Woodward, the latter was a great reporter and mediocre wordsmith — as his later books demonstrate. Bernstein was a stylish writer. Together they functioned as a whole). 

I was, however, a writer, which meant that my primary talent and purpose was to put words into an order that was pleasant to read. I love words. From the second grade on, I collected a vocabulary at least twice as large as the average literate reader, and what is more, I loved to employ that vocabulary. Words, words, words.

And so, when my stories passed through the section editor and got to the copy desk, the minions were oft perplexed by what I had written. Not that their vocabularies were any smaller than mine, but such words were hardly ever printed in a newspaper. I once used “paradiddle” in a review and the signal went up from the copy desk to the section editor, who came to me. We hashed it out. I proved to her that the word was, indeed, in the dictionary, and the verdict descended back down the food chain to the copy desk, and the word was left alone. 

But this led to a bit of a prank on my part. For a period of about six months (I don’t remember exactly; it was back in the Pleistocene when this occurred) I included at least one made-up word in every story I wrote. It was a little game we played. These words were always understandable in context, and were often something onomatopoetic and meant to be mildly comic (“He went kerflurfing around,” or “She tried to swallow what he said, but ended up gaggifying on the obvious lies”). For those six months, a compliant copy desk let me get away with every one of them. Every. Single. One. Copy editors, despite their terrifying reputation, can be flexible. Or at least they threw up their hands and got on to more important matters.

I will be forever grateful to my editors, who basically let me get away with murder, and the copy desk at The Arizona Republic, for allowing me to write the way I wanted (and pretty much the only way I knew how). Editors, of both stripes, will always be my heroes. 

John McPhee

Back to John McPhee. He describes the difficulty of spotting typos. Of course most are easily caught. But often the eye scans over familiar phrases so quickly that mistakes become invisible. In a recent blog, I wrote about Salman Rushdie’s newest book, Knife, and I had its subtitle as “Meditations After and Attempted Murder.” I reread my blog entries at least three times before posting them, in order to catch those little buggers that attempt to sneak through. But I missed the “and” apparently because, as part of a phrase that we use many times a day, the eye reads the shape of the phrase rather than the individual words and letters. 

There is a common saying amongst writers: “Everyone needs a copy editor.” When I retired from The Republic, I lost the use, aid, and salvation of a copy desk. I had to rely on myself, re- and re-reading my copy. But typos still get through. And on the day after I post something new, I will sometimes get an e-mail from my sister-in-law pointing out a goof. She let me know about my Rushdie “and,” and I went back into the text and corrected it (something not possible after a newspaper is printed and delivered). She has caught my mistakes many, many times, and has become my de facto copy editor. 

But my training as both writer and copy editor has served me well. Unlike so many other blog posters, I double-check all name spellings and addresses, my math and my facts. I am quite punctilious about grammar and usage. And even though it is no longer required, I am so used to having AP style drilled into me that I tend to fall in line like an obedient recruit. 

In his story, McPhee details trouble he has had with book covers that sometimes misrepresented his content. And that hit me right in the whammy. One of the worst experiences I ever had with the management class came when I went to South Africa in the 1980s. Apartheid there was beginning to falter, but was still the law. I noticed that racial laws were taken very seriously in the Afrikaner portions of the country, but quite relaxed in the English-speaking sections. 

And I wrote a long cover piece for the Sunday editorial section of the paper about the character of the Afrikaner and the racial tensions I found in Pretoria, Johannesburg and Cape Town. The Afrikaner tended to be bull-headed, bigoted and unreflective. And I wrote my piece about that (and the fascist uniformed storm troopers I witnessed threatening the customers at a bar in Nelspruit). The northern and eastern half of South Africa and its southern and western half were like two different countries. 

As I was leaving the office on Friday evening, I saw the layout for the section cover and my story, and the editor had found the perfect illustration for my story — a large belly-proud Afrikaner farmer standing behind his plow, wiping the sweat from his brow and looking self-satisfied and as unmovable as a Green Bay defensive tackle. No one was going to tell him what to do. “Great,” I thought. “Perfect image.”

But when I got my paper on Sunday, the photo wasn’t there; it had been replaced by one of a large Black woman waiting dejectedly at a bus station, with her baggage and duffle. My heart sank. 

When I got back to the office on Monday, I asked. “Howard,” was the reply. Earlier in this blog, I mentioned management, with which the writer class is in never-ending enmity. Howard Finberg had been brought to the paper to oversee a redesign of The Republic’s look — its typeface choices, its column width, its use of photos and its logos — and somehow managed to weasel his way permanently into the power structure. He was one of those alpha-males who will throw his weight around even when he doesn’t know or understand diddly. I will never forgive him. 

He had seen the layout of my story and decided that the big Afrikaner, as white as any redneck, simply “didn’t say Africa.” And so he found the old Black woman that he thought would convey the sense of the continent. Never mind that my story was particularly about white South Africa. Never mind that he hadn’t taken the time to actually read the story. That kind of superficial marketing mentality always drives me nuts, but it ruined a perfectly good page for me. Did I say, I will never forgive him? 

It reminds me of one more thing about management. In the early 2000s, when The Republic had been taken over by the Gannett newspaper chain, management posted all over our office, on all floors, a “mission statement.” It was written in pure management-ese (which I call “manglish”) and was so diffuse and meaningless, so full of “synergies” and “goals” and “leverage,” that I said, “If I wrote like that, I’d be out of a job.” 

How can those in charge of journalism be so out of touch with the language which is a newspaper’s bread and butter? 

These people live in a very different world — a different planet — from you and me. I imagine them, sent by Douglas Adams, on the space ship, packed off with the phone sanitizers, management consultants, and marketing executives, sent to a tiny forgotten corner of the universe where they can do less harm. 

One final indignity they have perpetrated: They have eliminated copy editors as an unnecessary cost. When I retired from the newspaper, reporters were merely asked to show their work to another writer and have them check it over. A profession is dying and the lights are winking out all over Journalandia. 

Click on any image to enlarge

Of all the pop psychology detritus that litters our culture, none bothers me more than the fatuous idea of “closure.” People talk about it as if it were not only a real thing, but an obvious one. But “closure” is a purely literary concept, ill suited to describe the actual events of our lives. 

By “literary,” I mean that it fulfills the esthetic necessity we humans feel to round out a story. A story must have a beginning, a middle, and an end (“but not necessarily in that order,” said French filmmaker Jean-Luc Godard). For each of us, our “life story” is a kind of proto-fiction we create from the remembered episodes of our lives. We are, of course, the hero of our own life story, and the supporting characters all fit into a tidy plot. 

But, of course, actual life is not like that. Rather it is a bee-swarm of interconnecting and interacting prismatic moments seen from the billion points of view of a riotously populated planet. There is no story, only buzzing activity. Eight billion points of view — and that is only counting the human ones. One assumes animals and even plants have their own points of view and no narrative can begin to encompass it all. It is all simply churn. 

Of course, there are anecdotes, which are meant to be stories, and end, usually, with a punchline. Like a joke, they are self-contained. But our lives are not anecdotes, and tragedies, traumas, loss, are not self-contained. There is no punchline.

So, there is a smugness in the very idea that we can write “fin” at the completion of a story arc and pretend it means something real. It is just a structure imposed from outside. 

In his recent book, Knife: Meditations on an Attempted Murder, author Salman Rushdie notes the meaninglessness of the concept of “closure.” After he was attacked by a would-be assassin in 2022, he came desperately close to death, but ultimately survived. He dismisses the idea that facing his attacker in court might bring some sort of closure; the thought of confronting him became less and less meaningful. He went through medical procedures and therapy, and even the writing of the book. “These things did not give me ‘closure,’ whatever that was, if it was even possible to find such a thing.” 

Writers, in general, are put off by such lazy ideas as “closure.” Their job is to find words for actual experience, words that will convey something of the vivid actuality of events. Emily Bernard, author of Black Is the Body, was also the victim of a knife attack, and her book is a 218-page attempt to come to terms with her trauma: The book opens up a life in connection with the whole world. She never uses the word “closure.” 

Both Bernard and Rushdie try their utmost to describe their attacks with verbal precision and without common bromides. It is what all serious writers attempt, with greater or lesser success. It is easy to fall into patterns of thought, cultural assumptions, clichés. It is much harder to express experience directly, unfiltered. 

The need to organize and structure experience is deeply embedded in the human psyche. And art, whether literary, musical, cinematic or visual, requires structure. It is why we have sonnets and sonata form, why we have frames around pictures, why we have three-act plays. 

The fundamental structure of art is the exposition, the development, and the denouement. Stasis; destabilization; reestablishment of order. It is the rock on which literature and art are founded. When we read an autobiography, there is the same tripartite form: early life; the rise to success with its impediments and challenges; and finally, the look back at “what we have learned.” 

We read history books the same way, as if U.S. history ended with the signing of the Constitution, or with Appomattox, or the Civil Rights movement, or the election of Reagan. But history is a continuum, not a self-contained narrative. Books have to have a satisfying end, but life cannot. 

Most of us have suffered some trauma in our lives. It could be minor, or it could be life-changing. Most often it is the death of someone we love. It could be a medical issue, or a divorce. We are wrenched from the calm and dropped into a turmoil. It can leave us shattered. 

And the story-making gene kicks in and we see this disruption as the core of a story. We were in steady state, then we are torn apart, and finally we “find closure.” Or not. Really no, never. That is only for the story. The telling, not the experience. 

In truth, the trauma is really one more blow, one more scar on the skin added to the older ones, one more knot on the string. We will all have suffered before, although the sharpness may have faded; we will all suffer again. 

Closure is a lie. All there really is is endurance. As Rushdie put it, “Time might not heal all wounds, but it deadened the pain.” We carry all our wounds with us, adding the new on top of the old and partly obscuring what is buried. 

There are myriad pop psychology tropes. They are like gnats flying around our heads. Each is a simplifying lie, a fabricated story attempting to gather into a comprehensible and digestible knot the infinite threads of a life. 

I have written many times before about the conflation of language and experience, and how we tend to believe that language is a one-to-one mirror of reality, when the truth is that language is a parallel universe. It has its own structure and rules — the three-act play — while those of non-verbal life are quite other. And we will argue — even go to war — over differences that only matter in language (what is your name for the deity?). 

Most of philosophy is now really just a branch of philology — it is about words and symbols. But while thoughtful people complain about the insular direction that philosophy has taken, it has really always been thus. Plato is never about reality: He is about language. His ideal bed is merely about the definition of the word “bed.” As if existence were truly nouns and verbs — bits taken out of context and defined narrowly. Very like the question of whether something is a particle or a wave, when in truth, it is both. Only the observation (the definition) will harness it in one form or the other. It is all churn. πάντα χωρεῖ — everything flows.

A story attempts to make sense of the senseless. I’m not sure life would be possible without stories, from the earliest etiology of creation myth to the modern Big Bang. All those things that surpass understanding can only be comprehended in metaphorical form, i.e., the story. 

But stories also come in forms that are complex or simple, and are true or patently silly. My beef with “closure” is that it isn’t a story that reflects reality, but a lie. A complacent lie. 

Like most of popular psychology, it takes an idea that may have some germ of truth and husks away all the complex “but-ifs” and solidifies it into a commonly held bromide. It is psychobabble. 

That is a word, invented by writer Richard Dean Rosen in 1975, which he defines as “a set of repetitive verbal formalities that kills off the very spontaneity, candor, and understanding it pretends to promote. It’s an idiom that reduces psychological insight to a collection of standardized observations that provides a frozen lexicon to deal with an infinite variety of problems.”

And afternoon TV shows, self-help books and videos, and newspaper advice columns are loaded with it. It is so ubiquitous that the general populace assumes it must be legitimate. We toss around words such as co-dependent, denial, dysfunctional, empowerment, holistic, synergy, mindfulness, as though they were actually more than buzzwords and platitudes. Such words short-circuit more meaningful understanding. Or a recognition that there may be no understanding to be had. 

(In 1990, Canadian psychologist B.L. Beyerstein coined the word “neurobabble” as an extension of psychobabble, in which research in neuroscience enters the popular culture poorly understood, with such buzz words as neuroplasticity, Venus and Mars gender differences, the 10-percent-of-the-brain myth, and right- and left-brain oversimplifications.)

As a writer (albeit with no great claim to importance), I know how often I struggle to find the right word, phrase or metaphor to reach a level of precision that I don’t find embarrassing, cheap, or an easy deflection. Finding the best expression for something distinct, complex and personal — trying to be honest — is work. 

This is true in all the arts: trying to find just the right brown by mixing pigments; or the right note in a song that is surprising enough to be interesting, but still makes sense in the harmony you are writing in; or giving a character in a play an action that rings true. We are so mired in habits of thought, of culture, that finding that exactitude is like flying through flak.

Recently, Turner Classic Movies ran a cheesy science-fiction film I had never seen before. I grew up on bad sci-fi movies from the 1950s and always enjoyed them, in the uncritical way a 9-year-old watches movies on television: Quality never entered the picture. At that age, I was oblivious that there even was such a thing. It wiggled on the screen; I watched. 

But this film was released in 1968, too late for me. By then I had gone off to college, and the only films I watched were snooty art films. And so I never got to see The Green Slime. Now, here it was, and it was prodigiously awful. Actor Robert Horton fights an alien invasion of tentacled, red-eyed monsters. 

Everything about The Green Slime was awful: the acting, the lighting, the set design, the special effects — and, of course, the science. Or lack of it. There was the garish color of sets and costumes and the overuse of the zoom lens, in the way of made-for-TV movies of the era. Having outgrown my open-hearted love of bad science fiction, I stared in wonder at the horribleness I was seeing on the TV screen. 

And it was the acting, more than anything, that appalled me. Why were these actors so stiff, wooden, even laughable? And something I guess I had always known, but never really thought about, jumped to mind: Actors are at the mercy of writers. The dialog in The Green Slime was stupid, awkward and wooden.

There is some dialog so leaden, so unsayable, that even Olivier can’t bring it off. Robert Horton, while no Olivier, was perfectly fine on Wagon Train, but here he looked like he was lip-synching in a foreign tongue. 

“Wait a minute — are you telling me that this thing ‘reproduced’ itself inside the decontamination chamber? And, as we stepped up the current, it just … it just grew?”

I remember, years ago, thinking that Robert Armstrong was a stiff. I had only seen him in King Kong and thought he was a wooden plug of an actor (not as bad, perhaps, as Bruce Cabot, but still bad). But years later, I saw him in other films where he was just fine. Even in Son of Kong, he was decent. But no one, absolutely no one, can pull off a line like “Some big, hardboiled egg gets a look at a pretty face and bang, he cracks up and goes sappy!”

Even in a famous movie, clunky dialog can make an otherwise fine actor look lame. Alec Guinness and James Earl Jones may be able to pull off the unspeakable dialog of the original Star Wars, but for years, I thought Mark Hamill was a cardboard cut-out. It was only when seeing him in other projects that I realized Hamill could actually act. I had a similarly low opinion of Harrison Ford because of what he was forced to mouth in those three early franchise films. George Lucas did them no favors. 

There is certainly a range of talent in movies, and some genuinely untalented actors who got their parts by flashy looks or by sleeping with the producer. But I have come to the opinion that most (certainly not all) actors in Hollywood films are genuinely talented. Perhaps limited, but talented, and given a good script and a helpful director, they can do fine work. 

One thinks of Lon Chaney Jr., who is wooden, at best, as the Wolfman. But he is heartbreaking as Lenny in Of Mice and Men — perhaps the only chance he ever got to show off what he was actually capable of. 

“Lon Chaney was a stiff, but he had Lenny to redeem him,” said my brother Craig, when we discussed this question. Craig can be even more critical than me. 

He continued, “I’ve been trying to think of the worst actors ever — someone who has never said a word like a human being. There are a lot of people who got into a movie because they were famous for something else (Kareem Abdul-Jabbar, Joe Louis, Audie Murphy) so it’s hard to judge them fairly as actors, like you can’t criticize a pet dog for barking the national anthem but not hitting the high notes. But even Johnny Weissmuller was pretty effective in the first Tarzan; Elvis had Jailhouse Rock where he actually played a character; and Madonna can point to Desperately Seeking Susan without shame. (Everything else, shameful. There just isn’t enough shame in the world anymore.)

“There are any number of old cowboy stars who couldn’t speak a believable line of dialog, and that can’t be totally blamed on the writing. (Gabby Hayes rose above it.) There are bad actors who still had some life and energy about them that made them fun to watch. Colin Clive was silly, and he made me giggle, so he was entertaining. And Robert Armstrong. But there’s just no excuse for Bruce Cabot.

“I’ve never actually seen a Steven Seagal movie,” Craig said, “but I know enough to say with conviction that he should have been drowned as a baby.”

I said Craig can be tougher than me, but here, I have to concur. 

“It’s probably not fair to pick out silent movie actors for being silly and over the top, but there is Douglas Fairbanks to prove you can be silent and great.”

Silent acting was a whole different thing, and hard to judge nowadays. As different from modern film acting as film acting is from acting live on stage. The styles don’t often translate. John Barrymore was the most acclaimed Shakespearean actor in America in the early years of the 20th century, but his style on celluloid came across as pure ham. (Yes, he was often parodying himself on purpose, but that doesn’t alter what I am saying about acting styles.) 

Every once in a while, I see some poor slob I always thought was a horrible actor suddenly give an outstanding performance. Perhaps we have underestimated the importance of the casting director. A well-placed actor in a particular part can be surprising perfection. There is creativity in some casting offices that is itself an art form. You find the right look, voice, or body language, and a minor part becomes memorable. Some actors are wonderful in a limited range of roles. I can’t imagine Elisha Cook as a superhero, but he is perfect as a gunsel. 

And Weissmuller was the perfect Tarzan before his clumsy line reading became obvious in the Jungle Jim series. I am reminded of Dianne Wiest in Bullets Over Broadway: “No, no, don’t speak. Don’t speak. Please don’t speak. Please don’t speak.”

Keep in mind, actors are subject to so many things that aren’t in their control. In addition to good writing, they need a sympathetic director, decent lighting, thoughtful editing, even good costume design. Filmmaking is collaborative and it isn’t always the actor’s fault if he comes across like a Madame Tussaud waxwork. I’ve even seen Charlton Heston be good. 

In reality, I think of film actors much as I do major league ballplayers. The worst player on a major league team may be batting under the Mendoza line, and even leading the league in fielding errors, but in comparison with any other ballplayers, from college level to the minor leagues, he is superhumanly talented. Even Bob Uecker managed to hit a home run off Sandy Koufax. I doubt any of us could have done that. And so, we have to know who we’re comparing them to.

I saw a quote from Pia Zadora the other day (she just turned 70) and with justifiable humility, she said, “I am often compared to Meryl Streep, as in ‘Pia Zadora is no Meryl Streep.’” Still, compared to you or me, she is Bob Uecker. 

I have had to reassess my judgment of many actors. I had always thought of John Wayne as a movie star and not an actor. But I have to admit, part of my dislike of his acting was disgust over his despicable political beliefs. And I thought of him as the “cowboys and Indians” stereotype. 

But I have now looked at some of his films with a clearer eye, and realize that, yes, most of his films never asked anything more from him than to be John Wayne — essentially occupying a John Wayne puppet suit — but that when tasked by someone such as John Ford or Howard Hawks, he could actually inhabit a character. “Who knew the son-of-a-bitch could actually act!” Ford himself exclaimed. 

But there it is, in The Searchers, She Wore a Yellow Ribbon, Red River, The Quiet Man, Rio Bravo, The Shootist. Those were all true characterizations. (Does all that cancel out The Alamo or The Conqueror or The War Wagon, or balance all the undistinguished Westerns he made? We each have to judge for ourselves). 

Even in his early Monogram oaters, playing Singing Sandy, he brought a naturalness to his presence that is still exceptional in the genre. (And researching Wayne, my gasts were flabbered at how good-looking he was as a young man. So handsome he was almost pretty. And that hip-swinging gait that predates Marilyn Monroe. “It’s like Jell-O on springs.” It seems notable that something so feminine could become the model of such lumpen masculinity.)

And even great actors have turned in half-ass performances, or appeared in turkeys. In Jaws: The Revenge, Michael Caine has to utter things like, “Remind me to tell you about the time I took 100 nuns to Nairobi!” Caine famously said, “I have never seen the film, but by all accounts it was terrible. However I have seen the house that it built, and it is terrific.”

Even Olivier had his Clash of the Titans.

Actors have to work, and they don’t always get to choose. “An actor who is not working is not an actor,” said Christopher Walken. The more talented actors sometimes get to be picky, but the mid-range actor, talented though he or she may be, sometimes just needs the paycheck. 

I sometimes think of all the jobbing actors, the character actors of the 1930s working from picture to picture, or the familiar names on TV series from the ’50s and ’60s — the Royal Danos, the John Andersons, the Denver Pyles — dressed as a grizzled prospector for a week on one show, going home at night for dinner and watching Jack Benny on the tube, then driving back to the set the next morning and getting back into those duds. And then, the next week, dressing as a banker for another show, putting together a string of jobs to make a living. And all of them complete professionals, knowing what they were doing and giving the show what it needed. A huge amount of talent, without ever having their names above the title. 

Royal Dano times four

And so, I feel pity for those actors of equal talent who never broke through, or who were stuck in badly written movies and couldn’t show off their chops. When I watch reruns of old episodic TV, I pay a good deal more attention than I ever did when I was young to all the parts that go into making such a show. I notice the lighting, the editing, the directing, and most of all, the writing. The writing, more than anything else, seems to separate the good series from the mediocre ones. And how grateful those working actors must feel when they get a script that gives them a chance to shine. 

Is just being alive enough? It is a question I have been facing, with continued difficulty, ever since I retired a dozen years ago after 25 years as a newspaper writer. 

For all those years, and for the many years before, I held jobs that contributed, in some way — often small, even negligible — to the business of society. I had a sense of being productive. This is not to make any major claim about how important my production was. It was admittedly quite minor. But it was a contribution. 

Doing so was part of my sense of self: Being a productive member of society was not merely a way of occupying my time, but was actually a moral duty. If I were slacking off, I would be harming my society. And even worse, harming my immortal soul (something I don’t actually believe in).  

This is not something I thought much about on a conscious level. In fact, when I do think about it, I realize it’s quite silly. Society gets along quite well without my input. But it is buried deep down somewhere in my psyche that I must be productive. 

The opposite of being productive is being lazy. And I can’t help but feel that laziness is a moral failing. I have tried to excavate my brain to discover where this sense comes from and I cannot be sure. 

The easy answer comes up: “Protestant work ethic,” and it is true that I was raised in such an environment. But religion has never played an important part in my life. As I have said before, I have no religion; I’m not even an atheist. 

But somehow, I seem to have been injected with this guilt about not always doing something. Making something; teaching something; selling something; performing something. 

It is true that my grandparents, on both sides of the family, were quite religious. My father’s parents were even infected with a kind of Lutheran religious mania. They went to church three times a week and prayed constantly, and when my father and his siblings were young, before World War II, they were not allowed to listen to the radio, to listen to music, or to dance. In fact, this church-craziness led my father to promise never to inflict that kind of joyless religion on his children. 

And so, although we all went to church on Christmas and Easter, it was only to make my mother’s mother happy. She was religious in a more normal way, and was always kind and loving. But I and my two brothers managed to escape our childhoods without any religious sentiment at all. 

Or so it seems. While I have no supernatural beliefs — the whole idea of a god or gods seems pointless — something of the culture seems to have leaked in. 

For all of my 25 years at the newspaper, I averaged about three stories per week. I always felt as if I were slacking off and that I should be writing more. My editors constantly told me I was the most productive member of the features staff. But it never felt that way. Even on vacations, I took daily notes before going to bed, and used those notes to write travel stories for the paper when I got back to the office. 

Before I retired, I used the computerized database to check on my output and discovered I had written something like 3 million words during my tenure. If an average novel is about 90,000 words, that means I wrote the equivalent of more than 30 novels in that time. My last project for the paper was a 40,000-word history of architecture in Phoenix. 

And so, when I left my job, it was like stepping off a moving bus, racing to a halt and trying to keep my balance. 

My colleagues at the paper bought me a blog site as a retirement gift, and I began writing for it instead of the newspaper. At first, I was writing an average of three blog posts per week, unchanged from my time at work. 

I have slowed down greatly since then, and am now aiming for about three posts a month. I don’t always make that many. But I have written more than 750 blog entries in the 12 years since I left the newspaper, which is still more than one a week. And I also write a monthly essay for the online journal of the Spirit of the Senses salon group of Phoenix. That’s an additional 103 essays, each averaging about 1,500 words. Blog and journal, it all adds up to about an additional million and a half words written since giving up employment. Old writers never really retire; they just stop getting paid. 

And none of this is paid work. I write because I cannot not write. When I am not blogging, I am writing e-mails. Old-fashioned e-mails that are more like actual letters than the quick one- or two-sentence blips that constitute most e-mails. Scribble, scribble, scribble, eh, Mr. Nilsen? 

But that all brings me back to my original concern: Is just being alive enough? Can I in good conscience spend an hour or two sitting in my back yard and listening to the dozens of birds chattering on, watching the clouds form and reform as they sail across the sky dome, enjoying the random swaying of the tallest tree branches in the intermittent wind? Thinking unconnected thoughts and once in a while noticing that I am breathing?

In 1662, Lutheran hymnist Franz Joachim Burmeister wrote a hymn titled Es ist genug (“It is enough”) that Johann Sebastian Bach later set in one of his more famous cantatas. It is notable for including a tritone in its melody. And in 1935, Alban Berg incorporated it into his violin concerto, written “in memory of an angel” after the death of 18-year-old Manon Gropius, daughter of Alma Mahler. It is one of the most heartbreakingly beautiful musical compositions of the 20th century. Es ist genug

I remember reading that in India, the idealized life is understood to be a youth of play, an adulthood of work and an old age of seeking spiritual truths. That one is meant to lay down one’s tools and contemplate what it has all been about. And I take some comfort in the possibility that, at the age of 76, it is now my job no longer to produce, but to absorb all those things that were irrelevant to a normally productive life. To notice my own breathing; to feel the air on my skin; to recognize my tiny spot at the axis of my own infinitesimal consciousness in an expansive cosmos. To attempt to simply exist and to feel the existence as it passes.