I recently misplaced my camera on a trip. I don’t know where; I’ve checked and rechecked several places. So, it’s gone. These things happen. 

Luckily, I have some older cameras stuck in a drawer and I pulled them out, went through an orgy of recharging all their batteries, and now have six working cameras, of various types and talents. Mostly, though, I was shocked to realize how many cameras I have bought and used over the years. Indeed, along with all the newly recharged and working ones, the drawer also contained even older cameras that can no longer function at all — outdated technology. And all these were only the digital cameras. My history with the medium goes way back. 

 

I have gone through scores of cameras over the years, including a bunch I kept simply for the sake of collecting — an old Kodak Medalist, the Super Ikonta B, and the Exa 1. I have bought at least a half-dozen Argus C3s — a camera built like a brick with a lens plonked on — that I habitually gave out to friends so they could make pictures for themselves. 

But all that is just akin to collecting antiques. More immediately, it set me to thinking about a chronic disease I have suffered from through most of my life — a psychological problem. I have been a lifelong collector of cameras. But not just for the sake of collecting. It has been a lifetime of trading in what I had been using for what I believed would be a better tool. Of course, on the rational level, one knows perfectly well that the tool is only as good as the workman, and that a fancier camera isn’t going to make better photographs for me. Still, I constantly drooled over whatever was higher up on the photographic food chain. 

And I started pretty low: When I was 10, I got a Kodak Hawkeye Brownie camera, a little plastic box with a tiny fixed lens. It used roll film, and when I had shot my 12 pictures, I took the roll to the drugstore to have it developed and printed. What I got back were three-and-a-half-inch, deckle-edged prints of fuzzy images with a tiny date printed in the margin. These were the standard family snapshots of the era — the late ’50s and early ’60s. 

 

Then, over summer school vacation in 1965, I accompanied my grandmother on a trip across the Atlantic to visit her birth town in Norway, and I wanted a more “professional” kind of camera to take with me. This was truly the start of my neurosis. I visited the local camera store where the clerk — a doughy old smoker of smelly cigarettes — found me a used Praktica 35mm single lens reflex camera. This was the kind of camera with a flipping mirror behind the lens that popped up as you snapped the shutter button and exposed the film. Even then, most SLRs had a prism affixed so you could look through it to focus and frame. But my Praktica was of an older vintage. You had to hold the camera at chest level and look down into the ground glass viewfinder — a viewfinder exactly an inch by an inch-and-a-half, for a very tiny preview of what you were photographing. As you turned the knob to the next exposure, you also re-lowered your mirror. 

In high school, I was a nerdy sort, and one of the AV team that set up projectors in classrooms for the teachers, or operated the reel-to-reel tape recorders. And I had a special job as the photographer for the school newspaper, and for that I had the school’s Crown Graphic — an old Graflex camera with the 4-by-5-inch plate holders. And I had access to the school darkroom, where I processed the giant negatives and made prints for the paper. 

So, I knew I wanted a better camera for myself. My best friend had his own 35mm SLR, and it was a Miranda D. Oh, how I wanted one of my own, with its pentaprism on top, and a lever to advance the film instead of the windy-knob. I never got one. I couldn’t save up enough money. 

But, with a trade-in of the old Praktica and some saved allowance cash, I could get a used Pentax Spotmatic, of the old, screw-mount lens variety. It got me through college. But there was always in the back of my mind this nagging need to have the best — the Nikon F, the top of the line in Single Lens Reflex technology. But they cost so much. I pined. 

(I always seem to have had Pentaxes as a fallback. I’ve owned maybe a half dozen of them over the years and it was a Pentax that I took with me on our first trip through the West in 1981. A solid workmanlike camera.) 

After I graduated, I got a job as a clerk in a camera store, working alongside a wizened old pro, who smoked (more accurately, chewed on) truly nasty cigars. And with my earnings, I was able to trade my Pentax in for the Nikon (used). You’d think that was enough. But the prize Nikon came with the 55mm macro lens and the gorgeous 85mm long-focus lens. Had to have those. 

But I knew my photographic heroes worked with Leicas, and so my heart was set on an ever-higher rung: I wanted, somehow, to own a Leica. I magically came across an ancient Leica D, vintage 1937, being sold for a ridiculously cheap price — something like $50 (the going price for one on eBay now runs about $1,500). Mine came with the famed f/3.5 Elmar lens, and was an all-black model. I felt I had hit the jackpot. 

The problem was such an old camera was missing some regular amenities, like a flash connection. And then one of our customers, a wealthy businessman who was a collector of Leicas, offered me a more modern Leica IIIf (“red dial,” ca. 1956) in an even trade for my old thing. It was chrome, not black, and had more shutter speeds and a flash connection. I took the deal. 

The problem was that the then-current Leica was the bigger, better M3, and I knew I had to have that. I also began collecting Leica accessories for my IIIf — telephoto and wide angle lenses, a light meter, a leather camera case — and I would have to give them all up for the upgrade. 

And so, I upgraded. Then there was the M4, which was even better. Imagine how good my photos would be if only I had the Leica M4. 

I had a new problem, however. The kind of photographs I wanted to make were highly detailed, sharply focused images, like those of Ansel Adams or Edward Weston, and the small, grainier 35-millimeter film I was using would never, ever achieve that look. I had to have a bigger negative. 

And so began the climb up to the 120 film size with its two-and-a-quarter-inch square negatives. I got a Rolleicord, which was a highly respected twin-lens reflex camera (one lens above the other; one to focus with, one to make the image on the film). But, of course, the Rolleicord was the lower-price version of the even-better Rolleiflex. Huff and puff. More trading and I got the Rolleiflex with the f/2.8 lens. But while I got the bigger negatives, I had regressed to looking down, at waist level, into the ground glass on the top of the camera, where the mirror sent the image. 

And so, the next insanity was to lust for the medium format SLR, and the top of the heap was the camera we all considered the BMW of cameras: The Hasselblad. This Swedish camera was what the Big Boys used. Avedon; Penn. 

But by this time, I was no longer working at the camera store, and the Hasselblad cost as much as a Volkswagen (or it seemed like it). Eventually, I found a used one and felt I had reached the pinnacle. I owned a Hasselblad. 

Unfortunately, just at that point, I went through the equivalent of a divorce, moved from North Carolina to Seattle, went unemployed for a bit and had to sell my Hasselblad. I went through some hard times, had to sell my darkroom equipment and was left with a series of Pentax cameras (newer vintage, with click-mounted lenses). My life was saved when I met my second official wife (we were married for 35 years, until her death seven years ago), moved to Phoenix and got my dream job writing for the daily newspaper and taking my own photos for my stories. 

You would think that would be the end. But ye of little faith (or too much). The small 35mm camera was fine for my newspaper work, but for my personal photography, I began to think about large-format cameras. Since using the 4x5 Graflex in high school, I had had it in mind to finally acquire a large-format field camera, and I got a really nice Toyo, with the Super-Angulon lens — my idea of the perfect lens for the perfect camera. I dragged it around the desert to make landscapes. 

But — you knew this was inevitable — Edward Weston used an even bigger camera, one using 8-by-10 inch film. This was truly a camera the size of a Volvo. It weighed as much as a set of barbells. I found an old, beat-up Deardorff with an uncoated lens and three film holders. 

Of course, that is exactly when the photographic world went digital. And the whole stupid thing started all over again. I bought a 2-megapixel Nikon Coolpix 800. Which led to a 3.5-megapixel Coolpix 880, a Canon ELPH (the size of a cigarette pack), and on it went. Manufacturers kept upping the megapixelage for sharper images at larger sizes, and I kept up with them. 

I found what felt like the perfect camera, the Nikon Coolpix P300, with its 12.2 megapixels, a high-contrast black-and-white mode, and the ability to make panoramic images. I thought it was my final camera. 

And it would have been, I believe, except that after several years, it stopped working. I needed to replace it, but by then it was no longer being made. I had to find something else. What I found was the Panasonic Lumix DMC-FX35, the smallest, flattest pocket-size camera I knew of. It had an exceptionally wide lens and a 10-megapixel sensor. I loved that camera. But then, for some reason, it developed a dark spot in the image, like a dust speck on the lens. But no matter how much I cleaned the lens, it wouldn’t go away. I bought the newer 12-megapixel Lumix DMC-FX48, which was otherwise identical to its predecessor. It eventually got the spot, too. 

I was taken by the idea that a larger sensor might improve the image (larger physically, not just in number of pixels), and bought the Olympus Pen E-PL1, a 12.2-megapixel box with interchangeable lenses. I also bought an Olympus FE-5020 point-and-shoot for my wife (also 12 megapixels, but about one-fifth the size). 

I also needed a full-size SLR, and my old Canon EOS Rebel was quite outdated. Since my wife and I were taking a trip to Alaska, I decided to replace it with an improved model — the Canon T5, with its 18 megapixels — along with extra wide-angle and extra telephoto lenses. It is still my go-to camera for most important things. 

And there it would have ended but for my need for something smaller than the T5 or the Olympus Pen, something I could slide into my pocket. And so, I ended up with the Canon PowerShot SX610 HS (the names for camera models are just as insane as those for proprietary drugs). It was 20 megapixels. I say “was,” because that is the camera I lost. I hope that it will turn up some day. 

When I was young, I thought that each time I upgraded my camera, my pictures would get better. And, you know, over the years, from 1965 to 2024, it was true that my pictures got better. Strangely, though, I now recognize that any improvement wasn’t because I got better cameras, but because, after tens of thousands of images, I became a better photographer, merely through practice. It never was the camera. (I’ve made some of my favorite images with a $10 Diana toy camera). 

In fact, the fancier new model digital cameras get, the more bells and whistles, the less I want. If only I could find a point-and-shoot small enough for my pocket with a wide-angle lens (no zoom needed), and no extra “filters,” no face recognition, no video mode — just a very basic still camera — I would be very happy. I think. 

Addendum: I know at least a couple of retired newspaper photographers who may recognize this pattern in themselves. I believe it is largely a male thing and extends well beyond merely photographic equipment. I know of men who started out with a Schwinn, graduated to a VW Beetle, moved on to a Ford Falcon station wagon (with a mattress in the back), then to a Chevy, to a Honda, on to a BMW, and finally to owning two Porsches (one in the back yard for parts to keep the other one running). 

But it could be anything, from knives or guns, to audiophile stereo systems and expensive speakers. Since I do the cooking in the house, I have gone through something like it with cookware, finally owning top-of-the-line skillets and mixing bowls. 

Women may show a similar climb as they move on from husband No. 1, each time getting a better, more mature, responsible and thoughtful model, finally achieving what is usually called a “keeper.” 

Excelsior! Ever onward and upward. 

I have traveled more widely than most, I believe. I was a travel writer for my newspaper (among other things). But even outside of vocational duties, travel has been central to my life. My late wife and I were on the move considerably, visiting each of the 48 contiguous states many times and going outside the country when we could. In the summer of 1981, for instance, we put 10,000 miles on the car while driving around the country. We were both teachers, and the summer gave us the chance to wander. 

There are places, however, that I have gone back to, over and over, throughout my life, not through mere happenstance, but because they have been meaningful. They are destinations for non-religious pilgrimages. 

Meaning can be hard to define. On first thought, one thinks “meaning” implies a second message: “This” means “that.” “I know what he said, but what did he mean?” That sort of thing.

But meaning has a more personal existence, a psychological one. Something can have meaning even if you don’t know what it means. I suppose you might otherwise call this significance. And there are places I visit over and over because they bear the weight of personal significance. They have meaning. And I go back to them. 

Bodhgaya

It isn’t just me. The world is full of such places, some personal, some cultural, some religious or spiritual. They can be sacred spaces or holy ground. They have accrued some emotional hold on those for whom this kind of significance has meaning. It can be the Buddhist bo tree, the Dome of the Rock, the Western Wall, Independence Hall. For a few of us, the list would include Fenway Park. These are places of social significance and people will make the hajj to see them out of devotion or just to absorb the numinous halo surrounding them. Life is empty without meaning, although what we call meaning and where we might discover it is personal. 

Meaningful places needn’t be so to groups, or to have religious importance. I have no belief in ghosts, spirits or ouija boards and I don’t believe that the past hangs on to the present to make itself palpable. But I have several times experienced a kind of emotional resonance when visiting certain famous sites. 

Normandy beach

The thought re-emerged while watching the recent commemoration of the 80th anniversary of the D-Day landings in Normandy. I have visited those beaches and had an overwhelming rush of intense sadness. It was inevitable to imagine the thousands of soldiers rushing up the sands into hellish gunfire, to imagine a thousand ships in the now-calm waters I saw on a sunny day, to feel the presence in the concrete bunkers of the German soldiers fated to die there manning their guns. 

The effect is entirely psychological, of course. If some child with no historical knowledge of the events that took place there were to walk the wide beach, he would no doubt think only of the waves and water and, perhaps, the sand castles to be formed from the sand. There is no eerie presence hanging in the salt air. The planet does not record, or for that matter, much note, the miseries humans inflict on each other, and have done for millennia. But for those who have a historical sense, the misery reasserts itself. Imagination brings to mind the whole of human agony.  

Perhaps I should not say that the earth does not remember. It can, in certain ways. Visiting the woods of Verdun in France, site of a horrendous battle in World War I, I saw the uneven forest floor, where time has only partially filled in the shell craters. Once the trees were flattened by artillery, leaving a moonscape littered with corpses. The trees have grown back, but the craters are still discernible in the wavy forest floor. The same child who thought of sand castles in Normandy might well think the churned land at Verdun merely a quirk of geology, but the land itself bears its scars. 

The presence of the dead was overwhelming at Antietam, site of the bloodiest battle during America’s Civil War. In one spot alone, a 200-yard stretch called Bloody Lane, 5,000 men were blown apart in a few short hours. 

Before Sept. 17, 1862, the brief dirt drive was called the Sunken Road, and it was a shortcut between two farm roads near Sharpsburg, Md. All around were cornfields rolling up and down on the hilly Appalachian landscape.

The narrow dirt road, depressed into the ground like a cattle chute, now seems more like a mass grave than a road. And it was just that when Confederate soldiers mowed down the advancing Federals and were in turn mowed down. The slaughter was unimaginable.

You can see it in the photographs made a few days after the battle. The soldiers fill the sunken road like executed Polish Jews. It was so bad, as one Union private said, “You could walk from one end of Bloody Lane to the other on dead soldiers and your feet would never touch the ground.”

It is difficult to stand now in Bloody Lane and not feel that all the soldiers are still there, perhaps not as ghosts, but as a presence under your boot-sole, there, as blood soaked into the dirt.

But it needn’t be horror or blood that gives meaning to a place. It may be achieved through personal association with something we lived through, or through esthetic appreciation — the sudden awareness of sublimity — or through an awareness of a less traumatic history born in a landscape. And I thought about places I have gone back to over and over; places that bear significance to me. I’m sure most of us could make such a list, whether long or short. Here are 10 of my personal pilgrimage sites that I have visited and revisited over a long life.  

Walden Pond — I cannot count the number of times I have visited Concord, Mass., or how often I have made the pilgrimage to the glacial kettle lake just outside downtown where Henry David Thoreau built his cabin and lived for two years, and which led him to write his book about the experience. 

The most recent visit was only two years ago, passing through on my way to friends in Maine. The site now has a large parking lot and a visitor center (although the parking is primarily for summertime local beachgoers, who use Walden Pond as a swimming hole). When I first saw it, there was not much there but the water and the woods; I had to park alongside the road. 

I have circumambulated the pond, a walk of just under two miles, the first time in the early morning when a mist hung over the water and the sun slowly burned through. I have read Walden several times, and own several editions, both cheap and deluxe, and Thoreau’s other books, including his Journals, and eaten up his idiosyncratic style of writing with relish. 

Walpi

Hopi Mesas — In northern Arizona, the Hopi have built their towns primarily on three mesas, First, Second, and Third, which are really the southern fingers of the larger Black Mesa. We have visited all three mesas many times, including one snowy Christmas spent with a Hopi family we knew on First Mesa in the village of Walpi. The warmth of the hearthfire in the stone house, and the cookies we were offered, and the smiles on the faces of the children are indelible. 

Another time, we were invited to a social dance on Third Mesa in Old Oraibi, and climbed to the roof of one of the houses with the rest of the Hopi to watch and cheer the Kachina dancers. Another time, driving past New Oraibi, at the foot of Third Mesa, we were caught as traffic was halted so the sacred Kachina dancers could cross the road from where they emerged from the kiva and marched toward the plaza. We were not supposed to be there but couldn’t leave, with the Kachina traffic cop in front of us and several cars behind. We apologized profusely, but the angry cop didn’t seem to care. We shouldn’t have been there, but we also couldn’t have known a sacred dance had been scheduled for that day. 

You can stand at the cliff edge on First Mesa and look south over the Navajo Reservation, nearly to Flagstaff, and marvel at the intense beauty of the Colorado Plateau. 

Chartres Cathedral — It wasn’t until our second visit to France that we managed to get to Chartres, but after that we went back over and over. The cathedral is, of course, a World Heritage Site, and a sacred place to many. But it spoke to me less of religion than of history — architecture 800 years old and still functioning for its original purpose. 

Most of the cathedrals and basilicas of Northern France have been restored and reworked (Notre Dame, before the fire, was largely re-imagined in the 19th century as it was restored by Viollet-le-Duc) but Chartres is almost entirely original. If you are sensitive to it, you can feel all that passage of time embodied in the stonework and the interior space. 

The cathedral sits at the top of a hill at the center of town, and can be seen in the distance from miles around. I have spent hours sitting in the transept meditating over the great north rose window, which remains the single most beautiful manmade object I have ever seen. The entire experience engenders awe. 

American Museum of Natural History — I was originally going to list New York City here, but then narrowed it down to Manhattan, but, really, the center of magic for me is the Natural History Museum. I first visited on a third-grade school trip and fell in love with the dinosaurs. But all through my childhood and adolescence I visited the museum as often as I could. I loved the dioramas, the dinosaur bones, the giant stone Olmec head, the huge suspended blue whale — even the “Soil Profiles of New York State.” To say nothing of the room full of chunks of quartz, each with a little typed tag explaining where it was collected. 

I could also have mentioned the George Washington Bridge, which my grandfather helped build; he was an engineer working on the bridge in the 1930s. Through most of my life, when living in or visiting New Jersey, I would take the bus to the bridge and walk across to 178th St. I have walked the bridge too many times to count. 

The Outer Banks — I went to Guilford College in North Carolina beginning in 1966, where I soon met my lifelong friend Alexander, and we made annual camping trips to Cape Hatteras, usually in winter when the beaches were empty. Later, my first wife and I spent our honeymoon camped directly under the lighthouse. It was February and the regular campsites were closed, and so we pitched our tent in the dunes. 

I have been back over and over, with each succeeding wife or POSSLQ, although as I got to be old, we tended to sleep in motel rooms. When my brother began teaching in Virginia, he lived at the northernmost bit of the Outer Banks, Sandbridge, and when visiting I often made a side-trip back down to Hatteras and Ocracoke. 

Once, with Alexander, we went after a huge storm, and parts of the road were washed out just north of the lighthouse. He got out of the car and walked in front, feeling the asphalt under his feet and leading me safely past the overwash. Another time, at night, we walked down to Cape Point with a Coleman lamp projecting our shadows, like giants, up into the misty black sky. 

Grand Canyon — My brother once observed that unlike most hyped destinations, where you are always at least a tiny bit disappointed when you finally get there, the Grand Canyon is actually more impressive, more overwhelming, when you actually see it live: It never lets you down. It is 200 miles of vast geology, color, and depth. 

We lived in Arizona for 25 years and so I cannot number the times we visited the Canyon, mostly the South Rim, where most of the tourist action happens. But the second time we came, in 1982, we went to the North Rim and when we couldn’t get a room at the hotel there (it is always full), we went out of the park after dusk into the national forest, where it is legal to camp anywhere, and pulled into a side road in the dark. In the morning, we got out of the tent and discovered that if I had backed the car up 10 feet farther, we would have tumbled down into the canyon. We were right on the edge. 

And once, on assignment from the paper, I drove 60 miles off road to the Toroweap Overlook, also on the North Rim, but in pure wilderness where only a rare person ventures, and camped away from all city lights, where the night sky was neon in intensity. 

 

Hudson River Valley — I grew up in northern New Jersey, in the apotheosis of suburbia. But my father’s family had a rustic house — a “bungalow” — in West Park, N.Y., halfway up the Hudson River, and we spent many summer vacations, and at least one winter ski vacation, at the bungalow. It was the town where the once-famous nature writer John Burroughs lived. There were woods, and a swimming hole with a waterfall. 

But up and down the river, from Dunderberg through Bear Mountain and the Seven Lakes Drive, the valley was our escape from the ordinariness of the suburbs. We regularly went swimming in the summer at Lake Welch, or watched ski jumping at Bear Mountain. 

In 1986, we drove up Perkins Drive to the top of Bear Mountain, during an outbreak of gypsy moth caterpillars. The tower at the top was covered in a sheath of hairy worms and the ground was gooey with the squashed ’pillars. It was eerie and more than a little stomach churning. 

The bungalow at West Park

The Nilsen family drove up Route 9W on summer weekends to visit the bungalow, and the three-lane highway had to curl around Storm King Mountain, a stretch with a sheer drop down to the river that made my cautious father extremely nervous. That portion of 9W has since been re-routed, and the treacherous third (middle) lane is long gone. Near Haverstraw, the “Ghost Fleet” of WWII-era Liberty ships was moored. The ships are gone now. So is the bungalow. I am sorry for that. 

From the Palisades up through the Catskill Mountains, the Hudson River is holy ground, as far as I am concerned. It glows with the inner light of myth. 

Schoodic Point — Maine is its own mythology, and like the city of New York, I was going to include the whole state in my list, but likewise, I have narrowed it down to the place that I have been back to over and over for the longest. I first visited Maine as an infant when my parents took a trip there. I can’t say I have any memory of it. But subsequently, I have been there many times, now to visit my friends Alexander and Mary Lou. I’ve traveled the whole state over, camped at Mt. Katahdin, driven round Mooselookmeguntic Lake, reached the summit of Cadillac Mountain on Mt. Desert Island, taken the tour of the paper mill in Millinocket — and you can get there from here. 

But the mythic center of Maine for me has been the rocky bit of coastline that juts out into Frenchman’s Bay north of Acadia National Park, called Schoodic (with the double “O” pronounced as the “oo” in “good.”) It is one of those windswept romantic landscapes where the spume blows into your face, the waves crash against rock with an explosive boom, and the sky, water, and land still seem of one substance. 

Giverny — I’ve been to Monet’s house and gardens three times, once in the spring and twice in the fall. He and his family lived in the small town 50 miles north of Paris, from 1883 until his death in 1926, and many of his greatest works were made there, primarily his extensive series of water lily paintings. 

I had become familiar with some of those wide canvases in museums in New York and at the Carnegie Museum in Pittsburgh. And I had been inspired to make scores of photographs of waterlilies in imitation of Monet’s paintings. 

 

But the chance to see the genuine article in situ, in the grand gardens at Giverny, was a pilgrimage from the get-go. One enters, of course, with curiosity, but also with reverence. 

Austin house

Rock Castle Creek, Woolwine, Va. — Some of the happiest and most unfettered moments of my life have been spent on the porch of the Austin house along Rock Castle Creek in Virginia. I have been back many times, beginning during my college years, when a group of Guilford students hiked the five or so miles up the creek, crossing the water on logs, and reaching the 1916 farmhouse, with its spring house and barn. 

With my companion, or with a group, we would break into the house (not recommended, for legal reasons) and roll out our sleeping bags on the floor. At night in the summer, the field in front of the house was a galaxy of blinking lightning bugs. 

At the end of Bergman’s film Wild Strawberries, the old professor, Isak Borg, lies in bed after a tumultuous day, full of cares and regrets, and in voice-over says, “If I have been worried or sad during the day, it often calms me to recall childhood memories,” and he thinks of a time he saw his parents together, with his father fishing. It soothes him and he sleeps peacefully. For me, the moment of pure calm happiness that I recall is of sharing a hammock with my beautiful red-headed mate on the second-floor porch of the Austin house and watching the sun go down over the creek and meadows beyond. Such moments consecrate a place. 

 

These are all places I have gone back to multiple times. But there are places where I have only ventured once that still have that emotional buzz that signifies a sacralized locale. These may be places set aside by history or by personal experience, or simply by their extraordinary natural setting. 

We all have such places, and while they may be widely shared, such as the World Trade Center, they are just as often special only to one or a few people. Places where we stand in awe, and may not even be able to speak. 

From top right: Cape of Good Hope; Gaspe; Montauk; Finisterre

For me, many have been where the land runs out and the seas begin. There is something about the extremity, the sense of the limitations of land and the seeming infinity of the watery horizon. I seek them out: the Gaspé Peninsula, Long Island’s Montauk Point, Brittany’s Finisterre, South Africa’s Cape of Good Hope. At my age, I know my own existence shall soon run out, with the dark infinite void as its horizon, and that I will likely never get to visit such places again. 

And I have been to Civil War sites where I felt the ghosts in the soil: places such as Shiloh, Vicksburg, Appomattox, Gettysburg, Petersburg, Bull Run, Chancellorsville, Fredericksburg, and Five Forks. And important places in the Western expansion: Sand Creek, Little Bighorn, Wounded Knee, Washita. Japanese internment camps at Manzanar and Poston. And to the Lorraine Motel in Memphis, where Dr. King was shot; and the slave quarters at the Oakley Plantation in Louisiana. All places where death and suffering are felt in the air breathed there — at least to anyone with an awareness of history. 

Mont Saint-Michel

And I could list many less fraught places with their own resonance that I have only visited once, but that have created space in my psyche: Mont Saint-Michel; Big Bend National Park; the catacombs in Paris; the caves at Lascaux; the town of Moosonee on James Bay in Ontario; the Okefenokee swamp in Georgia; Mt. Angeles in Washington’s Olympic Mountains; the Hallingskarvet mountains in the spoon of Norway; Chaco Canyon; Taliesin in Wisconsin; Ice Water Springs in the Smokies; the Salton Sea; Glacier Bay in Alaska. 

Each of these places has become a part of me. They are my sacred spaces. 


The venerable writer John McPhee wrote a short, episodic memoir for the May 20, 2024 edition of The New Yorker, and in it he discussed proofreading. The piece hit home with me. 

I began my career at The Arizona Republic as a copy editor, which is not exactly the same thing as a proofreader, but many of the duties overlap, and many of the headaches are the same. 

A proofreader, by and large, works for a book publisher and double-checks the galley proofs of a work for typos and grammatical errors. The work has already been typeset and a version printed, which is what the proofreader goes over. 

A copy editor works for a magazine or newspaper and is usually one of a team of such editors, who read stories before they are typeset and check not only spelling and grammar, but factual material and legal issues, to say nothing of that great bugbear, the math. English majors are not generally the greatest when dealing with statistics, percentages, fractions — or for that matter, addition or subtraction. 

Arizona Republic staff, ca. 1990

In a newspaper the size of The Republic, a reporter turns in a story (usually assigned by the section editor) and that editor then reads it through to make sure all the necessary parts are included, and that the presentation flows in a sensible manner. Section editors are very busy people, dealing with personnel issues (reporters can be quite prissy); planning issues (what will we write about on July 4 this year); remembering what has been covered in the past, so we don’t duplicate what has been done; dealing with upper management (most of whom have never actually worked as reporters) and who have “ideas” that are often goofy and unworkable; and, god help them, they attend meetings. They cannot waste their time over tiny details. Bigger fish to fry. 

Once the section editor has OKed a piece it goes on to the copy editors, those troglodyte minions hunched over their desks, who then nitpick the story, not only for spelling and style — the Associated Press stylebook can be quite idiosyncratic and counterintuitive — but also for missing bits or mis-used vocabulary, and double-checking names and addresses. A copy editor is a rare beast, expected to know not only how to spell “accommodate,” but also who succeeded Charles V in the Holy Roman Empire (Ferdinand I, by the way). 

The copy editor then hands the story over to the Slot. (I love the more arcane features of any specialized vocation). The Slot is the boss of the copy desk. In the old days, before computers, copy editors traditionally sat around the edge of a circular or oblong desk with a “slot” in the center where the head copy editor sat, collecting the stories from the ring of hobbits surrounding him. He gave the stories a final read-through, catching anything the previous readers may have missed. Later the story would be given a headline by a copy editor and that headline given a final OK by the Slot. Only then would the story be typeset. 

That means a typical newspaper story is read at least four times before it is printed. Nevertheless, there will always be mistakes. Consider The New York Times. Every typo that gets through generates angry letters to the editor demanding, “Don’t you people have proofreaders?” Well, we have copy editors. And why don’t you try to publish a newspaper every day with more words in it than the Bible and see how “perfect” you are? Typos happen. “The best laid schemes o’ Mice an’ Men Gang aft agley.” 

When I was first hired as a troglodyte minion, I had no experience in journalism (or very little, having spent time in the trenches of a weekly Black newspaper in Greensboro, N.C., which was a very different experience from a big-city daily) and didn’t fully understand what my job entailed. I thought I was supposed to make a reporter’s writing better, and so I habitually re-wrote stories, often shifting paragraphs around wholesale, altering words and word order, and cutting superfluous verbiage. That I wasn’t caught earlier and corrected tells me I must have been making the stories better. 

There was one particular movie critic who had some serious difficulty with her mother tongue and wrote long, run-on sentences, some of which were missing verbs, or were full of unsupported claims easily debunked. (I hear an echo of her style in the speeches of Donald Trump). I regularly rewrote her movie reviews from top to bottom, attempting to make English out of them. 

One day, I was a bit fed up, and e-messaged the section editor that the critic’s review was gibberish and I included the phrase, “typewriters of the gods.” Unfortunately the reviewer was standing over the desk of the section editor and saw my sarcastic description and became outraged. I had to apologize to the movie critic and stop rewriting her work. 

Lucky for me, the fact that I could make stories better brought me to the attention of the section chiefs and I was promoted off the copy desk and into a position as a writer — specifically, I became the art critic (and travel writer, and dance critic and architecture critic, and classical music critic, and anything else I thought of). I’m sure the other copy editors and the Slot were delighted to see the back of me.

That is, until they had to tackle copy editing my stories. I had a few idiosyncrasies of my own. 

Here I must make a distinction between a reporter and a writer. I was never a reporter, and was never very good at that part of the job. Reporters are interested primarily in collecting information and fact. Some of them can write a coherent sentence, but that is definitely subordinate to their ability to ferret out essential facts and relate them to other facts. A reporter who is also a good writer is a wonder to behold. (In the famous team of Carl Bernstein and Bob Woodward, the latter was a great reporter and mediocre wordsmith — as his later books demonstrate. Bernstein was a stylish writer. Together they functioned as a whole). 

I was, however, a writer, which meant that my primary talent and purpose was to put words into an order that was pleasant to read. I love words. From the second grade on, I collected a vocabulary at least twice as large as the average literate reader, and what is more, I loved to employ that vocabulary. Words, words, words.

And so, when my stories passed through the section editor and got to the copy desk, the minions were oft perplexed by what I had written. Not that their vocabularies were any smaller than mine, but that such words were hardly ever printed in a newspaper. I once used “paradiddle” in a review and the signal went up from the copy desk to the section editor, who came to me. We hashed it out. I proved to her that the word was, indeed, in the dictionary, and the word descended back down the food chain to the copy desk, where it was left alone. 

But this led to a bit of a prank on my part. For a period of about six months (I don’t remember too clearly exactly, back in the Pleistocene when this occurred) I included at least one made-up word in every story I wrote. It was a little game we played. These words were always understandable in context, and were often something onomatopoetic and meant to be mildly comic (“He went kerflurfing around,” or “She tried to swallow what he said, but ended up gaggifying on the obvious lies”). For those six months, a compliant copy desk let me get away with every one of them. Every. Single. One. Copy editors, despite their terrifying reputation, can be flexible. Or at least they threw up their hands and got on to more important matters.

I will be forever grateful to my editors, who basically let me get away with murder, and the copy desk at The Arizona Republic, for allowing me to write the way I wanted (and pretty much the only way I knew how). Editors, of both stripes, will always be my heroes. 

John McPhee

Back to John McPhee. He describes the difficulty of spotting typos. Of course most are easily caught. But often the eye scans over familiar phrases so quickly that mistakes become invisible. In a recent blog, I wrote about Salman Rushdie’s newest book, Knife, and I had its subtitle as “Meditations After and Attempted Murder.” I reread my blog entries at least three times before posting them, in order to catch those little buggers that attempt to sneak through. But I missed the “and” apparently because, as part of a phrase that we use many times a day, the eye reads the shape of the phrase rather than the individual words and letters. 

There is a common saying amongst writers: “Everyone needs a copy editor,” and when I retired from The Republic, I lost the use, aid, and salvation of a copy desk. I had to rely on myself, reading and rereading my copy. But typos still get through. And on the day after I post something new, I will sometimes get an e-mail from my sister-in-law pointing out a goof. She let me know about my Rushdie “and,” and I went back into the text and corrected it (something not possible after a newspaper is printed and delivered). She has saved me from my mistakes many, many times, and has become my de facto copy editor. 

But my training as both writer and copy editor has served me well. Unlike so many other blog posters, I double-check all name spellings and addresses, my math and my facts. I am quite punctilious about grammar and usage. And even though it is no longer required, I am so used to having AP style drilled into me, I tend to fall in line like an obedient recruit. 

In his story, McPhee details trouble he has had with book covers that sometimes misrepresented his content. And that hit me right in the whammy. One of the worst experiences I ever had with the management class came when I went to South Africa in the 1980s. Apartheid there was beginning to falter, but was still the law. I noticed that racial laws were taken very seriously in the Afrikaner portions of the country, but quite relaxed in the English-speaking sections. 

And I wrote a long cover piece for the Sunday editorial section of the paper about the character of the Afrikaner and the racial tensions I found in Pretoria, Johannesburg and Cape Town. The Afrikaner tended to be bull-headed, bigoted and unreflective. And I wrote my piece about that (and the uniformed fascist storm troopers I witnessed threatening the customers at a bar in Nelspruit). The difference between the northern and eastern half of South Africa and its southern and western half was like two different countries. 

As I was leaving the office on Friday evening, I saw the layout for the section cover and my story, and the editor had found the perfect illustration for my story — a large belly-proud Afrikaner farmer standing behind his plow, wiping the sweat from his brow and looking self-satisfied and as unmovable as a Green Bay defensive tackle. No one was going to tell him what to do. “Great,” I thought. “Perfect image.”

But when I got my paper on Sunday, the photo wasn’t there; it had been replaced by a large Black woman waiting dejectedly at a bus station with her baggage and duffle. My heart sank. 

When I got back to the office on Monday, I asked. “Howard,” was the reply. Earlier in this blog, I mentioned management, with which the writer class is in never-ending enmity. Howard Finberg had been brought to the paper to oversee a redesign of The Republic’s look — its typeface choices, its column width, its use of photos and its logos — and somehow managed to weasel his way permanently into the power structure. He was one of those alpha-males who will throw his weight around even when he doesn’t know or understand diddly. I will never forgive him. 

He had seen the layout of my story and decided that the big Afrikaner, as white as any redneck, simply “didn’t say Africa.” And so he found the old Black woman that he thought would convey the sense of the continent. Never mind that my story was particularly about white South Africa. Never mind that he hadn’t taken the time to actually read the story. That kind of superficial marketing mentality always drives me nuts, but it ruined a perfectly good page for me. Did I say, I will never forgive him? 

It reminds me of one more thing about management. In the early 2000s, when The Republic had been taken over by the Gannett newspaper chain, management posted all over our office, on all floors, a “mission statement.” It was written in pure management-ese (which I call “manglish”) and was so diffuse and meaningless, full of “synergies” and “goals” and “leverage,” that I said, “If I wrote like that, I’d be out of a job.” 

How can those in charge of journalism be so out of touch with the language which is a newspaper’s bread and butter? 

These people live in a very different world — a different planet — from you and me. I imagine them packed off, courtesy of Douglas Adams, on the spaceship with the telephone sanitizers, management consultants, and marketing executives, sent to a tiny forgotten corner of the universe where they can do less harm. 

One final indignity they have perpetrated: They have eliminated copy editors as an unnecessary cost. When I retired from the newspaper, reporters were simply asked to have another writer check their work. A profession is dying and the lights are winking out all over Journalandia. 


Of all the pop psychology detritus that litters our culture, none bothers me more than the fatuous idea of “closure.” People talk about it as if it were not only a real thing, but an obvious one. But “closure” is a purely literary concept, ill suited to describe the actual events of our lives. 

By “literary,” I mean that it fulfills the esthetic necessity we humans feel to round out a story. A story must have a beginning, a middle, and an end (“but not necessarily in that order,” said French filmmaker Jean-Luc Godard). For each of us, our “life story” is a kind of proto-fiction we create from the remembered episodes of our lives. We are, of course, the hero of our own life story, and the supporting characters all fit into a tidy plot. 

But, of course, actual life is not like that. Rather it is a bee-swarm of interconnecting and interacting prismatic moments seen from the billion points of view of a riotously populated planet. There is no story, only buzzing activity. Eight billion points of view — and that is only counting the human ones. One assumes animals and even plants have their own points of view and no narrative can begin to encompass it all. It is all simply churn. 

Of course, there are anecdotes, which are meant to be stories, and end, usually, with a punchline. Like a joke, they are self-contained. But our lives are not anecdotes, and tragedies, traumas, loss, are not self-contained. There is no punchline.

So, there is a smugness in the very idea that we can write “fin” at the completion of a story arc and pretend it means something real. It is just a structure imposed from outside. 

In his recent book, Knife: Meditations After an Attempted Murder, author Salman Rushdie notes the meaninglessness of the concept of “closure.” After he was attacked by a would-be assassin in 2022, he came desperately close to death, but ultimately survived. He dismisses the idea that facing his attacker in court might bring some sort of closure. He went through medical procedures and therapy, and even the writing of the book. “These things did not give me ‘closure,’ whatever that was, if it was even possible to find such a thing.” The thought of confronting his attacker in court became less and less meaningful. 

Writers, in general, are put off by such lazy ideas as “closure.” Their job is to find words for actual experience, words that will convey something of the vivid actuality of events. Emily Bernard, author of Black Is the Body, was also the victim of a knife attack, and her book is a 218-page attempt to come to terms with her trauma: The book opens up a life in connection with the whole world. She never uses the word “closure.” 

Both Bernard and Rushdie try their utmost to describe their attacks with verbal precision and without common bromides. It is what all serious writers attempt, with greater or lesser success. It is easy to fall into patterns of thought, cultural assumption, cliches. It is much harder to express experience directly, unfiltered. 

The need to organize and structure experience is deeply embedded in human nature. And art, whether literary, musical, cinematic or visual, requires structure. It is why we have sonnets and sonata-form, why we have frames around pictures, why we have three-act plays. 

The fundamental structure of art is the exposition, the development, and the denouement. Stasis; destabilization; reestablishment of order. It is the rock on which literature and art are founded. When we read an autobiography, there is the same tripartite form: early life; the rise to success with its impediments and challenges; and finally, the look back at “what we have learned.” 

We read history books the same way, as if U.S. history ended with the signing of the Constitution, or with Appomattox, or the Civil Rights movement, or the election of Reagan. But history is a continuum, not a self-contained narrative. Books have to have a satisfying end, but life cannot. 

Most of us have suffered some trauma in our lives. It could be minor, or it could be life-changing. Most often it is the death of someone we love. It could be a medical issue, or a divorce. We are wrenched from the calm and dropped into turmoil. It can leave us shattered. 

And the story-making gene kicks in and we see this disruption as the core of a story. We were in steady state, then we are torn apart, and finally we “find closure.” Or not. Really no, never. That is only for the story. The telling, not the experience. 

In truth, the trauma is really one more blow, one more scar on the skin added to the older ones, one more knot on the string. We will all have suffered before, although the sharpness may have faded; we will all suffer again. 

Closure is a lie. All there really is is endurance. As Rushdie put it, “Time might not heal all wounds, but it deadened the pain.” We carry all our wounds with us, adding the new on top of the old and partly obscuring what is buried. 

There are myriad pop psychology tropes. They are like gnats flying around our heads. Each is a simplifying lie, a fabricated story attempting to gather into a comprehensible and digestible knot the infinite threads of a life. 

I have written many times before about the conflation of language and experience, and how we tend to believe that language is a one-to-one mirror of reality, when the truth is that language is a parallel universe. It has its own structure and rules — the three-act play — while those of non-verbal life are quite other. And we will argue — even go to war — over differences that only matter in language (what is your name for the deity?)

Most of philosophy is now really just a branch of philology — it is about words and symbols. But while thoughtful people complain about the insular direction that philosophy has taken, it has really always been thus. Plato is never about reality: it is about language. His ideal bed is merely about the definition of the word “bed.” As if existence were truly nouns and verbs — bits taken out of context and defined narrowly. Very like the question of whether something is a particle or a wave, when in truth, it is both. Only the observation (the definition) will harness it in one form or the other. It is all churn. πάντα χωρεῖ

A story attempts to make sense of the senseless. I’m not sure life would be possible without stories, from the earliest etiology of creation myth to the modern Big Bang. All those things that surpass understanding can only be comprehended in metaphorical form, i.e., the story. 

But stories also come in forms that are complex or simple, and are true or patently silly. My beef with “closure” is that it isn’t a story that reflects reality, but a lie. A complacent lie. 

Like most of popular psychology, it takes an idea that may have some germ of truth and husks away all the complex “but-ifs” and solidifies it into a commonly held bromide. It is psychobabble. 

That is a word invented by writer Richard Dean Rosen in 1975, which he defines as “a set of repetitive verbal formalities that kills off the very spontaneity, candor, and understanding it pretends to promote. It’s an idiom that reduces psychological insight to a collection of standardized observations that provides a frozen lexicon to deal with an infinite variety of problems.”

And afternoon TV shows, self-help books and videos, and newspaper advice columns are loaded with it. It is so ubiquitous that the general populace assumes it must be legitimate. We toss around words such as co-dependent, denial, dysfunctional, empowerment, holistic, synergy, mindfulness, as though they are actually more than buzz words and platitudes. Such words short-circuit more meaningful understanding. Or a recognition that there may be no understanding to be had. 

(In 1990, Canadian psychologist B.L. Beyerstein coined the word “neurobabble” as an extension of psychobabble, in which research in neuroscience enters the popular culture poorly understood, with such buzz words as neuroplasticity, Venus and Mars gender differences, the 10-percent-of-the-brain myth, and right- and left-brain oversimplifications.)

 As a writer (albeit with no great claim to importance), I know how often I struggle to find the right word, phrase or metaphor to reach a level of precision that I don’t find embarrassing, cheap, or an easy deflection. Trying to find the best expression for something distinct, complex and personal — to try to be honest — is work. 

This is true in all the arts: trying to find just the right brown by mixing pigments; or the right note in a song that is surprising enough to be interesting, but still makes sense in the harmony you are writing in; or giving a character in a play an action that rings true. We are so mired in habits of thought, of culture, that finding that exactitude is like flying through flak.

Recently, Turner Classic Movies ran a cheesy science-fiction film I had never seen before. I grew up on bad sci-fi movies from the 1950s and always enjoyed them, in the uncritical way a 9-year-old watches movies on television: Quality never entered the picture. At that age, I was oblivious that there even was such a thing. It wiggled on the screen; I watched. 

But this film was released in 1968, too late for me. When I had gone off to college, the only films I watched were snooty art films. And so I never got to see The Green Slime. Now, here it was, and it was prodigiously awful. Actor Robert Horton fights an alien invasion of tentacled, red-eyed monsters. 

Everything about The Green Slime was awful: the acting, the lighting, the set design, the special effects — and, of course, the science. Or lack of it. There was the garish color of sets and costumes and the over-use of the zoom lens, in the manner of made-for-TV movies of the era. I have outgrown my open-hearted love of bad science fiction. I stared in wonder at the horribleness I was seeing on the TV screen. 

And it was the acting, more than anything, that appalled me. Why were these actors so stiff, wooden, even laughable? And something I guess I had always known, but never really thought about, jumped to mind: Actors are at the mercy of writers. The dialog in The Green Slime was stupid, awkward and wooden.

There is some dialog so leaden, so unsayable, that even Olivier can’t bring it off. Robert Horton, while no Olivier, was perfectly fine on Wagon Train, but here he looked like he was lip-synching in a foreign tongue. 

“Wait a minute — are you telling me that this thing ‘reproduced’ itself inside the decontamination chamber? And, as we stepped up the current, it just … it just grew?”

I remember, years ago, thinking that Robert Armstrong was a stiff. I had only seen him in King Kong and thought he was a wooden plug of an actor (not as bad, perhaps, as Bruce Cabot, but still bad). But years later, I’d seen him in other films where he was just fine. Even in Son of Kong, he was decent. But no one, absolutely no one can pull off a line like “Some big, hardboiled egg gets a look at a pretty face and bang, he cracks up and goes sappy!”

Even in a famous movie, clunky dialog can make an otherwise fine actor look lame. Alec Guinness and James Earl Jones may be able to pull off the unspeakable dialog of the original Star Wars, but for years, I thought Mark Hamill was a cardboard cut-out. It was only when I saw him in other projects that I realized Hamill could actually act. I had a similarly low opinion of Harrison Ford because of what he was forced to mouth in those three early franchise films. George Lucas did them no favors. 

There is certainly a range of talent in movies and some genuinely untalented actors who got their parts by flashy looks or sleeping with the producer. But I have come to the opinion that most (certainly not all) actors in Hollywood films are genuinely talented. Perhaps limited, but talented, and given a good script and a helpful director, can do fine work. 

One thinks of Lon Chaney Jr., who is wooden, at best, as the Wolfman. But he is heartbreaking as Lenny in Of Mice and Men — perhaps the only chance he ever got to show off what he was actually capable of. 

“Lon Chaney was a stiff, but he had Lenny to redeem him,” said my brother Craig, when we discussed this question. Craig can be even more critical than me. 

He continued, “I’ve been trying to think of the worst actors ever — someone who has never said a word like a human being. There are a lot of people who got into a movie because they were famous for something else (Kareem Abdul-Jabbar, Joe Louis, Audie Murphy) so it’s hard to judge them fairly as actors, like you can’t criticize a pet dog for barking the national anthem but not hitting the high notes. But even Johnny Weissmuller was pretty effective in the first Tarzan; Elvis had Jailhouse Rock where he actually played a character; and Madonna can point to Desperately Seeking Susan without shame. (Everything else, shameful. There just isn’t enough shame in the world anymore.)

“There are any number of old cowboy stars who couldn’t speak a believable line of dialog and that can’t be totally blamed on the writing. (Gabby Hayes rose above it.) There are bad actors who still had some life and energy about them that made them fun to watch. Colin Clive was silly, and he made me giggle, so he was entertaining. And Robert Armstrong. But there’s just no excuse for Bruce Cabot.

“I’ve never actually seen a Steven Seagal movie,” Craig said, “but I know enough to say with conviction that he should have been drowned as a baby.”

I said Craig can be tougher than me, but here, I have to concur. 

“It’s probably not fair to pick out silent movie actors for being silly and over the top, but there is Douglas Fairbanks to prove you can be silent and great.”

Silent acting was a whole different thing, and hard to judge nowadays. As different from modern film acting as film acting is from acting live on stage. The styles don’t often translate. John Barrymore was the most acclaimed Shakespearean actor in America in the early years of the 20th century, but his style on celluloid came across as pure ham. (Yes, he was often parodying himself on purpose, but that doesn’t gainsay what I am saying about acting styles). 

Every once in a while, I see some poor slob I always thought was a horrible actor suddenly give an outstanding performance. Perhaps we have underestimated the importance of the casting director. A well-placed actor in a particular part can be surprising perfection. There is creativity in some casting offices that is itself an artform. You find the right look, voice, or body language, and a minor part becomes memorable. Some actors are wonderful in a limited range of roles. I can’t imagine Elisha Cook as a superhero, but he is perfect as a gunsel. 

And Weissmuller was the perfect Tarzan before his clumsy line reading became obvious in the Jungle Jim series. I am reminded of Dianne Wiest in Bullets Over Broadway: “No, no, don’t speak. Don’t speak. Please don’t speak. Please don’t speak.”

Keep in mind, actors are subject to so many things that aren’t in their control. In addition to good writing, they need a sympathetic director, decent lighting, thoughtful editing, even good costume design. Filmmaking is collaborative and it isn’t always the actor’s fault if he comes across like a Madame Tussaud waxwork. I’ve even seen Charlton Heston be good. 

In reality, I think of film actors much as I do major league ballplayers. The worst baseball player on a major league team may be batting under the Mendoza line, and even leading the league in fielding errors, but in comparison with any other ballplayers, from college level to minor leagues, he is superhumanly talented. Even Bob Uecker managed to hit a home run off Sandy Koufax. I doubt any of us could have done that. And so, we have to know who we’re comparing them to.

I saw a quote from Pia Zadora the other day (she just turned 70) and with justifiable humility, she said, “I am often compared to Meryl Streep, as in ‘Pia Zadora is no Meryl Streep.’” Still, compared to you or me, she is Bob Uecker. 

I have had to reassess my judgment of many actors. I had always thought of John Wayne as a movie star and not an actor. But I have to admit, part of my dislike of his acting was disgust over his despicable political beliefs. And I thought of him as the “cowboys and Indians” stereotype. 

But I have now looked at some of his films with a clearer eye, and realize that, yes, most of his films never asked anything more from him than to be John Wayne — essentially occupying a John Wayne puppet suit — but that when tasked by someone such as John Ford or Howard Hawks, he could actually inhabit a character. “Who knew the son-of-a-bitch could actually act!” Ford himself exclaimed. 

But there it is, in The Searchers, She Wore a Yellow Ribbon, Red River, The Quiet Man, Rio Bravo, The Shootist. Those were all true characterizations. (Does all that cancel out The Alamo or The Conqueror or The War Wagon, or balance all the undistinguished Westerns he made? We each have to judge for ourselves). 

Even in his early Monogram oaters, playing Singing Sandy, he brought a naturalness to his presence that is still exceptional in the genre (and researching Wayne, my gasts were flabbered at how good looking he was as a young man. So handsome he was almost pretty. And that hip-swinging gait that predates Marilyn Monroe. “It’s like Jell-O on springs.” It seems notable that something so feminine could become the model of such lumpen masculinity.)

And even great actors have turned in half-ass performances, or appeared in turkeys. In Jaws: The Revenge, Michael Caine has to utter things like, “Remind me to tell you about the time I took 100 nuns to Nairobi!” Caine famously said, “I have never seen the film, but by all accounts it was terrible. However I have seen the house that it built, and it is terrific.”

Even Olivier had his Clash of the Titans.

Actors have to work, and they don’t always get to choose. “An actor who is not working is not an actor,” said Christopher Walken. The more talented actors sometimes get to be picky, but the mid-range actor, talented though he or she may be, sometimes just needs the paycheck. 

I sometimes think of all the jobbing actors, the character actors of the 1930s, working from picture to picture, or the familiar names on TV series from the ’50s and ’60s — the Royal Danos, the John Andersons, the Denver Pyles, dressed as a grizzled prospector for a week on one show, going home at night for dinner and watching Jack Benny on the tube, and then driving back to the set the next morning and getting back into those duds. And then, next week dressing as a banker for another show, putting together a string of jobs to make a living. And all of them complete professionals, knowing what they are doing and giving the show what it needs. A huge amount of talent without ever having their names above the title. 

Royal Dano times four

And so, I feel pity for those actors of equal talent who never broke through, or who were stuck in badly written movies and couldn’t show off their chops. When I watch reruns of old episodic TV, I pay a good deal more attention than I ever did when I was young to all the parts that go into making such a show. I notice the lighting, the editing, the directing, and most of all, the writing. The writing, more than anything else, seems to separate the good series from the mediocre ones. And how grateful those working actors must feel when they get a script that gives them a chance to shine. 

Is just being alive enough? It is a question I have been facing, with continued difficulty, ever since I retired a dozen years ago after 25 years as a newspaper writer. 

For all those years, and for the many years before, I held jobs that contributed, in some way — often small, even negligible — to the business of society. I had a sense of being productive. This is not to make any major claim about how important my production was. It was admittedly quite minor. But it was a contribution. 

Doing so was a part of my sense of self, that being a productive member of society was not merely a way of occupying my time, but was actually a moral duty. If I were slacking off, I would be harming my society. And even worse, harming my immortal soul (something I don’t actually believe in).  

This is not something I thought much about on a conscious level. In fact, when I do think about it, I realize it’s quite silly. Society gets along quite well without my input. But it is buried deep down somewhere in my psyche that I must be productive. 

The opposite of being productive is being lazy. And I can’t help but feel that laziness is a moral failing. I have tried to excavate my brain to discover where this sense comes from and I cannot be sure. 

The easy answer comes up, “Protestant work ethic,” and it is true that I was raised in such an environment. But religion has never played an important part in my life. As I have said before, I have no religion; I’m not even an atheist. 

But somehow, I seem to have been injected with this guilt about not always doing something. Making something; teaching something; selling something; performing something. 

It is true that my grandparents, on both sides of the family, were quite religious. My father’s parents were even infected with a kind of Lutheran religious mania. They went to church three times a week, prayed constantly, and when they were young, before World War II, my father and his siblings were not allowed to listen to the radio, to music or to dance. In fact, this church-craziness led my father to promise never to inflict this kind of joyless religion on his children. 

And so, although we all went to church on Christmas and Easter, it was only to make my mother’s mother happy. She was religious in a more normal way, and was always kind and loving. But I and my two brothers managed to escape our childhoods without any religious sentiment at all. 

 Or so it seems. While I have no supernatural beliefs — the whole idea of a god or gods seems pointless — something of the culture seems to have leaked in. 

For all of my 25 years at the newspaper, I averaged about three stories per week. I always felt as if I were slacking off and that I should be writing more. My editors constantly told me I was the most productive member of the features staff. But it never felt that way. Even on vacations, I took daily notes before going to bed, and used those notes to write travel stories for the paper when I got back to the office. 

Before I retired, I used the computerized database to check on my output and discovered I had written something like 3 million words during my tenure. If an average novel is about 90,000 words, it means I wrote the equivalent of more than 30 novels in that time. My last project for the paper was a 40,000-word history of architecture in Phoenix. 

And so, when I left my job, it was like stepping off a moving bus, racing to a halt and trying to keep my balance. 

My colleagues at the paper bought me a blog site as a retirement gift, and I began writing for it instead of the newspaper. At first, I was writing an average of three blog posts per week, unchanged from my time at work. 

I have slowed down greatly since then, and am now aiming for about three posts a month. I don’t always make that many. But I have written more than 750 blog entries in the 12 years since I left the newspaper, which is still more than one a week. And I also write a monthly essay for the online journal of the Spirit of the Senses salon group of Phoenix. That’s an additional 103 essays, each averaging about 1,500 words. Blog and journal, it all adds up to about an additional million and a half words written since giving up employment. Old writers never really retire; they just stop getting paid. 

And none of this is paid work. I write because I cannot not write. When I am not blogging, I am writing e-mails. Old-fashioned e-mails that are more like actual letters than the quick one- or two-sentence blips that constitute most e-mails. Scribble, scribble, scribble, eh, Mr. Nilsen? 

But that all brings me back to my original concern: Is just being alive enough? Can I in good conscience spend an hour or two sitting in my back yard and listening to the dozens of birds chattering on, watching the clouds form and reform as they sail across the sky dome, enjoying the random swaying of the tallest tree branches in the intermittent wind? Thinking unconnected thoughts and once in a while noticing that I am breathing?

In 1662, Lutheran hymnist Franz Joachim Burmeister wrote a hymn titled Es ist genug (“It is enough”) that Johann Sebastian Bach later set in one of his more famous cantatas. It is notable for including a tritone in its melody. And, in 1935, Alban Berg incorporated it in his violin concerto, written “in memory of an angel” after the death of 18-year-old Manon Gropius, daughter of Alma Mahler. It is one of the most heartbreakingly beautiful musical compositions of the 20th century. Es ist genug

I remember reading that in India, the idealized life is understood to be a youth of play, an adulthood of work and an old age of seeking spiritual truths. That one is meant to lay down one’s tools and contemplate what it has all been about. And I take some comfort in the possibility that, at the age of 76, it is now my job no longer to produce, but to absorb all those things that were irrelevant to a normally productive life. To notice my own breathing; to feel the air on my skin; to recognize my tiny spot at the axis of my own infinitesimal consciousness in an expansive cosmos. To attempt to simply exist and to feel the existence as it passes. 

There is a class of movie that deserves special mention. The films aren’t necessarily the best, although they tend to be decent. They don’t usually show up on Top 10 lists or rosters of all-time greats. But the fact is that when they show up on TV, often late at night, we will watch them over and over. I don’t necessarily tune in on purpose, and don’t set the DVR to record, but if I tune in halfway through, I’ll see them out to the end. 

These are movies we know almost by heart. There is an amiability to them. Like a favorite tune we like to hum along with, I’ll recall the dialog or the set pieces. A good tune never wears out its welcome. 

I thought about this one night when I was clicking the clickerator and came upon Support Your Local Sheriff. It was bedtime and I was about to turn off the tube, but instead, I sat back down and saw the thing through. Not a notably good movie, but so pleasant that I watched yet again to see Walter Brennan do his Walter Brennan imitation. And there’s Bruce Dern and Jack Elam, and Gene Evans and Henry Jones and Harry Morgan and Walter Burke. All great character actors doing what they do best: carefully etched characters, albeit caricatures, but all memorable and distinct. 

This is not a claim that the movie is one of the great classics of cinema, but I can’t help but just enjoy the heck out of it whenever it’s on. Old friends I’d drop in on and visit. 

And it’s far from the only such film. There’s a whole class of them. Among notable “rewatchables” are My Cousin Vinny, Key Largo and The Blues Brothers. Such a list will be entirely personal, although there are probably movies that show up on a majority of lists, the consensus rewatchables. 

There are movies I will choose to watch again and again. They are favorites and I will seek them out. But this list isn’t about that; it’s about happening on one while channel surfing and finding an old shoe, comfy, familiar. I have the dialog memorized, and no matter if it’s just starting or soon ending, I will keep it on and watch, under various levels of engagement, until it ends. Not so much movies I choose to watch, but movies I happen upon and stay with. 

There are movies that, because of this habit, I have seen the end of many times, but seldom see the beginning. I have rarely seen the beginning of Airplane! or The Fifth Element, but I’ve seen their endings at least a dozen times. You catch these films mid-flight and ride until they land. 

(There is a subset of films where it is only the beginning that I watch over and over — if Turner Classic Movies is showing 2001: A Space Odyssey, I will watch the prehistoric beginning but then tune out. Not that I don’t think the rest of the movie is good, but because it is only the opening that has this over-and-over quality of a favorite song that scratches a certain cinematic itch.)

When I consider what makes a movie rewatchable by this standard, there are a few things that seem to be true. 

First, plot doesn’t matter much. Movies that I will stay to watch are composed of memorable set pieces rather than a story with a goal-oriented ending. It is the set pieces that I want to see, each scene a mini-story in itself. 

Second, they feature memorable dialog. Snappy chatter and witty responses. 

Third, they feature memorable characters, whether germane to the plot or not, and usually played by memorable character actors. 

Sometimes the attraction is none of the above, but just how bad the movie is. My brother says, “Growing up, I’d watch any movie with robots in it. Still will. I’ll visit most any ’50s movie with a monster or a rocket ship (or monster in a rocket ship). Stupid and cheesy and incompetent don’t matter.” 

And so, Plan 9 From Outer Space is a Class-A dip-in-at-any-time film (I hesitate to even use the word “film” in this context, as the word implies a certain level of craftsmanship famously missing in this “classic.”) But it has memorably dippy dialog (“We are all interested in the future, for that is where you and I are going to spend the rest of our lives. And remember, my friend: Future events such as these will affect you in the future.”) It has memorable characters, like Vampira or Tor Johnson. And it has character actors, such as Lyle Talbot and cowboy star Tom Keene, doing their best with the unspeakable script. 

At the opposite end of the quality spectrum is Citizen Kane, which is the acme of episodic great dialog with wonderful actors. Lots of scenes to remember in discrete chunks, any of which can be pulled out and dissected line by line and feel complete in themselves. 

I came up with a list of about 40 films that fill the bill and I know there are at least that many again I have forgotten to include. Among them are The Bride of Frankenstein (mostly for the scenes with Ernest Thesiger), Them!, Duck Soup, Dracula, Rio Bravo, and Beetlejuice. There is no average quality level; they run from The Seventh Seal to Harold and Kumar Go to White Castle. 

The most important quality of most of the films on my list (although not all of them) is that episodic structure. Francis Coppola’s Godfather is often described as “operatic,” and that is dead-on: Like opera, the rewatchable film is made up of recitatives, arias and choruses. And the same way you can make a concert program of favorite arias, you can do the same with favorite movie scenes. 

I will watch any black-and-white Fred Astaire film for the dance scenes. And any film with a Busby Berkeley extravaganza in it, although, once the plot creaks back into action, I’ll tune out. Each Berkeley choreography is an esthetic whole complete in itself.

The opening 20 minutes of Tarkovsky’s Solaris is intensely beautiful and I will set my DVR just for those minutes; I don’t often take on the whole, long film that trails behind. 

Bogart and Charles Waldron, upper left; with Sonia Darrin, upper right; with Dorothy Malone, lower left; with Lauren Bacall, lower right

The essential set-piece rewatchable film is Howard Hawks’ The Big Sleep. You cannot watch it for the plot. As a whodunnit, it is hopeless. But each scene is a carefully crafted gem, beginning with perhaps my favorite, Bogart’s interview with the old General Sternwood. (“If I seem a bit sinister as a parent, Mr. Marlowe, it’s because my hold on life is too slight to include any Victorian hypocrisy. I need hardly add that any man who has lived as I have and who indulges, for the first time, in parenthood, at my age, deserves all he gets.”) There is verbal fencing with Lauren Bacall and Martha Vickers (“You’re not very tall, are you?” “I try to be.”) Snappy parrying with Sonia Darrin (“You do sell books. Hmm?” “What do those look like, grapefruit?”) A racy scene with Dorothy Malone skirting the boundaries of the Code, and lines with the cab driver Joy Barlow, John Ridgely (Eddie Mars), Regis Toomey, Charles D. Brown (Norris) and Louis Heydt (Joe Brody), to say nothing about some really cruel lines given to Bob Steele as Canino. 

In the end, you don’t really care who did what to whom, but you are grabbed by the gloss and flash of the individual scenes. Which makes Big Sleep the champ of rewatchable movies. 

Pulp Fiction is another film built from scene-blocks, in this case all shuffled around. Is there anything more memorable — or more extraneous to the plot — than Christopher Walken explaining the provenance of a watch? It seems that the best parts of the film are all those that are completely unnecessary for the story. “You know what they call the Big Mac in France?” 

A film like North by Northwest might seem to be about a through-driven story, but really, it is also just a series of memorable scenes strung together. The scenes — the cropduster attack; the auction scene; the Mount Rushmore scene; and the final dirty joke — are all just pearls on a string.

Many of the series movies from Hollywood in the ’30s and ’40s are endlessly rewatchable, in part because what plots they have are practically interchangeable. “I’ll watch any Charlie Chan,” says my brother and TCM devotee, Craig. “I’ll watch Mr. Moto, but they are a rung below Charlie Chan, and the Falcon movies are a rung below that, and Boston Blackie, another rung down, but, hell, I’ll still watch them.” 

You just want to soak up the cinematic ambience of their docksides and back alleys. The fog, the boat horns, the apartment staircases, the eavesdropping at closed doors. 

“Mostly, my list taunts me, saying ‘You are a man of Low Tastes,’ and I guess it’s true,” Craig says. “And my list seems to be almost all American, and old. But these are just the movies that occur to me off the top of my old and balding head. There are a ton of movies that could be on my list, if I could remember them.”

The first movie I began watching endlessly was King Kong, which was shown over and over on New York’s Channel 9 (WOR-TV) and which I first saw when I was in first or second grade. In the seven decades that have followed, I must have seen it close to a hundred times — maybe more. I will still watch it whenever I catch it being played. And that despite the creaky borrowed plot (mainly from the silent Lost World), the stilted dialog, and the acting, where Bruce Cabot shows off all the acting prowess of a loblolly pine. 

It was Kong that showed to me the possibility that a movie was worth watching multiple times. There are those who don’t partake, for whom the main interest in the film is the plot and having once seen it, “I know how it ends, so why would I watch it again?” And, indeed, there are many movies for which that is the main draw: The story line pulls you along and having once satisfied your need to know “what happens next,” you have emptied the film of its meaningful content. 

But, for me, the movies I’m talking about are more like music. You can listen to Beethoven’s Fifth many times, drawing something fresh from it with each hearing. Or listen to the Beatles’ Hey Jude over and over, and each time, it tickles just the spot that needs the stimulus. Bingo. Dead on. 

Who ever heard of someone who didn’t want to hear their favorite song again because “I’ve already heard it”? (I remember that bastion of intellectual curiosity Ronald Reagan saying, “You’ve seen one redwood tree, you’ve seen them all.”) 

I will never get enough of any of the Thin Man movies, even Song of. Nor will I turn down The Thing with James Arness, nor M with Peter Lorre, nor Touch of Evil, nor Time Bandits. 

You see, this is an eclectic list of movies, not based on quality alone, nor on subject matter or genre, but entirely on that subjective and personal sense of rewatchability. 

What is on your list? 

Click on any image to enlarge

I’ve been to the Louvre in Paris a number of times, but no matter how long I spend there, I never feel as if I’ve seen more than two percent of it. It is vast. It is the largest museum in the world, with 782,910 square feet of floor space (topping the No. 2 museum, St. Petersburg’s Hermitage, by more than 60,000 square feet) and a collection of more than 600,000 pieces. 

It’s where you go to find the Mona Lisa, the Winged Victory of Samothrace and the Venus de Milo.

It’s one of the oldest museums around, but never seems quite finished. It began as a royal palace in the 12th century, and has been added on to, parts burned down, parts replaced, and even a glass pyramid added in its central courtyard. 

When Louis XIV moved the court from the Louvre to Versailles in 1682, the building became a warehouse for kingly treasures and much of his art collection. In 1699, the first “open house,” or salon, was held, and for a century, the royal academy of art was located there. 

The French Revolution ended the monarchy, and all the art once owned by the king became public property, and in 1793, the new government decreed that the Louvre should be open to the citizens as a free art museum. 

But soon after, the collection expanded exponentially, as Napoleon Bonaparte conquered half of the continent, and sent back to Paris a good deal of the art from conquered lands. He even had the museum renamed Musée Napoléon. That didn’t last, but neither did Napoleon. 

Over the 19th century, the museum collection grew, from bequests, purchases and colonial expropriations. For a while, it included a whole section of Pre-Columbian art from the New World, but that spun out into its own museum, leaving the Louvre for the Musée d’Ethnographie du Trocadéro in 1887; in 1945, the Louvre’s extensive collections of Asian art were moved to the Guimet Museum; and by 1986, all the museum’s art made after 1848, including Impressionist and Modernist work, was transferred to the Musée d’Orsay, a refurbished railway station. It seemed the Louvre kept bursting its seams. 

Then came François Mitterrand. Serving as French president from 1981 to 1995, Mitterrand conjured up the Grands Projets to transform the cultural profile of Paris, with additional monuments, buildings, museums, and refurbishment of existing locations. Taxes were raised to accomplish the project, said to be on a scale that only Louis XIV had attempted. 

Part of this plan was the Grand Louvre, to remodel and expand the museum, and to regularize (as much as possible) the maze and warren of galleries in the old accretion of palace rooms. The most visible of the changes was the addition of the glass pyramid in the center courtyard of the palace. It was designed by architect I.M. Pei and although it has long become part of the landscape of the museum, it still angers many of the country’s more conservative grouches. In 2017, The American Institute of Architects noted that the pyramid “now rivals the Eiffel Tower as one of France’s most recognizable architectural icons.” 

The entire central underground of the courtyard was remodeled to create a new entrance, and to attempt to make sense of the confusion of corridors, rooms, staircases and doorways. It was completed in 1989. 

Now, one cannot think of the Louvre without its pyramid, but speaking as a visitor, while the Hall Napoléon (the underground foyer) has made some sense of the confusion, I cannot honestly claim the chaos has been tamed. The museum remains a labyrinth and you can easily get lost. 

And, unless you have budgeted a month or more to spelunk the entire museum, you will need to prioritize what you want to see in a visit — or two, or three. 

Quick word: Forget the Mona Lisa. It’s a tiny painting of little artistic note, buried under a Times Square-size crowd of tourists all wanting to see the “most famous painting in the world.” It is what good PR will get you. It may be a historically noteworthy piece as one of the very few paintings Leonardo completed, but there is much better to be seen in the museum. Don’t exhaust yourself in the mêlée. 

Seek out the unusual, like Jan Provost’s Sacred Allegory, from about 1490, which I like to call “God’s Bowling Ball;” or The Ascension, by Hans Memling, from the same time, which shows Christ rising into heaven, but shows only his feet dangling from the clouds. There’s some quirky stuff on the walls of the Louvre. 

One of the goals of the museum is to collect, preserve, and display the cultural history of the Western world. This is our art, the stuff we have made for more than 3,000 years, from Ancient Sumer and Egypt, through classical Greece and Rome, whizzing past the Middle Ages and brightening with the Renaissance and the centuries that followed. You get the whole panoply and see what tropes have persisted, the ideas that have evolved, the stuff of our psychic landscape. 

(See how the fallen soldier in Jacques-Louis David’s Intervention of the Sabine Women echoes in Picasso’s Guernica. One way of looking at all cultural history is as an extended conversation between the present and the past. The reverberations are loud and clear.)

You can look at the paintings on the wall and see them for the beauty of their colors and brushwork, or the familiar (or not-so-familiar) stories they depict; or you can see them as the physical embodiment of the collective unconscious. 

I have always been a museum-goer. From my earliest times as a boy going to the American Museum of Natural History in New York, through my days as an art critic, rambling through the art museums of the U.S. and abroad. There is little I get more pleasure from. 

One soaks up the visual patterns, makes connections, recognizes the habits of humankind. Recognizes the shared humanity. The differences between me and Gilgamesh are merely surface tics. When I see the hand of the Roman emperor, it is my hand. I feel kinship with all those whose works and images appear in the galleries. 

And so, if it is two percent of the Louvre I have managed to absorb, I know the rest is there, and that it is me, also. 

Click on any image to enlarge

It was 74 degrees today in the Blue Ridge Mountains and spring is edging its way in. There is still some cold weather coming — Monday night is predicted to drop to the mid-20s. But the signs of shifting seasons are all over. 

The daffodils have popped, the Bradford Pears are white lace, and the empty winter tree branches are feathering out with buds. 

According to the calendar, the new year begins mid-winter, but in practical terms it is the reawakening of nature that lets us know that we can all start over again. The year circles around to the beginning and we can put our overcoats back into the closet. 

It is a comforting thought, but the fact is, the recurrence of spring sits in equipoise with the hurtling forward of age. The trees come alive again, but I only get older. 

I have seen 76 springs, and when I was a boy, each season lasted years. Summer vacation seemed endless and the next school year might as well begin in a science-fiction future, eons away. As a grown-up, I barely noticed the year passing. Winter just meant sloppy roads; summer just meant sweat and iced tea. I went to my job every day, no matter. 

But I am old now, and the season change has yet another meaning. 

One of the impenetrable facts of being 76 years old, even in decently good health, is that I have a limited number of springs ahead of me. 

I have to face the possibility that this one now could be the last; there is no counting on next year or the year after that. 

It’s not that I am anxious over the likelihood of my existence being cut short; after all, over nearly eight decades, from 1948 to now, it has been a long and, I hope, fruitful life. 

But the uncertainty of future springs makes this one more necessary. I am paying attention more than ever before, and although I have always enjoyed the spring, it feels closer to the bone this year. 

I don’t want to miss a moment of it. 

Click on any image to enlarge

We were camping at Huntington Beach in South Carolina, and I woke up before dawn and walked down to the ocean. The sky was beginning to brighten to the east and I watched for the coming sunrise. 

When the sun broke the horizon, its motion was noticeable and I watched it slowly lift from the water. But then, something happened: The sun stopped dead in its tracks and my frame of reference shifted involuntarily. Instead of the sun moving up, the earth I was standing on jerked forward, as if I were coming over the top of a Ferris wheel, and I nearly lost my balance. It seemed the ground was moving away from under my feet, toward the immobile sun. 

At the same time, seawaves reflected the bright copper sheen and the shadowed portions of the water formed a network of glossy black, making the entire landscape before me into a shimmering enameled lattice. And more: it seemed not so much to reflect the sun as to be glowing from within. 

The magic lasted only a few moments and the earth stood still again and the sun began climbing once more. I felt that I had been given a chance to see how things really were — a stationary sun and a rotating earth — and the whole, with its copper and black waves, was unutterably beautiful.

Such visions are epiphanies. 

Of course, “Epiphany” means different things. In the Roman church, it is the name for the visitation of the wise men; in the Orthodox churches, it marks the baptism of Jesus and the descent of the dove; according to some early Church fathers, it marks the miracle at the wedding feast at Cana; and for Syriac Christians, it celebrates the rising light of dawn, as expressed in Luke 1:78. In all these versions, it refers to the recognition of divinity as it shines forth. 

But I take the word for its otherwise secular meanings. It is a sudden recognition of reality, or the momentary transformation of the ordinary into something strange, or the psychological state of overlaying the personal in registration with the objective world, the way you might orient a map to match the landscape in front of you. Then the two meld into a single thing. In any version, you experience a moment out of time. 

It is the word James Joyce used when referring to such experiences, usually something quite ordinary, but seen in a new, illuminating way. A theophany with no theos. 

In his early novel, Stephen Hero, he defined these epiphanies as “a sudden spiritual manifestation, whether in the vulgarity of speech or of gesture or in a memorable phase of the mind itself.” And he “believed that it was for the man of letters to record these epiphanies with extreme care, seeing that they themselves are the most delicate and evanescent of moments.” 

In an early manuscript, Joyce collected 22 pages of these moments he found in his own life, and used many of them later in his finished works. At one point, there were at least 71 epiphanies written down in Joyce’s own handwriting. 

Later in his career, the term grew in meaning and significance, coming to mean moments of behavior observed or experienced that seem to metaphorically summarize some insight or contain “meaning” in some way or other. 

In ordinary usage, “meaning” is a term of translation: “This means that.” But it has another purpose: significance. An experience doesn’t have to “mean” something that we can express in paraphrase, or take as a lesson we have learned, but can have meaning unexplainable except in terms of itself, as when a dream feels meaningful even if you don’t know why. 

I believe we all have such moments. They tend to stick with us. I know I have had them throughout my life. The first one I can remember was at the age of four or five and driving with my family along the Palisades at night, looking across the Hudson River at the constellation of lights in the darkened Manhattan buildings. It was my first remembered experience of something I would call beauty. I couldn’t wait for the next time we visited my grandparents so I could see those lights again. 

Often we function as actors in a stage set, with the world as backdrop. Our focus is on the particular action or conversation, with the set merely happenstance; it could easily be some other set. But the epiphany is when you step back and see actor, set, words, as a single unit, all of a piece. We can live our lives barely noticing the world we walk through, except as it helps or hinders us — it is functional. But that moment comes when the boundary between us and the rest of it all evaporates and we sense ourselves as part of a whole. That instant is the epiphany and for it, time stops, even as the clock keeps moving. It is an uncanny feeling.

It feels as if you are taken out of the real world for a moment, but actually, you are dropped into it. The illusion of separateness is dispelled and you become face to face with something bigger. 

When I was about 10, my younger brother, Craig, and I thought to follow the brook that ran through our property in New Jersey, through the woods behind the house, to see where it went. It ended as it fed into the Hackensack River. We then followed the river to the Oradell Reservoir and followed the railroad tracks. We were crossing a little bridge when a train arrived and we ducked under the bridge, sitting on the concrete abutment not more than a couple of feet from the screaming wheels of the train as it passed over. Time may have stopped, but the train didn’t. It was thrilling. It was untameably real. 

In high school, I spent one summer vacation in Europe, crossing the Atlantic on a steamship. After days of faceless, unchanging ocean horizon, one night on deck I looked out and saw pinpricks of light in the darkness, maybe eight or ten miles away. It was the Orkney Islands and I was dumbstruck at their remoteness. They were ghostly lights strung out along the horizon in a seemingly infinite blackness. They seemed unmoored from this now. 

There is often a sense of the uncanny, of something we don’t quite see but feel is there. 

One night, I was driving up the Big Sur coast, between San Luis Obispo and Monterey. With the sun finally below the horizon, it was completely black, but with grades of black showing in front of me. The blackest black was rock, rising in cliffs to the right side of the road. The glossiest black was the ocean on the left below. As I whipped along the road with my high beams gleaming back at me from the reflectors on the road stripe, I could occasionally see a flash of light in the corner of my eye. When I looked, there was nothing, but when I turned back to the road, it flashed again. There seemed to be something riding beside my car. I called it the “God of the Nighttime Highway.” 

It turned out to be my own running lights reflecting off the guard rail at the edge of the road. But for 10 minutes or so, until I figured it out, the experience was eerie and I almost believed in a spirit world that I don’t believe in. 

I imagine it must be episodes like this that gave rise to the myths and folklore of the ancients. The experience feels so real, it must be real. 

And these epiphanies are not especially rare. I’ve had many in my life. My wife and I had left Yellowstone National Park early on a gray, rainy day, driving eastward on the North Fork Highway through Wyoming’s Shoshone Canyon. At the canyon’s mouth the land broadened out and dipped down into vast plains with the Buffalo Bill Reservoir in the distance. We had just turned on the radio and the skies suddenly parted and the scene before us was drenched in sunlight just as the radio began pouring out the early morning sign-on music of “America the Beautiful,” and the Mormon Tabernacle Choir sang, “Oh beautiful for spacious skies and amber waves of grain…” And there it was, before us, just as in the song, and we had to laugh, but also we had to recognize the emotional power of what we were seeing. 

Once, camping at the Outer Banks with my friend Sandro, we walked along the beach at Hatteras Point at night, carrying a Coleman lantern. The air was so humid that it was on the edge of becoming fog. And the light we carried threw our shadows up into the sky, among the stars, and we could see we were giants. 

Or, visiting Verdun in France, my wife and I drove through the old World War I battlefields that had been blasted into moonscape by artillery fire but had since grown back into woodlands. Yet between the tree trunks, the shell craters were still there, pock-marking the ground nearly a hundred years later. 

I’ve had that strange recognition many times when visiting old battle sites — as if the past is always present. I’ve had it at Antietam, at the Little Bighorn, at Shiloh, at the Normandy beaches, at Wounded Knee, at Appomattox. The epiphany that breaks through isn’t just history as you read it in books, but rather the persistence of events: that what once happened is still happening; wave ripples running out through time. 

I once spent the night alone on the North Rim of the Grand Canyon in Arizona. My campsite was a good 30 miles from any other human being, the sky was darker than any I have seen before or since, and the stars were spilled like beach sand across the expanse, with the Milky Way splitting the dome in two. About 3 in the morning, I woke, left the tent, and sat on the hood of my car, staring up at the infinity. After maybe 20 minutes or a half hour, a kind of hypnosis took over, and I no longer felt as if I were on my back staring up, but as if I were at the forward point of a planet racing through infinite space toward those stars. The planet was at my back, and I could almost feel the wind on my face as this planetary vehicle rushed forward toward the lights. 

And, of course, this is exactly what was happening. The ordinary sense of terra firma under a wide sky is the illusion. The recognition of a giant ball of earth and water racing through an infinite void is the reality. Sometimes we see it that way. 

And that is the epiphany.