The venerable writer John McPhee wrote a short, episodic memoir for the May 20, 2024, edition of The New Yorker, and in it he discussed proofreading. The piece hit home with me.

I began my career at The Arizona Republic as a copy editor, which is not exactly the same thing as a proofreader, but many of the duties overlap, and many of the headaches are the same. 

A proofreader, by and large, works for a book publisher and double-checks the galley proofs of a work for typos and grammatical errors. The work has already been typeset and a version has been printed, which is what the proofreader goes over.

A copy editor works for a magazine or newspaper and is usually one of a team of such editors, who read stories before they are typeset and check not only spelling and grammar, but factual material and legal issues, to say nothing of that great bugbear, the math. English majors are not generally the greatest when dealing with statistics, percentages, fractions — or for that matter, addition or subtraction. 

Arizona Republic staff, ca. 1990

In a newspaper the size of The Republic, a reporter turns in a story (usually assigned by the section editor) and that editor then reads it through to make sure all the necessary parts are included, and that the presentation flows in a sensible manner. Section editors are very busy people, dealing with personnel issues (reporters can be quite prissy); planning issues (what will we write about on July 4 this year?); remembering what has been covered in the past, so we don’t duplicate what has been done; dealing with upper management, most of whom have never actually worked as reporters and who have “ideas” that are often goofy and unworkable; and, god help them, attending meetings. They cannot waste their time over tiny details. Bigger fish to fry.

Once the section editor has OKed a piece it goes on to the copy editors, those troglodyte minions hunched over their desks, who then nitpick the story, not only for spelling and style — the Associated Press stylebook can be quite idiosyncratic and counterintuitive — but also for missing bits and misused vocabulary, double-checking names and addresses along the way. A copy editor is a rare beast, expected to know not only how to spell “accommodate,” but also who succeeded Charles V in the Holy Roman Empire (Ferdinand I, by the way).

The copy editor then hands the story over to the Slot. (I love the more arcane features of any specialized vocation). The Slot is the boss of the copy desk. In the old days, before computers, copy editors traditionally sat around the edge of a circular or oblong desk with a “slot” in the center where the head copy editor sat, collecting the stories from the ring of hobbits surrounding him. He gave the stories a final read-through, catching anything the previous readers may have missed. Later the story would be given a headline by a copy editor and that headline given a final OK by the Slot. Only then would the story be typeset. 

That means a typical newspaper story is read at least four times before it is printed. Nevertheless, there will always be mistakes. Consider The New York Times. Every typo that gets through generates angry letters to the editor demanding “Don’t you people have proofreaders?” Well, we have copy editors. And why don’t you try to publish a newspaper every day with more words in it than the Bible and see how “perfect” you are? Typos happen. “The best laid schemes o’ Mice an’ Men Gang aft agley.”

When I was first hired as a troglodyte minion, I had no experience in journalism (or very little, having spent time in the trenches of a weekly Black newspaper in Greensboro, N.C., which was a very different experience from a big-city daily) and didn’t fully understand what my job entailed. I thought I was supposed to make a reporter’s writing better, and so I habitually re-wrote stories, often shifting paragraphs around wholesale, altering words and word order, and cutting superfluous verbiage. That I wasn’t caught earlier and corrected tells me I must have been making the stories better. 

There was one particular movie critic who had some serious difficulty with her mother tongue and wrote long, run-on sentences, some of which were missing their verbs, or were full of unsupported claims easily debunked. (I hear an echo of her style in the speeches of Donald Trump). I regularly rewrote her movie reviews from top to bottom, attempting to make English out of them.

One day, a bit fed up, I e-messaged the section editor that the critic’s review was gibberish, and I included the phrase, “typewriters of the gods.” Unfortunately, the reviewer was standing over the section editor’s desk, saw my sarcastic description, and became outraged. I had to apologize to the movie critic and stop rewriting her work.

Lucky for me, the fact that I could make stories better brought me to the attention of the section chiefs and I was promoted off the copy desk and into a position as a writer — specifically, I became the art critic (and travel writer, and dance critic and architecture critic, and classical music critic, and anything else I thought of). I’m sure the other copy editors and the Slot were delighted to see the back of me.

That is, until they had to tackle copy editing my stories. I had a few idiosyncrasies of my own. 

Here I must make a distinction between a reporter and a writer. I was never a reporter, and was never very good at that part of the job. Reporters are interested primarily in collecting information and fact. Some of them can write a coherent sentence, but that is definitely subordinate to their ability to ferret out essential facts and relate them to other facts. A reporter who is also a good writer is a wonder to behold. (In the famous team of Carl Bernstein and Bob Woodward, the latter was a great reporter and mediocre wordsmith — as his later books demonstrate. Bernstein was a stylish writer. Together they functioned as a whole). 

I was, however, a writer, which meant that my primary talent and purpose was to put words into an order that was pleasant to read. I love words. From the second grade on, I collected a vocabulary at least twice as large as the average literate reader’s, and what is more, I loved to employ that vocabulary. Words, words, words.

And so, when my stories passed through the section editor and got to the copy desk, the minions were oft perplexed by what I had written. Not that their vocabularies were any smaller than mine, but such words were hardly ever printed in a newspaper. I once used “paradiddle” in a review and the signal went up from the copy desk to the section editor, who came to me. We hashed it out. I proved to her that the word was, indeed, in the dictionary, and the word descended back down the food chain to the copy desk and was left alone.

But this led to a bit of a prank on my part. For a period of about six months (I don’t remember exactly; this was back in the Pleistocene), I included at least one made-up word in every story I wrote. It was a little game we played. These words were always understandable in context, and were often something onomatopoetic and meant to be mildly comic (“He went kerflurfing around,” or “She tried to swallow what he said, but ended up gaggifying on the obvious lies”). For those six months, a compliant copy desk let me get away with every one of them. Every. Single. One. Copy editors, despite their terrifying reputation, can be flexible. Or at least they threw up their hands and got on to more important matters.

I will be forever grateful to my editors, who basically let me get away with murder, and to the copy desk at The Arizona Republic, for allowing me to write the way I wanted (and pretty much the only way I knew how). Editors, of both stripes, will always be my heroes.

John McPhee

Back to John McPhee. He describes the difficulty of spotting typos. Of course most are easily caught. But often the eye scans over familiar phrases so quickly that mistakes become invisible. In a recent blog post, I wrote about Salman Rushdie’s newest book, Knife, and I had its subtitle as “Meditations After and Attempted Murder.” I reread my blog entries at least three times before posting them, in order to catch those little buggers that attempt to sneak through. But I missed the “and,” apparently because, as part of a phrase we use many times a day, the eye reads the shape of the phrase rather than the individual words and letters.

There is a common saying amongst writers: “Everyone needs a copy editor,” and when I retired from The Republic, I lost the use, aid, and salvation of a copy desk. I had to rely on myself, reading and re-reading my copy. But typos still get through. And on the day after I post something new, I will sometimes get an e-mail from my sister-in-law pointing out a goof. She let me know about my Rushdie “and,” and I went back into the text and corrected it (something not possible after a newspaper is printed and delivered). She has caught my mistakes many, many times, and has become my de facto copy editor.

But my training as both writer and copy editor has stood me in good stead. Unlike so many other blog posters, I double-check all name spellings and addresses, my math and my facts. I am quite punctilious about grammar and usage. And even though it is no longer required, I am so used to having AP style drilled into me that I fall in line like an obedient recruit.

In his story, McPhee details trouble he has had with book covers that sometimes misrepresented his content. And that hit me right in the whammy. One of the worst experiences I ever had with the management class came when I went to South Africa in the 1980s. Apartheid there was beginning to falter, but was still the law. I noticed that racial laws were taken very seriously in the Afrikaner portions of the country, but quite relaxed in the English-speaking sections. 

And I wrote a long cover piece for the Sunday editorial section of the paper about the character of the Afrikaner and the racial tensions I found in Pretoria, Johannesburg and Cape Town. The Afrikaner tended to be bull-headed, bigoted and unreflective, and I wrote my piece about that (and about the fascist uniformed storm troopers I witnessed threatening the customers at a bar in Nelspruit). The northern and eastern half of South Africa and its southern and western half were like two different countries.

As I was leaving the office on Friday evening, I saw the layout for the section cover and my story, and the editor had found the perfect illustration for my story — a large belly-proud Afrikaner farmer standing behind his plow, wiping the sweat from his brow and looking self-satisfied and as unmovable as a Green Bay defensive tackle. No one was going to tell him what to do. “Great,” I thought. “Perfect image.”

But when I got my paper on Sunday, the photo wasn’t there; it had been replaced by one of a large Black woman waiting dejectedly at a bus station with her baggage and duffle. My heart sank.

When I got back to the office on Monday, I asked. “Howard,” was the reply. Earlier in this post, I mentioned management, with which the writer class is in never-ending enmity. Howard Finberg had been brought to the paper to oversee a redesign of The Republic’s look — its typeface choices, its column width, its use of photos and its logos — and somehow managed to weasel his way permanently into the power structure. He was one of those alpha males who throw their weight around even when they don’t know or understand diddly. I will never forgive him.

He had seen the layout of my story and decided that the big Afrikaner, as white as any redneck, simply “didn’t say Africa.” And so he found the old Black woman that he thought would convey the sense of the continent. Never mind that my story was specifically about white South Africa. Never mind that he hadn’t taken the time to actually read the story. That kind of superficial marketing mentality always drives me nuts, and here it ruined a perfectly good page. Did I say I will never forgive him?

It reminds me of one more thing about management. In the early 2000s, when The Republic had been taken over by the Gannett newspaper chain, management posted all over our office, on all floors, a “mission statement.” It was written in pure management-ese (which I call “manglish”) and was so diffuse and meaningless, so full of “synergies” and “goals” and “leverage,” that I said, “If I wrote like that, I’d be out of a job.”

How can those in charge of journalism be so out of touch with the language that is a newspaper’s bread and butter?

These people live in a very different world — a different planet — from you and me. I imagine them, courtesy of Douglas Adams, packed off on the space ship with the phone sanitizers, management consultants, and marketing executives, bound for a tiny forgotten corner of the universe where they can do less harm.

One final indignity they have perpetrated: They have eliminated copy editors as an unnecessary cost. When I retired from the newspaper, reporters were asked to show their stories to another writer and have them checked. A profession is dying and the lights are winking out all over Journalandia.


Of all the pop psychology detritus that litters our culture, nothing bothers me more than the fatuous idea of “closure.” People talk about it as if it were not only a real thing, but an obvious one. But “closure” is a purely literary concept, ill suited to describe the actual events of our lives.

By “literary,” I mean that it fulfills the esthetic necessity we humans feel to round out a story. A story must have a beginning, a middle, and an end (“but not necessarily in that order,” said French filmmaker Jean-Luc Godard). For each of us, our “life story” is a kind of proto-fiction we create from the remembered episodes of our lives. We are, of course, the hero of our own life story, and the supporting characters all fit into a tidy plot. 

But, of course, actual life is not like that. Rather it is a bee-swarm of interconnecting and interacting prismatic moments seen from the billion points of view of a riotously populated planet. There is no story, only buzzing activity. Eight billion points of view — and that is only counting the human ones. One assumes animals and even plants have their own points of view and no narrative can begin to encompass it all. It is all simply churn. 

Of course, there are anecdotes, which are meant to be stories and end, usually, with a punchline. Like a joke, they are self-contained. But our lives are not anecdotes, and tragedies, traumas, and losses are not self-contained. There is no punchline.

So, there is a smugness in the very idea that we can write “fin” at the completion of a story arc and pretend it means something real. It is just a structure imposed from outside. 

In his recent book, Knife: Meditations After an Attempted Murder, author Salman Rushdie notes the meaninglessness of the concept of “closure.” After he was attacked by a would-be assassin in 2022, he came desperately close to death, but ultimately survived. The idea that facing his attacker in court might bring some sort of closure is dismissed. He went through medical procedures and therapy, and even the writing of the book. “These things did not give me ‘closure,’ whatever that was, if it was even possible to find such a thing.” The thought of confronting his attacker in court became less and less meaningful.

Writers, in general, are put off by such lazy ideas as “closure.” Their job is to find words for actual experience, words that will convey something of the vivid actuality of events. Emily Bernard, author of Black is the Body, was also the victim of a knife attack, and her book is a 218-page attempt to come to terms with her trauma; it opens up a life in connection with the whole world. She never uses the word “closure.”

Both Bernard and Rushdie try their utmost to describe their attacks with verbal precision and without common bromides. It is what all serious writers attempt, with greater or lesser success. It is easy to fall into patterns of thought, cultural assumptions, cliches. It is much harder to express experience directly, unfiltered.

The need to organize and structure experience is deeply embedded in human nature. And art, whether literary, musical, cinematic or visual, requires structure. It is why we have sonnets and sonata form, why we have frames around pictures, why we have three-act plays.

The fundamental structure of art is the exposition, the development, and the denouement. Stasis; destabilization; reestablishment of order. It is the rock on which literature and art are founded. When we read an autobiography, there is the same tripartite form: early life; the rise to success with its impediments and challenges; and finally, the look back at “what we have learned.”

We read history books the same way, as if U.S. history ended with the signing of the Constitution, or with Appomattox, or the Civil Rights movement, or the election of Reagan. But history is a continuum, not a self-contained narrative. Books have to have a satisfying end, but life cannot. 

Most of us have suffered some trauma in our lives. It could be minor, or it could be life-changing. Most often it is the death of someone we love. It could be a medical issue, or a divorce. We are wrenched from the calm and dropped into turmoil. It can leave us shattered.

And the story-making gene kicks in and we see this disruption as the core of a story. We were in steady state, then we are torn apart, and finally we “find closure.” Or not. Really no, never. That is only for the story. The telling, not the experience. 

In truth, the trauma is really one more blow, one more scar on the skin added to the older ones, one more knot on the string. We will all have suffered before, although the sharpness may have faded; we will all suffer again. 

Closure is a lie. All there really is is endurance. As Rushdie put it, “Time might not heal all wounds, but it deadened the pain.” We carry all our wounds with us, adding the new on top of the old and partly obscuring what is buried. 

There are myriad pop psychology tropes. They are like gnats flying around our heads. Each is a simplifying lie, a fabricated story attempting to gather into a comprehensible and digestible knot the infinite threads of a life. 

I have written many times before about the conflation of language and experience, and how we tend to believe that language is a one-to-one mirror of reality, when the truth is that language is a parallel universe. It has its own structure and rules — the three-act play — while those of non-verbal life are quite other. And we will argue — even go to war — over differences that only matter in language (what is your name for the deity?).

Most of philosophy is now really just a branch of philology — it is about words and symbols. But while thoughtful people complain about the insular direction that philosophy has taken, it has really always been thus. Plato is never about reality; he is about language. His ideal bed is merely the definition of the word “bed.” As if existence were truly nouns and verbs — bits taken out of context and defined narrowly. Very like the question of whether something is a particle or a wave, when in truth, it is both. Only the observation (the definition) will harness it in one form or the other. It is all churn. πάντα χωρεῖ: everything moves.

A story attempts to make sense of the senseless. I’m not sure life would be possible without stories, from the earliest etiology of creation myth to the modern Big Bang. All those things that surpass understanding can only be comprehended in metaphorical form, i.e., the story. 

But stories also come in forms that are complex or simple, and are true or patently silly. My beef with “closure” is that it isn’t a story that reflects reality, but a lie. A complacent lie. 

Like most of popular psychology, it takes an idea that may have some germ of truth and husks away all the complex “but-ifs” and solidifies it into a commonly held bromide. It is psychobabble. 

The word was invented by writer Richard Dean Rosen in 1975, and he defines it as “a set of repetitive verbal formalities that kills off the very spontaneity, candor, and understanding it pretends to promote. It’s an idiom that reduces psychological insight to a collection of standardized observations that provides a frozen lexicon to deal with an infinite variety of problems.”

And afternoon TV shows, self-help books and videos, and newspaper advice columns are loaded with it. It is so ubiquitous that the general populace assumes it must be legitimate. We toss around words such as co-dependent, denial, dysfunctional, empowerment, holistic, synergy, mindfulness, as though they were actually more than buzz words and platitudes. Such words short-circuit more meaningful understanding. Or a recognition that there may be no understanding to be had.

(In 1990, Canadian psychologist B.L. Beyerstein coined the word “neurobabble” as an extension of psychobabble, in which research in neuroscience enters the popular culture poorly understood, with such buzz words as neuroplasticity, Venus and Mars gender differences, the 10-percent-of-the-brain myth, and right- and left-brain oversimplifications.)

As a writer (albeit with no great claim to importance), I know how often I struggle to find the right word, phrase or metaphor to reach a level of precision that I don’t find embarrassing, cheap, or an easy deflection. Trying to find the best expression for something distinct, complex and personal — to try to be honest — is work.

This is true in all the arts: trying to find just the right brown by mixing pigments; or the right note in a song that is surprising enough to be interesting, but still makes sense in the harmony you are writing in; or giving a character in a play an action that rings true. We are so mired in habits of thought, of culture, that finding that exactitude is like flying through flak.

Recently, Turner Classic Movies ran a cheesy science-fiction film I had never seen before. I grew up on bad sci-fi movies from the 1950s and always enjoyed them, in the uncritical way a 9-year-old watches movies on television: Quality never entered the picture. At that age, I was oblivious that there even was such a thing. It wiggled on the screen; I watched.

But this film was released in 1968, too late for me. By then I had gone off to college, and the only films I watched were snooty art films. And so I never got to see The Green Slime. Now, here it was, and it was prodigiously awful. Actor Robert Horton fights an alien invasion of tentacled, red-eyed monsters.

Everything about The Green Slime was awful: the acting, the lighting, the set design, the special effects — and, of course, the science. Or lack of it. There was the garish color of the sets and costumes and the over-use of the zoom lens, in the manner of made-for-TV movies of the era. I have outgrown my open-hearted love of bad science fiction. I stared in wonder at the horribleness I was seeing on the TV screen.

And it was the acting, more than anything, that appalled me. Why were these actors so stiff, wooden, even laughable? And something I guess I had always known but never really thought about jumped to mind: Actors are at the mercy of writers. The dialog in The Green Slime was stupid, awkward and wooden.

There is some dialog so leaden, so unsayable, that even Olivier can’t bring it off. Robert Horton, while no Olivier, was perfectly fine on Wagon Train, but here he looked like he was lip-synching in a foreign tongue. 

“Wait a minute — are you telling me that this thing ‘reproduced’ itself inside the decontamination chamber? And, as we stepped up the current, it just … it just grew?”

I remember, years ago, thinking that Robert Armstrong was a stiff. I had only seen him in King Kong and thought he was a wooden plug of an actor (not as bad, perhaps, as Bruce Cabot, but still bad). But years later I saw him in other films where he was just fine. Even in Son of Kong, he was decent. But no one, absolutely no one, can pull off a line like “Some big, hardboiled egg gets a look at a pretty face and bang, he cracks up and goes sappy!”

Even in a famous movie, clunky dialog can make an otherwise fine actor look lame. Alec Guinness and James Earl Jones may have been able to pull off the unspeakable dialog of the original Star Wars, but for years, I thought Mark Hamill was a cardboard cut-out. It was only when I saw him in other projects that I realized Hamill could actually act. I had a similarly low opinion of Harrison Ford because of what he was forced to mouth in those three early franchise films. George Lucas did them no favors.

There is certainly a range of talent in movies, and some genuinely untalented actors got their parts through flashy looks or by sleeping with the producer. But I have come to the opinion that most (certainly not all) actors in Hollywood films are genuinely talented. Perhaps limited, but talented, and, given a good script and a helpful director, they can do fine work.

One thinks of Lon Chaney Jr., who is wooden, at best, as the Wolf Man. But he is heartbreaking as Lennie in Of Mice and Men — perhaps the only chance he ever got to show off what he was actually capable of.

“Lon Chaney was a stiff, but he had Lennie to redeem him,” said my brother Craig, when we discussed this question. Craig can be even more critical than me.

He continued, “I’ve been trying to think of the worst actors ever — someone who has never said a word like a human being. There are a lot of people who got into a movie because they were famous for something else (Kareem Abdul-Jabbar, Joe Louis, Audie Murphy) so it’s hard to judge them fairly as actors, like you can’t criticize a pet dog for barking the national anthem but not hitting the high notes. But even Johnny Weissmuller was pretty effective in the first Tarzan; Elvis had Jailhouse Rock where he actually played a character; and Madonna can point to Desperately Seeking Susan without shame. (Everything else, shameful. There just isn’t enough shame in the world anymore.)

“There are any number of old cowboy stars who couldn’t speak a believable line of dialog, and that can’t be totally blamed on the writing. (Gabby Hayes rose above it.) There are bad actors who still had some life and energy about them that made them fun to watch. Colin Clive was silly, and he made me giggle, so he was entertaining. And Robert Armstrong. But there’s just no excuse for Bruce Cabot.

“I’ve never actually seen a Steven Seagal movie,” Craig said, “but I know enough to say with conviction that he should have been drowned as a baby.”

I said Craig can be tougher than me, but here, I have to concur. 

“It’s probably not fair to pick out silent movie actors for being silly and over the top, but there is Douglas Fairbanks to prove you can be silent and great.”

Silent acting was a whole different thing, and hard to judge nowadays. As different from modern film acting as film acting is from acting live on stage. The styles don’t often translate. John Barrymore was the most acclaimed Shakespearean actor in America in the early years of the 20th century, but his style on celluloid came across as pure ham. (Yes, he was often parodying himself on purpose, but that doesn’t gainsay what I am saying about acting styles). 

Every once in a while, I see some poor slob I always thought was a horrible actor suddenly give an outstanding performance. Perhaps we have underestimated the importance of the casting director. A well-placed actor in a particular part can be surprising perfection. There is creativity in some casting offices that is itself an artform. You find the right look, voice, or body language, and a minor part becomes memorable. Some actors are wonderful in a limited range of roles. I can’t imagine Elisha Cook as a superhero, but he is perfect as a gunsel. 

And Weissmuller was the perfect Tarzan before his clumsy line reading became obvious in the Jungle Jim series. I am reminded of Dianne Wiest in Bullets Over Broadway: “No, no, don’t speak. Don’t speak. Please don’t speak. Please don’t speak.”

Keep in mind, actors are subject to so many things that aren’t in their control. In addition to good writing, they need a sympathetic director, decent lighting, thoughtful editing, even good costume design. Filmmaking is collaborative and it isn’t always the actor’s fault if he comes across like a Madame Tussaud waxwork. I’ve even seen Charlton Heston be good. 

In reality, I think of film actors much as I think of major league ballplayers. The worst baseball player on a major league team may be batting under the Mendoza line, and even leading the league in fielding errors, but in comparison with any other ballplayers, from college level to the minor leagues, he is superhumanly talented. Even Bob Uecker managed to hit a home run off Sandy Koufax. I doubt any of us could have done that. And so, we have to know who we’re comparing them to.

I saw a quote from Pia Zadora the other day (she just turned 70) and with justifiable humility, she said, “I am often compared to Meryl Streep, as in ‘Pia Zadora is no Meryl Streep.’” Still, compared to you or me, she is Bob Uecker. 

I have had to reassess my judgment of many actors. I had always thought of John Wayne as a movie star and not an actor. But I have to admit, part of my dislike of his acting was disgust over his despicable political beliefs. And I thought of him as the “cowboys and Indians” stereotype. 

But I have now looked at some of his films with a clearer eye, and realize that, yes, most of his films never asked anything more from him than to be John Wayne — essentially occupying a John Wayne puppet suit — but that when tasked by someone such as John Ford or Howard Hawks, he could actually inhabit a character. “Who knew the son-of-a-bitch could actually act!” Ford himself exclaimed. 

But there it is, in The Searchers, She Wore a Yellow Ribbon, Red River, The Quiet Man, Rio Bravo, The Shootist. Those were all true characterizations. (Does all that cancel out The Alamo or The Conqueror or The War Wagon, or balance all the undistinguished Westerns he made? We each have to judge for ourselves). 

Even in his early Monogram oaters, playing Singing Sandy, he brought a naturalness to his presence that is still exceptional in the genre (and researching Wayne, my gasts were flabbered at how good-looking he was as a young man. So handsome he was almost pretty. And that hip-swinging gait that predates Marilyn Monroe. “It’s like Jell-O on springs.” It seems notable that something so feminine could become the model of such lumpen masculinity.)

And even great actors have turned in half-ass performances, or appeared in turkeys. In Jaws: The Revenge, Michael Caine has to utter things like, “Remind me to tell you about the time I took 100 nuns to Nairobi!” Caine famously said, “I have never seen the film, but by all accounts it was terrible. However I have seen the house that it built, and it is terrific.”

Even Olivier had his Clash of the Titans.

Actors have to work, and they don’t always get to choose. “An actor who is not working is not an actor,” said Christopher Walken. The more talented actors sometimes get to be picky, but the mid-range actor, talented though he or she may be, sometimes just needs the paycheck. 

I sometimes think of all the jobbing actors, the character actors of the 1930s, working from picture to picture, or the familiar names on TV series from the ’50s and ’60s — the Royal Danos, the John Andersons, the Denver Pyles, each dressed as a grizzled prospector for a week on one show, going home at night for dinner and watching Jack Benny on the tube, then driving back to the set the next morning and getting back into those duds. And then, the next week, dressing as a banker for another show, putting together a string of jobs to make a living. And all of them complete professionals, knowing what they are doing and giving the show what it needs. A huge amount of talent without ever having their names above the title.

Royal Dano times four

And so, I feel pity for those actors of equal talent who never broke through, or who were stuck in badly written movies and couldn’t show off their chops. When I watch reruns of old episodic TV, I pay a good deal more attention than I ever did when I was young to all the parts that go into making such a show. I notice the lighting, the editing, the directing, and most of all, the writing. The writing seems to separate, more than anything else, the good series from the mediocre ones. And how grateful those working actors must feel when they get a script that gives them a chance to shine.