Monthly Archives: March 2026

Dimitri Drobatschewsky was the most erudite man I ever knew. He spoke, wrote and read in German, French, and English. Born in Berlin and raised mostly in Luxembourg, his French and German were native, down to idiom, argot and accent. He was also conversant in Spanish, Italian and Polish (at least, he said, he knew several dirty jokes in Polish). 

He was born in 1923, fled the Nazis with his family, joined the French Foreign Legion, deserted to fight with the Free French forces in Italy in World War II. Later, he became the classical music critic with The Arizona Republic, where he and I became friends. 

(Once, when confronted by a musician who had gotten a bad review, he was challenged on his credentials. “What do you think is the most important qualification for a classical music critic?” the musician demanded. “Well,” Dimitri said. “He must have a long and unpronounceable name.”) 

Because he was still a boy when he moved to Luxembourg, he was able to learn French as a native. “I had to,” he told me. “French girls wouldn’t date you if you didn’t speak perfect French.” And that’s why, after the war, when he emigrated to the U.S., he kept his accent. “American girls loved a foreign accent.”

Dimitri felt that French was the most beautiful language, for the sounds it makes in your mouth. But for Dimitri, the best poetry was in German, and further, the greatest poet was Johann Wolfgang von Goethe. For this, I had to take his word.

I don’t read German, and when I approach Goethe in translation, he sounds earthbound, even banal.

I try to hear the German in my mind to catch its melody, but I am walled out by my English. All I can gain from the reading is a commonplace. 

“Little rose, little rose, little red rose

Little rose of the heath.”

It sounds better when set to Schubert’s music, but still, in English, the words are a touch sappy, and the sentiment pedestrian.

“You have to read him in German,” Dimitri said. “The sound of words, the language is unbelievably beautiful.”

“Röslein, Röslein, Röslein rot,

Röslein auf der Heiden.”

So, I’m afraid Goethe is closed to me. I’ve read Faust several times in several translations, and it never seems to quite get airborne, yet everyone who knows the original feels it is one of the greatest works of literature ever, and that Goethe is the equal of Shakespeare. 

I have the same problem with Quintus Horatius Flaccus, or Horace, in English. In English, his poetry is flat as yesterday’s ginger ale. “You have to read him in Latin,” says my friend Alexander, whose degree is in Classical languages. “In Latin, he is truly exceptional — lapidary perfection.”

Again, I have to take his word for it. Shakespeare may have had “small Latin and less Greek,” but my Latin is even smaller than the Bard’s. I studied it in eighth grade, and mostly what I recall is “agricola.” 

I freely confess it is my loss. But there it is; I am stuck with it. 

There are those who hold that all literature is untranslatable, that you have to read it in the original language, and while I concede that you can never get all of a poem in a translation, nevertheless, I feel there is a class of work that functions perfectly well shapeshifted. 

I can read my Homer not only in English, but in multiple translations, from Chapman to Pope to Fitzgerald to Fagles, and I am sucked in by the poetry every time. It may very well be better in Greek, but it’s the best thing I’ve ever read even in English. I reread the Iliad once a year, and try to find a new translation each time. (I read the Odyssey, too, and I especially love the translation by T.E. Lawrence — Lawrence of Arabia. Who knew?)

The same thing happens with Dostoevsky. I’m sure it’s better in Russian, but even a good translation moves swiftly and powerfully and I am rapt by the story and moved by the humanity. There is a swift current underneath the surface of language. 

It can make a difference which translation you read. I am told by those who know that the Scott Moncrieff translation of Remembrance of Things Past is closest to the quality of Proust’s French, yet I find his English stuffy and outdated. The newer translations — by a range of translators for Viking (Swann’s Way is translated by Lydia Davis) — are easier to digest and flow with the quickness that ensures pleasure in the reading. But am I getting the pith of Proust? My French is better than my German, but it is still small beer.

Constance Garnett gave us English versions of what must be every Russian novel ever written. She was a factory. And her versions are still the most widely read. But the more recent translations by Richard Pevear and Larissa Volokhonsky are much easier going. The duo now seem to be challenging Garnett also for the sheer number of volumes converted.

Tolstoy’s War and Peace, in the English of Louise and Aylmer Maude, is the most profound and moving piece of literature I have ever read, despite the profusion of names. How much better would it read if I could understand it in Russian (and French, let’s not forget)? Its power transcends its tongue. 

This all raises the question, however, of why Homer or Tolstoy can be read in translation and Horace cannot. And the reason, I believe, is that greatness in writing comes on two essential levels: content and style. That is, how deeply it connects with our human-ness, on one hand, and on the other, how deeply it connects with its medium. This is not an either-or situation; there should always be awareness of both sides. But in practice, one side or the other tends to predominate. The more it is the universal connection with life and experience that we read, the easier the literature can travel. The more it is the words themselves, the more insular the audience.

It would be difficult to illustrate this dichotomy if we try to look at examples of foreign literature translated to English; we would need to be conversant with the original language to see how it morphs in the conversion. But consider attempting to translate several English authors into some other language.

Shakespeare tends to travel well. His plays are valued in many lands and many languages. There are famous examples of Macbeth in Swahili, of Hamlet in Russian, and dozens of operatic versions in Italian, French and German. They all pack a wallop. And Shakespeare is loved in all those languages by their native speakers.

On the other hand, how in hell can you translate John Milton into French? You can tell the story of Paradise Lost, sure, but how can you convey the special organ-tone quality of his language?

“Round he throws his baleful eyes.”

Translate it into French and it comes out as the equivalent of: “He looks around malevolently.” Not the same thing; all the poetry has gone out of it. Deflated; a flat tire.

Or: “When I consider how my light is spent.”

It is only in English that the word “spent” has the two meanings: a spent taper; or money (or life) spent. The word in the opening of his sonnet “On his Blindness” has a nimbus of ambiguity about it. The primary meaning is that he is now blind, but he spreads the halo out from the word “spent” by following it up with several other financial words: “the one Talent which is death to hide” where a talent is also a biblical monetary denomination, and brings to mind the New Testament story of the servants and the talents, and the poor servant who is “cast into the outer darkness, where there will be much weeping and gnashing of teeth.” And then there is, “present my true account,” and its hint of double entry bookkeeping. It is this expansiveness in language that is the key to Milton’s greatness. He is large; he contains multitudes. But they are bound in English, anodized, as it were, not separable. How do you work that magic in French? Or German? Or Japanese?

These things are untranslatable, and hence, Milton can never have the global currency of Shakespeare. 

Or consider translating Chaucer from his own time to ours. The poetry — the sound of the words, phrases, sentences and stanzas — cannot hypnotize us as the original does. Yes, we get the sense, but we miss the art.

“And smale foweles maken melodye, that slepen al the nyght with open ye.”

Or imagine James Joyce in German. The melody is gone. “Stattlich rundlich Buck Mulligan…” 

If I turn to a poet I love very deeply, and whose language I can parse, his work survives translation very well. Pablo Neruda’s Spanish is so transparent that the ideas embodied in it are clearly seen in any lingo. That is because Neruda’s primary concern in his poetry is not language, but experience. There are real pears and plums in his poetry, real life and death, real love, real sex, real toes and real stones. The poetry is about the things of this world, and not the way we express them.

The poetry is highly wrought, and in Spanish, there is a linguistic layer Neruda also cares about, but the power of the poems comes from Neruda’s connection with his own life, his own experience, and that is possible to share in any language.

“Quiero conocer este mundo,” “I want to know this world,” he says in his Bestiario/“Bestiary.”

“The spider is an engineer,/ a divine watchmaker./ For one fly more or less/ the foolish can detest them:/ I wish to speak with spiders./ I want them to weave me a star.”

Language is a mask. Behind it there is a world. You can concentrate on the language, or on the world. It is easy to be lulled into forgetting the difference, to think that words describe the world, and that the best language is the most accurate lens on the things of this world — este mundo — but they are not the same; rather, they are parallel universes, and what works in words does not necessarily explain how the world functions. In reality, there are no nouns, no participles. There is only “is.” Can you squeeze that “is” through words? We try. And we try again.

Like most critics, I’ve written my share of Top Ten lists over the years. Most of them, whether about music or movies or books, tend to be made from what I consider the best, deepest, or most meaningful entries — classics. Consensus choices made by informed critics who have read, seen and heard enough of their subjects to make their lists meaningful. The best on these lists can perhaps make you a better person, but not necessarily happier. 

So, there can be another list, not of the highest and best, but of those things we simply enjoy, for whatever reason. After all, what we simply enjoy isn’t always the most profound or most brilliantly written, acted, or edited. And sometimes we have to admit there are things we just like. And we will watch them over and over again. 

Yes, I know the idea of watching a movie over and over doesn’t make sense to some people. I have discussed this with someone who wondered, “I’ve seen it and know how it ends, so why would I watch it again?” As if the point of a film were its plot. 

And there are films that function only on a story level, and perhaps once you’ve learned the plot twist, or uncovered the killer, there is no further reason to return to the movie. I have movies like that: I enjoyed them well enough the first go-through, but have no overriding desire to take that ride a second time. 

But there are movies I want to see over and over, the way you like hearing a favorite tune. You don’t say, “I’ve heard that song, so why would I listen to it again?” It’s a tune. It’s fun to hear again. And don’t call me Shirley. 

A list of such films will be personal. I don’t expect everyone to jump on the bandwagon. Such a list is almost a Rorschach test, explaining the personality of its maker. Make your own list and see it as a mirror. 

And so, here’s my list of top favorite movies that never stop satisfying. Some are movies I watch over and over and just enjoy every time; and others I don’t have to watch all the way through, but just love particular scenes and if I am channel surfing and come across them on Turner Classics, even if I catch them in the middle, I will watch through to the end, just to catch some of those scenes. Some of these are genuine classics, but others just tickle a certain place in my brain. They are fun. 

Number One on my list is a perfect example. No one would claim it has great acting or brilliant dialog. In fact, it is embarrassing on both counts. But it hits a sweet spot in the mythological nerve button in my psyche. I have seen the 1933 King Kong over a hundred times. 

Admittedly, this includes all the times I watched it as a 5-year-old from behind the couch to hide from the scary parts, when it was being shown a dozen times a week on WOR-TV’s “Million Dollar Movie” on New York television. I chalked up a boatload of views in the years before I even went to high school. 

Even now, 70 years later, I will still tune in when it shows up on the TV listings. Its appeal is the same as those wonderful Gustave Doré wood engravings of dark forests and the light that shines through.

So, King Kong is first on my list. Second couldn’t be further from the spirit of the Big Monkey picture: The Seventh Seal by Ingmar Bergman. 

I first watched it, like Kong, as a boy when it was just a cool movie about Medieval knights. It was on TV, and since it was released in 1957, I had to be at least 9 years old before I saw it. Probably a few years after that. I next saw it as part of my college movie series, along with a raft of other art films. That’s when it hit home. 

The movie gets shown a lot, both on TV and in various film series, and so I have had the chance now to see it probably 30 times or so. It is pretty much the defining title of the “art film.” I became a foreign film junkie in college and most of my favorite films are either in French or Swedish, with Italian clocking in third. 

No. 3 on my list is French, and it is a film that I have seen many times over the years and that has changed drastically over multiple viewings. I first saw Children of Paradise in that college film series, and at that age, it was the yearning idealism of Baptiste Deburau that spoke most directly to me. I was Baptiste. Yes, I know that’s embarrassing now, but then, his earnestness seemed the very nugget of truth. And my heart went pitter-pat for Garance: “Love is simple,” she said. And so it seemed to one of my tender years. We are all idiots at that age.

Later, I came to identify with the actor Frédérick Lemaître, accommodating and joyfully cynical. Of course, that, too, was just a costume to try on. The same, later on, with the antisocial Lacenaire. 

As I sped through the years, seeing the film differently each time, I finally came to see the characters as comprising a whole, and I identified with their shared humanness, each suffering and causing suffering in turn and trying to make a way through life.

Next, a movie that never changes, but delivers the goods every time: My Man Godfrey, with William Powell and Carole Lombard. Of all the great screwball comedies from the 1930s, it is the most perfect. Perfect plot; perfect casting; perfect dialog; perfect direction. 

I do not know how many times I’ve watched Godfrey, but it never wears out its welcome. Of all the films on this list, Godfrey comes closest to the favorite-tune comparison, where you perk up on hearing it and it just brightens your day.

Rounding out the top five is Akira Kurosawa’s Seven Samurai, which I first saw in the butchered 141-minute trim that first made the rounds of the U.S. Of course, the original 207-minute version has since been restored, and only a barbarian would choose the mutilated version.

I have watched Seven Samurai too many times to count, and it was Takashi Shimura, as the samurai leader, rather than Toshiro Mifune who grabbed my attention. Shimura was Kurosawa’s mainstay actor, appearing in more of his movies than anyone else (and also in Godzilla). Mifune could sometimes be a bit buffoonish in his roles. Shimura had a much greater range.  

In 1978, when I was living in Seattle and unemployed, I went to a bar one night with my friend, Alice. Turns out, they were setting up a projector to show a 16mm print of Seven Samurai. We decided to watch at least the beginning of the film — Alice had never seen it, and we knew it would be more than three hours of movie — so we didn’t expect to stay. But neither of us could turn away and we watched till the end. There are no slack parts. 

Those are my top 5, but there are more. How can I have seen these movies so many times? Well, first there were VHS tapes and then DVDs. I have them all now on disc. (King Kong was initially a problem, unavailable on disc, apparently over a rights issue, but I managed a bootleg tape recorded off a TV showing. Now, I have the Warner box set, also with Son of Kong and Mighty Joe Young.) 

Then, there is Turner Classic Movies, the one great treasure of cable TV, which shows most of these movies periodically. Often when channel-surfing, I will come upon one of my faves and pick it up mid-stream and watch till the end. 

There are films on this list that I often come across this way, and don’t feel the need to watch beginning-to-end, but have such delightful scenes in them, that when I catch them, I watch for those moments. 

Pulp Fiction, for instance. A great film overall, but scene-by-scene even better. I can watch for certain set-pieces without feeling I need to do the whole thing. 

Same with My Cousin Vinny. The courtroom scenes are a great tune, but I don’t need the set-up. Just give me some Marisa Tomei attitude, some Joe Pesci and the best role that Fred Gwynne ever had. 

If we count those as Nos. 6 and 7 on this list, that takes us to:

The Baker’s Wife, a film I saw years ago and then it disappeared. No DVD, no TCM. I scoured Amazon for a Region 2 disc, and eventually found a miserable, low-rez copy, the kind with subtitles whited out by the background. Eventually, years later, a restored version became available. This 1938 Marcel Pagnol comedy stars Raimu as a provincial French baker whose young wife has run off with a younger man. The baker is so dejected, he stops baking and the village tries everything to get the wife back so they can have their bread. It’s a great film. 

No. 9 would then be Metropolis. Several of these movies were among those I first saw as a boy, and so I have watched them repeatedly over six or seven decades. Many years ago, the local New York NET channel (pre-PBS) had a film series that included Fritz Lang’s Metropolis, albeit in a shortened 90-minute cut, but it hypnotized me. I later saw a version on TV with an electronic score that seemed utterly surreal matched with the images. 

I have sought out ever-more complete versions of the film, now clocking in at two-and-a-half hours, with a few stills edited in to account for missing footage. It is still mesmerizing. (I have written about it extensively; link here). 

And bringing up the rear of this Top Ten list would be Key Largo, a picture made the same year I was born. It is certainly not the best Bogey-Bacall film, but one I first watched as a boy, before I knew who Bogart was, or anything about his mythic persona. For some reason it clicked in my memory, and it seemed to show up on TV over and over, without my asking.

Even now, I’ll watch it. It massages a familiar place in my brain. And it isn’t the stars I watch for: Claire Trevor’s drunk moll is the best thing in the movie. She deserved the Oscar she won for a movie that normally would not even be mentioned by the Academy.

That rounds out the Top Ten, but in all honesty, I have to admit they really should not be ranked at all. Rather they are in a very large pool of films that I watch repeatedly. I can’t tell how many times I’ve watched The Big Sleep, or any of many parts of it (I have practically memorized the opening scene with General Sternwood). Or Casablanca. Or even To Have and Have Not. Or any of the William Powell films, including any of the Thin Man series. Or Roland Young’s Topper. Most any Buster Keaton film, short or feature. 

Or, to spread the love, I have watched uncounted times: Airplane!; Blazing Saddles; This Is Spinal Tap; O Brother, Where Art Thou? — that mostly for the tunes. 

Really, the list gets ridiculous. Any Almodovar, any Bergman, any Renoir. Any screwball comedy, any black-and-white Fred Astaire (he aged well, his later movies haven’t). I have a soft spot for any of the non-spaghetti Westerns of Clint Eastwood. Who’d a thunk it? Josey Wales, Hang ’Em High, High Plains Drifter, Pale Rider. Oddly, I still haven’t seen Unforgiven. I’ll get around to it, eventually, but first, TCM is showing Godfrey again. 

The established conventions of movie monsters have changed over the years, as have the monsters themselves. But if the rules have evolved, it is still rules that define monsters. Ask any 12-year-old boy; they will be able to recite you chapter and verse, the way a Supreme Court clerk can quote the Constitution. Silver bullets, crucifixes, wooden stakes, wolfsbane, mirrors, the whole concatenation of parameters that define the world inhabited by the undead, the re-dead and the soon-to-be-dead. 

I know this because when I was 12 years old, I had a subscription to Famous Monsters of Filmland, a fan magazine about horror films put out by noted monsterologist Forrest J Ackerman. It was still early days of television, and local independent TV stations, with no network to support them, had to scramble to fill air time. They found old cartoons, old Three Stooges shorts, old Our Gang comedies, old Westerns — and horror films. I must have seen Frankenstein, Dracula and all their permutations, from “Son of …” to Abbott and Costello, maybe, a hundred times. 

And I knew then all the rules — the defining conventions of each genre. The Frankenstein monster couldn’t talk; fire was his kryptonite; Dracula was terrified of crosses; the Wolfman turned hairy with the full moon. If a vampire or a werewolf bit you, you turned into one; if Frankenstein’s monster bit you: nothing. You got a tetanus shot.

Then, there are zombies. Originally a minor player in the Hollywood monster movie, they have become, since George Romero, one of the most common forms of monster. I recently saw someone who had the perfect solution to the zombie problem. It depends on the recent brain-eating conventions of zombiehood. Why no one had thought of this before, I don’t know. It is so obvious. 

If your community is plagued by such zombies, all you need to do to survive is to dress up like a zombie yourself, put on some rags, apply the whitish, ghostly makeup, with some ketchup drooling from the corner of your mouth. Zombies don’t attack other zombies. I don’t know why, but they don’t. So, act like one, and be let alone. Of course, you will also need to avoid the living human population, who have a dismaying tendency to blow the heads off zombies with shotguns. But other than that, home free. 

When they made their celluloid debut, zombies were Haitian, and they were essentially sleepwalkers. They were derived from popular understandings of vodou (aka “voodoo”), in which a bokor, or sorcerer, could raise the dead to act as slaves. In 1929, author William Seabrook published The Magic Island, which described a sensationalized version of vodou and zombies.

In 1932, Hollywood produced White Zombie, in which Bela Lugosi is the evil sorcerer who puts the heroine under his spell when she visits Haiti. Lugosi’s character, “Murder” Legendre, uses zombie labor to operate his sugar plantation. And commit murder. 

Plantations worked by enslaved Africans gave rise to zombie mythology in Haiti and the Caribbean, but many of the Hollywood versions feature non-African zombies, although one of the best, I Walked with a Zombie (1943), sets the zombie back into its proper African-Caribbean context. Still, its main victim remains a White woman. 

But zombies were a minor offshoot of the monster movie, which gave pride of place to Frankenstein, Dracula and the Wolfman (and perhaps the Mummy). Only a few zombie films were made in the ’30s and ’40s (the heyday of the classic monster), and they never achieved cultural ubiquity. That didn’t happen until Romero reinvented them in 1968 in Night of the Living Dead and rewrote the rules for the genre. Now, they were ghoulish undead that shuffled along in rags and killed and ate the living. A bite could turn you into one of them. Tetanus shots didn’t help.

The prototype for Romero’s shambling zombies can be found in the “Wandering Sickness” in Things to Come, a 1936 movie made from the H.G. Wells novel. Zombies are more recently allowed to be fast-moving, which makes them harder to avoid. And zombies en masse are in practice unkillable, as shown by the never-ending 10-year run of The Walking Dead on the AMC network. 

With Night of… and its many sequels and rip-offs, the zombie briefly became the primary movie boogie-man. The rules have been tweaked by subsequent writers and directors, so that now the popular conception is of a reanimated corpse who eats brains. Why brains? I don’t know. Nothing wrong with liver and a nice chianti. 

The running zombies, in turn, gave way to teen-oriented exploitation of the genre. Now, zombies can be attractive young zombies in Pushing Daisies, Warm Bodies and iZombie, which is “in many ways the same transformation [of the zombies] that we have witnessed with vampires since the 1931 Dracula represented Dracula as essentially human—a significant departure from the monstrous representation in the 1922 film Nosferatu,” noted writer Scott Rogers, pointing out that nowadays both zombies and vampires can be hot teen idols.

This changing of the rules is common, as creators need to find ways to freshen up the cliches, only to make new cliches. Eventually, each monster genre ends up in parody: Young Frankenstein, Teen Wolf, Abraham Lincoln: Vampire Hunter. And so, you have Shaun of the Dead.

With Hammer Films in the 1960s, monsters were given a garish new color makeover, with lots of bodice-ripping and jiggle to entice testosterone-soaked adolescent males. And the classics have never really left. Wikipedia lists 236 werewolf films made since the silent era; 119 of them just since 2000. I couldn’t fully count the number of vampire films, including the astonishing number of naked-lesbian-vampire movies that came out of Italy.

A few of my favorites (I can’t help but list some of these titles): Billy the Kid Versus Dracula (1966); Dracula’s Dog, aka Zoltan … Hound of Dracula (1977); Uncle Was a Vampire (Italian, 1959); The Vampire and the Ballerina (Italian, 1960); Samurai Vampire Bikers from Hell (1992); A Polish Vampire in Burbank (1985); Mom’s Got a Date With a Vampire (2000); and My Babysitter’s a Vampire (2010). 

Vampires have gone through four major transmogrifications. Originally, in folklore, they were ghouls, ugly monsters. But after Bram Stoker’s Dracula (1897), the vampire became a seductive man with hypnotic charms over beautiful women. Or, in a kind of reverse ploy, sex-starved women in heavy makeup craving the blood of handsome men — or, in the lesbian vampire films, pneumatic young women.

(The subgenre of lesbian vampires is extensive. Wikipedia lists more than 50 such films, beginning with 1936’s Dracula’s Daughter and continuing through Vampyros Lesbos (1971); The Hunger (1983), a classy film with David Bowie and Catherine Deneuve; and Lesbian Vampire Killers (2009), a genuine turkey that starred late night TV’s James Corden. While many monster films have been adopted by the LGBTQ world as metaphors of queerness, the lesbian vampire is more transparently so.)

The canonical vampire persisted in the Hammer Films pictures with Christopher Lee. Sunlight could kill them; they couldn’t be seen in mirrors (or photographs); wolfsbane or garlic was a prophylactic; they slept in coffins; and they could be killed with a wooden stake through the heart or, more garishly, with the wooden stake and a quick beheading.

Then, of course, it all changed with Anne Rice. She took the side of the vampires — “these elegant, tragic, sensitive people,” she called them. Oh, they suffered, cursed as they are with immortality. Rice’s vampires are “loquacious philosophers who spend much of eternity debating the nature of good and evil,” according to Susan Ferraro of The New York Times. “Rice turns vampire conventions inside out.”

She also transports them from England and Germany to New Orleans, which adds its own patina of the gothic. Goodbye Transylvania, hello red blood and Rice. 

None of which prepares us for the recent incarnation of teenage moony-eyed vampires by Stephenie Meyer. They have no trouble with daylight, indeed they sparkle. But it is so hard being a vegetarian vampire. 

There are also werewolves in the Twilight books (and films), but it is hard to tell the difference between the feuding vampires and werewolves of Twilight and the Sharks and the Jets of West Side Story.

(I have actually watched Twilight, when my twin 10-year-old granddaughters wanted to see it on TV and made me sit with them through it. They loved it. Me? Well, I love my granddaughters.) 

From the very beginning, horror movies have been aimed primarily at the young and prepubescent. That is the age at which the fascination with regalia and the insistence on genre-rule consistency harden like old cheese left out to dry. It is the motivating impulse of cosplay, Comic-Con and arguments over whether the Batman outfit should have nipples or not.

(If you want to start a fight, just complain about the Marvel Universe versus the DC Universe. Each is a self-contained cosmos with its own physics and set of back stories. And woe to him who mixes up Marvel’s Sub-Mariner with DC’s Aquaman. Or cannot tell the difference among Green Arrow, Green Lantern and Green Hornet.) 

In the CBS sitcom The Big Bang Theory, the minutiae of competing comic book universes are often a plot point. Which makes it amusing how often fans like to point out inconsistencies in the Sheldon Cooper Universe between BBT and Young Sheldon. Can’t these writers keep their stories straight?

When young men are trying to figure out the rules of life, it must be comforting to find these worlds where coherence and consistency are part of the deal. One of the reasons that superheroes have overtaken monsters in the movie world must surely be that the DC and Marvel universes are so absolutely clear, even hide-bound, about their rules. The monsters have their laws of physics, too, but their rules tend to morph over time. Adolescence craves something more permanent to depend upon.

Young women have their say in all this, too. But the monsters they fantasize about tend to be more like Beauty and the Beast: Can the rough monster be tamed by love? Their guiding genius is not Bram Stoker but rather, Jean Cocteau. Their corollary is not the Marvel Universe, but the land of Romance Novels. In each one, a monster is tamed by love. Underneath is a prince of a guy. Hence, Twilight

In the case of either gender, there is comfort in the consistency of the conventions, of the rules. 

Frontispiece from Mary Shelley’s “Frankenstein” 1831 edition

The great-granddaddy of all the monsters is, of course, Frankenstein. The first film version was made in 1910, and the most recent just came out with Maggie Gyllenhaal’s The Bride. Overall, some 469 known feature films have been made, along with 236 short films, while 93 TV series and 394 TV episodes feature some version or interpretation of the Frankenstein character. Some have been attempts to tell some variation of the story created by Mary Shelley in her 1818 novel; others just transport the monster into unrelated plots, or perhaps put the monster on the Moon or into World War II.

The 1910 version, made by the Edison Studio, has the monster literally cooked up in a steaming cauldron. The movie lasts only about 16 minutes but includes many of the recurring tropes. It can be watched on YouTube. Recommended. 

The most famous was the 1931 film, directed by James Whale and starring Boris Karloff as the monster. The makeup for Karloff became, for decades, the defining look of the monster, with his flat head and neck bolts. The look lasted even until Fred Gwynne wore it in TV’s The Munsters in the 1960s. 

The UK’s Hammer Films gave the monster a total re-imagining beginning in 1957 with The Curse of Frankenstein, giving him googly eyes and what seemed to be a terminal case of eczema. The series of sequels featured garish color photography and lots of heaving bosoms and gushing blood. 

A compost heap of exploitation films brought the monster to teenagers in the ’60s and beyond with titles such as Frankenstein Meets the Space Monster (1965), Jesse James Meets Frankenstein’s Daughter (1966), Frankenstein and the Monster from Hell (1974), Frankenstein’s Mother-in-Law (1983), or Frankenstein: The College Years (1991). 

And the monster showed up in various TV episodes as a familiar cultural icon, from the Colgate Comedy Hour in 1951 through the Carol Burnett Show in 1972 and more than a hundred times since then, even through South Park last year.  

There have also been earnest attempts at making grown-up versions of the book, with varying degrees of fidelity. Kenneth Branagh tried in 1994 with Mary Shelley’s Frankenstein, with Robert De Niro playing the monster, to a lack of enthusiasm on Rotten Tomatoes. The cast was so loaded to the brim with familiar British actors you might have thought you were watching Midsomer Murders.

Van Helsing (2004) reanimated the monster and joined him up with Dracula, who unaccountably is both vampire and werewolf. It got a measly 24 percent on Rotten Tomatoes and critics complained about too much obvious CGI. Really, it was just silly. 

Last year’s Frankenstein by Guillermo del Toro took the story as seriously as Mary Shelley did, and although it changed quite a lot from the book, it was nevertheless the closest in spirit to the original I have ever seen. 

And currently The Bride tries to tell the story from a sort-of feminist point of view and as a musical (sort of). I have not seen it yet. It is getting mixed reviews, but is clearly a serious take on the story. 

These last two have gotten most of the PR, but it should be noted that 2025 also gave us The Abominations of Frankenstein by Eric Yoder, I Am Frankelda, a Mexican stop-motion animation, and Stitch Head, an animation by Steve Hudson.

And coming soon to theaters near you — or most likely streaming or direct to DVD: The Monster Hop; Frankenstein by Micah Ignacio; and Frankenstein in Romania by Radu Jude.  

I counted a dozen feature films with the single-name title Frankenstein and scores more with the name in the title: Son of…; Terror of…; Bride of…; Curse of…; Revenge of…; etc., and those like Frankenstein Meets…; Lady Frankenstein; Frankenstein: The True Story; or Frankenstein: Italian Style. It never seems to go out of fashion. There are good ones, bad ones, comic ones, drive-in ones, a surprising number of porn ones. They run from tacky to artful. But they all follow a dependable set of conventions, even if the rules evolve over time. 

And we shouldn’t forget the best reinterpretation of Mary Shelley’s story, Young Frankenstein, the best made of all Mel Brooks’ movies, with true reverence for the craftsmanship of the old Universal films, but, you know — funny.  

There is a great literature on the psychology of the attraction we have to monsters. I leave that to the experts. Is it the metaphor for the Id? The fear of death? The recognition of the threat the outside world presents? A parable of the societal outsider? The Aristotelian projection of terror and pity? Probably all these things at various times. But I am suggesting that one of the pulls of the genre is its suggestion of stability, that the monster itself will abide by the rules, and that, after the stake through the heart or the silver bullet, things will always go back to normal — until the sequel.

Fifty-seven years ago, while on the Apollo 9 mission orbiting Earth, astronaut Rusty Schweickart was floating outside the capsule in his space suit and had a moment to look out through his “fishbowl” helmet at the planet under him. 

“And you look down there,” he said, “and you can’t imagine how many borders and boundaries you cross, again and again and again [as you orbit the Earth every 90 minutes]. And you don’t even see them.” 

It’s a common refrain from those who have gone to space. Moon astronaut Buzz Aldrin said, “From space there were no observable borders between nations, no observable reasons for the wars we were leaving behind.”

Senator Bill Nelson, who flew on the Space Shuttle Columbia in 1986, said, “In space, you don’t see boundaries or borders. We are all citizens of Earth.”

The blue marble

We draw lines where there aren’t any. National borders are just one case. We draw distinct lines between species in biological taxonomy, we name historical eras, we invent racial exclusions — and we talk about these arbitrary lines as if they were fences between properties. But they are legal fictions, and continually malleable. Political borders shift; taxonomy reshuffles its categories. Red politicians vs. blue politicians? Really, they are all just grey men in blue suits. Their squabbles are parochial at best. 

The issue is that we understand the world in discrete chunks, but nature comes in indistinct swathes. In order to discuss or argue, we pretend there are clear lines. Perhaps we have to; everything all together at once is confusing. 

Robert Rauschenberg “Lucky Dream”

Nature, however, draws few lines. It spreads and includes. It changes constantly; it is never static. “Everything flows,” as Heraclitus put it. Seed into sprout into flower into seed into … 

I’m not saying there are no differences at all, but rather that the lines we draw tend to be arbitrary or at least, blurry. Are the red people for smaller government or a more powerful presidency? Such issues shift over time and it’s impossible to pin them down. 

Take the past, for instance. Historians like to take big chunks of time and give them names: Classical, Postclassical, Late Medieval, Romantic, and so on. Then they argue over it all, because any good academic historian knows that the names we give big chunks of time are misleading. But, as they say, whatcha gonna do? We seem to be stuck with them. 

The Middle Ages, for instance. Middle of what? Homo sapiens developed something like — in a common low-end estimate — 300,000 years ago, putting the start of the Middle Ages roughly in the last 1/200th of human history — the last half of one percent. Not exactly the middle.

And the dates we give the Middle Ages vary widely. Where do you draw the line? It came after the Roman Empire. But when did the Roman Empire fall? Well, you can say that the final collapse came in 1453 with the fall of Constantinople. For some people, that is already the Renaissance, squeezing out the Middle Ages entirely. And no one really believes the Byzantine Empire was genuinely Roman. They spoke Greek, for god’s sake. They were Christian.

Usually, when we talk of the fall of Rome, we mean the Western Roman Empire and the sad reign of Romulus Augustulus, which came to an end in AD 476. But really, the Western Roman empire at the time consisted only of most of Italy, a tiny bit of France, and Dalmatia (later aka Yugoslavia, later still — well, you know).

And you could easily argue that Rome ceased to be Roman after Constantine converted to Christianity and legalized it in AD 313. After that, the slow slide from Roman imperialism into Medieval feudalism began its ambiguous transubstantiation.

It is the great paradox of scholarship: The more you read, the more your ignorance grows: The more you learn about something, the more you discover how little you know. 

We think of our current era as modern. But when did that begin? It is a slippery question. I am reminded of the time, some 50 years ago, when I first drove west from North Carolina. I had never seen the great American West and eagerly anticipated finding it. It must be so different, I thought, so distinct.

We were living in Boone, N.C., named for Daniel, who trod those mountains in the 1700s, when anything beyond the Blue Ridge was the West. When George Washington surveyed the Northwest Territory in the late 1740s, he was measuring out what became Ohio.

Blue Ridge

So, when I was driving, I knew I had already pushed my own frontier past such things, and knew in my heart that the West began on the other side of the Mississippi River. But, when I crossed the river into Arkansas, it hardly seemed Western. It didn’t look much different from Tennessee, in my rear view mirror. Yet, Arkansas was home to the “Hanging Judge” Isaac Parker and where Jesse James robbed trains.

Surely Texas was the West, but driving through flat, bland Amarillo on I-40 was as exciting as oatmeal. The first time we felt as if we had hit the West was at the New Mexico line, when we first saw a landscape of buttes and mesas. Surely this was the West.

Maybe, but we hadn’t yet crossed the Continental Divide. All the waters of all the rivers we had crossed emptied into the Atlantic Ocean. Finally, crossing the Divide near Thoreau, N.M., we felt we had made it.

Yet, even when we got to Arizona, we knew that for most of the pioneers who crossed this country a century and a half ago, the desert was just one more obstacle on the way to California. In some sense it still wasn’t the West.

When we got as far as we could in a Chevy, and stared out at the Pacific Ocean, we knew that there was still something farther: Hawaii, Japan, China, India, Africa — and eventually across the Atlantic to Cape Hatteras and back to North Carolina.

So, the West wasn’t a place you could ever really reach, but a destination beyond the horizon: Every point on the planet is the West to somewhere else.

When we look to find the beginnings of Modernity, the horizon recedes from us the same way. Perhaps it began with World War I, when we entered a non-heroic world and faced a more sober reality.

Modern Art began before that, however, perhaps with Stravinsky’s Rite of Spring in 1913, perhaps with Debussy’s Afternoon of a Faun in 1894. Some begin with the first Impressionist exhibition in 1874.

Politically, maybe it begins with Bismarck and the establishment of a new order of nations and the rise of the “balance of power.”

You can make a case that Modernism begins with the Enlightenment in the 18th century, when a rising Middle Class began to fill concert halls and Mozart became an entrepreneur instead of an employee of the aristocracy.

Or before that, in 1648, with the Treaty of Westphalia, and the first recognition of national boundaries as something more than real estate owned by the crown.

You can set your marker down with Luther, with Gutenberg, with Thomas Browne, Montaigne, Caravaggio — or Giotto.

For many, Modernism began with the Renaissance, but when did the Renaissance begin? The 15th century? The Trecento? Or did it begin farther north with the Gothic around AD 1150, which is really the first sparking of a modern way of thinking? 

Perhaps, though, it is the Roman republic that divides modern political organization from the more tribal eras before. Or you could vote for the democracy and philosophy of ancient Greece. Surely the time before that and the time after are distinctly different. We recognize the near side of each of these divides as more familiar than the distant side.

You might as well put the starting line with the discovery of agriculture in the steppes of Anatolia and the river plains of Iraq. An argument can be made for any of these points on the timeline — and arguments could be made for many I haven’t room to mention.

Which leaves us the ultimate question: Is Modernism now over? Done with? Have we moved on, or is what we deem Postmodernism really just the next manifestation of the Modern? Perhaps AI is the new line drawn in history. 

Perhaps the horizon should be recognized for what it is: an ever-moving phantasm. For those peasants digging in the manorial dirt in the ninth century, the times they were living in were modern. The first person recorded to use the term “modern” for his own age was the Roman writer Cassiodorus in the sixth century. Each moment is the new modern.

Scholars know all this very well, and make their arguments in books and treatises, almost always with the caveat about drawing lines hard and fast. But the convenience of giving names is too seductive, and leaves the popular imagination with images like Monty Python’s “Bring out yer dead” or Elizabeth Taylor’s Cleopatra. Can we talk about the past without the labels we give it? 

We need to understand the world is not binary, but a borderless spectrum of experience. Hawaii is now part of North America and Iceland is part of Europe. Electrons are particles and waves. Poland grew immense and shrank, disappeared completely and reappeared, and picked up its skirts and moved 200 miles to the west. And so, I wince every time I hear a red politician tell us with misbegotten certainty what gender roles should be, or that “male” and “female” are hard, definable categories with no subtleties. Or that “Left” and “Right” are hard-and-fast places where we must construct our redoubts. 

I admit we need words, categories, borders, and definitions to be able to communicate. We need to cut up our steak in order to eat it. But I would wish we could be more humble about their actual reality. 

Imagine nothing. Got it? Now, imagine that not even nothing exists.  For after all, nothing is something. At the very least “nothing” implies its opposite, and I’m asking you to imagine a time before opposites are even possible, before time is possible. 

Then, imagine a point, the way geometry defines a point, with no dimensions. This point is something. But it can exist for only a billion-trillionth of a second — although a second is something that doesn’t really exist yet. The word “yet” implies that a future does exist, however, and in that infinitesimal fraction of eternity the point — which is everything that exists or ever will exist — “expanded,” physicists tell us, although that word cannot adequately express the explosion. In fact, the universe ejaculated into both something and nothing. It gave rise to particles and antiparticles and we were off to the races.

Chaos 1

It is important to note that the “point” did not expand into a great big empty nothingness but rather something and nothing together expanded — and they keep expanding, even as we sit sipping our tea and watching Big Bang Theory in endless reruns on TV. There is math to show this, but you wouldn’t understand it. I certainly don’t. It’s complicated. 

“Alice laughed. ‘There’s no use trying,’ she said. ‘One can’t believe impossible things.’

“‘I daresay you haven’t had much practice,’ said the Queen. ‘When I was your age, I always did it for half-an-hour a day. Why, sometimes I’ve believed as many as six impossible things before breakfast.’”

As it says in the Tao Te Ching, “Thus something and nothing produce each other.”

Now, it is 13.799 billion years later, and the universe is still expanding, ever faster and faster. And we are riding on one meager little mote in that great soup, called the planet Earth. It is something. Now, “nothing” is what exists between the bits of “something.”

That is our Creation Myth. 

By calling it a myth, I am not implying it is not true, or not factual. Myth does not mean something is untrue, but means it is our way of comprehending what is beyond our actual understanding. 

Myth is our explanation to ourselves of something. It may be factual, it may be fantastical. It may be taken literally or it may be understood as metaphor. Either way, it is an approach to the comprehension of something too complex to be held in the mind any other way. 

Chaos 2

A physicist may be able to put the math together and parse out the myth in non-mythic terms (I use the word “may” advisedly), but for the rest of us, we take it on faith that our creation myth is scientifically verifiable and therefore, factual. It is the myth we believe in, i.e., the story we take as true. (That it is true is irrelevant to its function as myth). 

We mistakenly tend to look on myth as something from the past: Zeus or Achilles, or Odin, or Indra fighting Vritra, or Quetzalcoatl, or the Chinese dragon. It is something we condescend to, having learned better. We know that thunder isn’t clouds crashing together. But such an attitude misunderstands myth and its function. We all live by myth, even now. 

Chaos 3

There are things we do not or cannot understand, either too complicated to grasp or just plain unknowable. We need a metaphor to help us come to grips with such things. Language cannot describe them with the precision of a dictionary; rather, it has to fall back not on “what it is,” but on “what it is like.” We tell a story.

The Big Bang is our story. When we assume our superiority, we fail to understand that most of us are relying on the argument from authority no less than the Middle Ages did. We must trust that the physicist knows what we merely accept. (I am making the assumption that a physicist has a more complete understanding than even an educated lay person.)

And since we cannot know every corner of relativity or quantum mechanics, we simplify it all into a comprehensible story. The Big Bang. 

Chaos 4

I am not claiming what science has parsed out is false, but that our understanding as non-scientists is a mythological understanding, not a literal one. And for that matter, I doubt any scientist is equally conversant in all aspects of the field — relativity, quantum theory, the math and the particle physics. Perhaps he or she has a good grasp on black holes, but how much has he or she published on quarks with spin? Specialization is necessary for modern science, and even a scientist has to rely on and trust the work of others. 

And it is important to remember that not all scientists agree. The popular version of the Big Bang is certainly wrong, or at the very least wildly simplified. New theories are always coming forth. No version is entirely consistent and coherent, not even the Bible’s. 

Chaos 5

All of which has taken me off my point: creation myths. There are so many of them, from the Chinese cosmic egg to the Mesopotamian butchery of the sea goddess Tiamat. The one we in the West are most familiar with is that of Genesis. 

“In the beginning God created the heaven and the earth. And the earth was without form and void; and darkness was upon the face of the deep. And the Spirit of God moved upon the face of the waters. And God said, Let there be light.”

We are so used to the organ tones of the King James translation that sometimes putting it into modern English takes away some of the majesty. 

“When God began creating the sky and earth, the earth was formless and empty.”

Chaos 6

There are many believers who take this story literally, just as most of us take the Big Bang. For most of us, the Bible story is a story. So, if we had to stake our lives on it, we would more likely defend physics — even if we were Christian believers — and accept that ancient Middle-Eastern poetry is just that. The King James Genesis is transcendent poetry. But so is our story of the Big Bang.  

“Mythology opens the world so that it becomes transparent to something that is beyond speech, beyond words, in short, to what we call transcendence,” said scholar Joseph Campbell.

“The energies of the universe, the energies of life, that come up in the sub-atomic particle displays that science shows us, are operative. They come and go. Where do they come from? Where do they go? Is there a where?”

Which returns us to the Big Bang. 

Chaos 7

Physicist Paul Dirac in 1930 imagined a where: Now called the “Dirac Sea,” it is an infinite or unnumbered source of subatomic particles that exist “beneath” our visible world. An electron may pop up anywhere, as quantum physics has shown, and may disappear also. Where they come from, where they go is the Dirac Sea. Using the nautical term is another case of mythology making familiar what cannot be grasped otherwise. 

Imagine a billiard table, he said, completely covered in balls, leaving no room. Place another ball on top and it sits there, on top of the rest. But push it down and that forces another ball to pop up elsewhere. We cannot predict where. The one above the rest is the one we see and measure, the rest, below, are the “Dirac Sea,” unavailable for study in the visible universe. 

Yes, it’s a story. Most of us would run screaming from the math involved in a more proper explanation. 

Chaos 8

“The ultimate ground of being transcends definition, transcends our knowledge,” said Campbell. “When you begin to ask about ultimates, you are asking about something that transcends all the categories of thought, the categories of being and non-being. True, false; these are, as Kant points out in The Critique of Pure Reason, functions of our mode of experience. And all life has to come to us through the esthetic forms of time and space, and the logical ones of the categories of logic, so we think within that frame,” he wrote.

“But what is beyond? Even the word beyond suggests a category of thought. So transcendence is literally transcendent.”

Chaos 9

Vedic mythology has many creation stories, but the one most widely seen has the Brahman, or the ultimate ground of reality, as the source of all. However, as it says in the Upanishads, “Brahman” is just a word, and already it is a distortion of the ultimate, which is beyond words, beyond category, beyond comprehension.

That ground of all knowledge, Campbell said, is beyond comprehension. “In the Kena Upanishad, written back in the seventh century BC, it says very clearly, ‘that to which words and thoughts do not reach.’ The tongue has never soiled it with a name. That’s what transcendent means. And the mythological image is always pointing toward transcendence and giving you the sense of riding on this mystery.”

Pillars of Creation

So, we look at the Hubble image of a portion of the Eagle Nebula and have named it “The Pillars of Creation.” It is a transcendent image, and fills most of us with genuine awe. But of course, it is a photograph in false color: It would not look that way if seen by a human eye through a telescope. It is a myth. Again, I am not saying it is not true — even the false color is true in its way — it provides a way to see wavelengths that cannot register in a human eye, but are there nonetheless.

But let us go back again to that bit before “something” and before “nothing” — those pairs of opposites. 

In many recorded myths, before anything, there was Chaos. We should not be fooled by modern science’s version of Chaos Theory. In that, chaos is just something so complex it cannot be predicted by mathematical formula. But mythological Chaos is something else again: It is to order what eternity is to time — which is not simply forever, but rather outside of time altogether. 

Likewise Chaos in myth is not a lack of order, but something outside the very idea of order. It is before the organization of “categories of thought,” and cannot be described either in words or algebra. 

Chaos 10

The word comes from the Greek χάος, meaning “emptiness, vast void, chasm, abyss,” related to the verbs χάσκω and χαίνω, “gape, be wide open,” from a Proto-Indo-European root that gives rise, millennia later, to the English “yawn.”

In Shelley’s Prometheus Unbound, the Spirit of the Hour talks about “The loftiest star of unascended heaven,/ Pinnacled dim in the intense inane.” And here the poet’s unfortunate word choice doesn’t mean insipidity, but rather translates from the Latin word inanis, which meant “empty,” or “void.” And so I like to think of chaos as an intense emptiness. 

Chaos 11

My favorite Creation myth is found in the opening of Ovid’s Metamorphoses: “Before the sea was, and the lands, and the sky that hangs over all, the face of Nature showed alike in her whole round, which state have men called chaos: a rough unordered mass of things, nothing at all save lifeless bulk and warring seeds of ill-matched elements heaped in one.” 

In his De Rerum Natura (“On the Nature of Things”), the Roman writer Lucretius (ca. 99-55 BC) comes very close to both modern astrophysics and quantum mechanics, although told in mythic terms rather than mathematical formulae. 

Chaos 12

For Lucretius, the universe has always existed. Nothing can be created from nothing, he wrote, nor can it be destroyed — anticipating the conservation of matter and energy. But the universe originally was an undifferentiated mass of atoms, all traveling in straight lines, he wrote — anticipating Newton’s First Law of Motion — but oddly the atoms had an irrational tendency to “swerve.” This unaccounted-for divergence of the atoms’ direction led them to bump into each other, to make concentrations of matter in some localities and voids in others — very like the astrophysicists’ explanation of how the cooling of the Big Bang led to unequal distribution of matter in the early universe through gravity and density fluctuations. 

Map of cosmic background microwave radiation

We have an image of this in the map of the cosmic microwave background radiation, discovered by accident in 1965, which supported the Big Bang theory — and Lucretius — with actual data. The two scientists who found it at first thought the data was “noise” in the signal caused by pigeons nesting in the Holmdel Horn Antenna in New Jersey, where they did their work. Turned out, no, it was light left over from the early universe, released when the primordial soup of protons and electrons first cooled enough to condense into atoms. 

And so, the Big Bang is now part of the wider public’s sense of where the universe came from. Physicists and cosmologists worry over the many corners that don’t neatly fit, and reams of arcane mathematical formulae are published to support one idea or another. Standard model relativity doesn’t fit, tab-into-slot, with quantum mechanics, and scientists scratch their heads and try new ideas. And they sometimes torture the English language with things like “temperature inhomogeneities,” and torture math with paragraph-long formulas with no actual numbers in them. Looks good on a white board, but it’s Greek to me. 

Someday a newer common myth will be created to refine or supplant the version of Big Bang that the educated laymen currently accept. Again, it doesn’t mean the myth isn’t true, but that it is a story that attempts to make the unexplainable facts comprehensible to the puny human mind, which evolved, after all, when we were still banging rocks together. Sometimes it’s like watching a monkey trying to open a jar of pickles.