
In 1956, psychologist Benjamin Bloom published his Taxonomy of Educational Objectives, a hierarchical ranking of thought processes now generally known as “Bloom’s Taxonomy.” It has been revised and recast many times, but in most versions, simple tasks such as memorizing sit at the bottom, and creativity comes at the top.

My late wife, who was at least as smart as Bloom, had her own version of this taxonomy, and for her, the lowest level was “naming.” She taught school for more than 30 years and saw brain-burn at the individual level. Being able to say, “Horsie” or “Duckie” is naming. This is simple rote. Learn the name and repeat it when appropriate. 

Naming also shades into the second level — the level most people get stuck in — that of sorting. Finding categories and shunting the names into silos to contain them. As if that explained anything. 

The greater part of what we do with our brains is to sort things out. To put cats over here and dogs over there. When we learn, most of what we mean by that is to understand that Claude Monet was an Impressionist and that Luis Buñuel was a Surrealist. These are mere sortings. Important for a file clerk, perhaps, but more a form of busy work than of actual thinking. 

We learn a whale is not a fish, and that a spider is not an insect. We have separate categories for them, and when we recognize the categories, we believe we have actually said something meaningful about our whale or spider, when really, all we have done is play with words. 

Categories are, after all, quite fugitive, quite fungible — squishy. When zoologists first tried to classify lions, for instance, they placed them in the genus “Felis,” for they are some kind of cat. But later, it was decided they were big cats, not small ones, and so they became “Panthera.” Oh, but that wasn’t good enough, and so a new genus was established, dividing them from tigers and leopards, making them “Leo.” New category, new silo.

For a brief time, I worked at a zoo and had the opportunity to walk behind the cages and get up close to many of the animals. I can tell you that, standing with the zookeeper two feet from a male lion at feeding time (separated from Leo only by the cage bars), his head seemed the biggest thing I had ever seen, shaggy and furry, with a very particular smell, and a sense that this beast could swallow my head as if it were an M&M. And then it “purred”: a low, guttural roar of satisfaction at the afternoon meal that made the ground rumble under my feet. It was one of the most impressive things I have ever witnessed, and it mattered not a whit whether I was seeing a Felis or a Panthera or a Leo. The name was rather beside the point. The experience had a physical existence, and it didn’t need a name.

Language is not reality. And the experience — the feel of it in the palm of your hand, or in your nostrils, or under your feet — is worth all the words in the world. Words can be a barrier keeping us from what is real. 

And yet, we spend so much of our time arguing over these categories, as if they mean anything. As if they were a reality. Is Joe Biden a Socialist? Did Elon Musk actually reach outer space? Is a tomato a fruit or a vegetable? So much thought and energy to such meaningless ends. Think of all the dark money spent in political campaigns to paint the opposition into a category-corner that makes the opponent a one-dimensional boogeyman. The world and its things are infinite. 

My late wife took animals to class with her so her pupils would have actual experiences — the twitching nose of a bunny, the blank stare of a hen, the brittle carapace of a hermit crab — and then gave the kids paper and paints and let them express what they had experienced. If names were mentioned, they were the names the kids gave the animals — a rabbit named Tiffany Evelyn or a crab named Eloise. What mattered was the physical reality of the experience. Anything else is just language. Names. Categories.

Historians like to take big chunks of time and give them names: Classical, Postclassical, Late Medieval, Romantic, and so on. Then they argue over it all, because these categories are misleading and constantly changing — being redefined. But, as they say, whatcha gonna do?

Take the Middle Ages. Middle of what? Homo sapiens developed something like 300,000 years ago, by a common low-end estimate, and the Middle Ages began only about 1,500 years ago, which works out to the last 15/3,000ths, or roughly half a percent, of human history. Not exactly the middle.

But the dates we give the Middle Ages vary widely. It came after the Roman Empire. When did the Roman Empire fall? Well, you can say that the final collapse came in 1453 with the fall of Constantinople. For some people, that is already the Renaissance, squeezing out the Middle Ages entirely. But no one really believes the Byzantine Empire was genuinely Roman. They spoke Greek, for god’s sake. They were Christian.

Usually, when we talk of the fall of Rome, we mean the Western Roman Empire and the sad reign of Romulus Augustulus, which came to an end in AD 476. But really, the Western Roman Empire at that point consisted only of most of Italy, Dalmatia (roughly the later Yugoslavia), and a tiny bit of southern France.

And you could easily argue that Rome ceased to be Roman after Constantine converted to Christianity and legalized it in AD 313. After that began the slow, ambiguous transubstantiation of Roman imperialism into Medieval feudalism.

It is the great paradox of scholarship: The more you read, the more your ignorance grows; the more you learn about something, the more you discover how little you know.

Are Picasso’s paintings Modern art? His first big Cubist painting, Les Demoiselles d’Avignon, was painted in 1907. That is closer in time to the reign of Catherine the Great in Russia than it is to us. Closer to George Washington’s Farewell Address. To the Louisiana Purchase.

So, what do we mean by “modern,” and when did modernity take over? It is a slippery question. And really it is simply an issue of definition — words, not experience. We let the words stand in for reality and then let the debates begin. Reality flows uninterrupted and continuous. Categories are discrete; they start and stop.

The more you attempt to define the categories, the more they slip away. The history of academic scholarship is often the history of proving the categories wrong. It is historians who argue over the dates of the Renaissance. Or the fall of Rome, or the birth of Modernism. 

Categories are a convenience only. They are a name for the nameless.

I am reminded of the time, some 40 years ago, when I first drove west from North Carolina with my genius wife. We had never seen the great American West and eagerly anticipated finding it. It must be so different, we thought, so distinct. The West is a category. 

We were living in Boone, N.C., named for Daniel, who trod those mountains in the 1700s, when the Blue Ridge was the West. When the young George Washington worked as a surveyor on the Virginia frontier in the late 1740s, the Northwest Territory, the land that became Ohio, lay still farther out.

So, when I was driving, I knew I had already pushed my own frontier past such things, and knew in my heart that the West began on the other side of the Mississippi River. But when I crossed the river into Arkansas, it hardly seemed western. It didn’t look much different from the Tennessee in my rear-view mirror. Yet Arkansas was home to the “Hanging Judge” Isaac Parker, and it was where Jesse James robbed trains. Surely that must be the West. But no, James looked more like a hillbilly than a cowboy.

Then came Texas, which was the real West, but driving through flat, bland Amarillo on I-40 was as exciting as oatmeal. The first time we felt as if we had hit the West was at the New Mexico line, when we first saw a landscape of buttes and mesas. Surely this was the West.

Maybe, but we hadn’t yet crossed the Continental Divide. All the waters of all the rivers we crossed emptied into the Atlantic Ocean. Crossing the Divide near Thoreau, N.M., we felt we had finally made it.

Yet, even when we made it to Arizona, we knew that for most of the pioneers who crossed this country a century and a half ago, the desert was just one more obstacle on the way to California. In some sense it still wasn’t the West.

When we got as far as we could in a Chevy, and stared out at the Pacific Ocean, we knew that there was still something farther: Hawaii, Japan, China, India, Africa — and eventually back to North Carolina.

So, the West wasn’t a place you could ever really reach, but a destination beyond the horizon: Every point on the planet is the West to somewhere else.

When we look to find the beginnings of Modernity, the horizon recedes from us the same way. Perhaps it began with World War I, when we entered a non-heroic world and faced a more sober reality.

Modern Art began before that, however, perhaps with Stravinsky’s Rite of Spring in 1913, perhaps with Debussy’s Afternoon of a Faun in 1894. Some begin with the first Impressionist exhibition in 1874.

Politically, maybe it begins with Bismarck and the establishment of a new order of nations and the rise of the “balance of power.”

You can make a case that Modernism begins with the Enlightenment in the 18th century, when a rising Middle Class began to fill concert halls and Mozart became an entrepreneur instead of an employee of the aristocracy.

Or before that, in 1648, with the Treaty of Westphalia, and the first recognition of national boundaries as something more than real estate owned by the crown.

You can set your marker down with Luther, with Gutenberg, with Thomas Browne, Montaigne, Caravaggio — or Giotto.

For many, Modernism began with the Renaissance, but when did the Renaissance begin? The 15th century? The Trecento? Or did it begin further north with the Gothic, which is really the first sparking of a modern way of thinking?

Perhaps, though, the Roman republic divides modern political organization from more tribal eras before. Or you could vote for the democracy and philosophy of ancient Greece. Surely the time before that and the time after are distinctly different. We recognize the near side of each of these divides as more familiar than the distant side.

You might as well put the starting line with the discovery of agriculture in the steppes of Anatolia and the river plains of Iraq. An argument can be made for any of these points on the timeline — and arguments could be made for many I haven’t room to mention.

Perhaps the horizon should be recognized for what it is: an ever-moving phantasm. For those peasants digging in the manorial dirt in the Ninth Century, the times they were living in were modern. The first person recorded to use the term “modern” for his own age was the Roman writer Cassiodorus in the 6th Century. Each moment is the new modern.

These are all just categories, and spending our time sorting things into their file folders should not be mistaken for actual knowledge. It is words about the knowledge. 

Now, I will concede that the words help us discuss the real things, and that it is probably useful to know the difference between cats and dogs, or butterflies and moths. But categories and sorting are just a second level of thinking. After these baby steps, there is so much more that the human brain can begin working on, much more grist to be ground. And a good deal of thought that outreaches the ability of words to capture. 

The level I have been most thinking about recently is that of observing, of paying attention. Not deciding anything, or sorting anything, but just noticing. The world opens up like a day lily; so much that was invisible is made visible — things that the rush of daily life, moving things from in-box to out-box, has made too inconsequential to waste time with. There is a richness to the world that becomes a glowing glory when attention is paid.

In the days before the transcontinental railroad, a Cheyenne father would take his 10- or 11-year-old son out into the prairie and have him lie down on his belly. “Just look,” he would say. “Don’t talk, don’t decide, don’t name, just look.” And he would leave his son there for the day, not moving a whit. And when he came back to retrieve the boy, he would not ask, “What did you see?” He would say nothing. He would not need to.

So much of value is beyond words, beyond category.

When I was leaving the theater after seeing Robert Duvall’s The Apostle, way back in 1997, a loud woman in the back of the crowd screamed out, somewhat redundantly, “That’s the worst movie I have ever seen … in my entire life.”

At first, I couldn’t understand her reaction. It was a very good film, a quiet, intense character study of a Southern preacher. Perhaps, I thought, there were not enough car crashes in it, not enough glowing, cherry-red petro-explosions.

Certainly the film had not fulfilled her expectations.

And that was the sticking point. I have thought about it long and hard. Was The Apostle an outlier or a harbinger? There have been many articles written about the death of irony, yet irony refuses quite to go away. The attacks of Sept. 11, 2001, led to a brief hiccup in our otherwise comforting embrace of the snarky, but it soon returned. If we briefly took a breath and said to ourselves that some things are too real, too important to sniff at, well, it didn’t stop Stephen Colbert, and it didn’t put an end to The Onion.

But there was still something in Duvall’s film. The singular quality of the film is its lack of irony. Everything is presented utterly straight, with no snide comments under the breath, no revelation of hypocrisy, no hidden agenda. Duvall neither makes fun of the Apostle’s deeply held religion, nor does he proselytize for it: It is not a “Christian” film, but a sober look at the complexities of a Christian life, fully rounded, not the summation of a generic Christian life but a portrait of this one person. Irony depends on stereotypes, on “classes” of people, not on individuals.

This straightforwardness is rare, perhaps unique, in Hollywood, where we expect a cushion of irony to protect us from messy experience.

Irony, narrowly defined, is saying one thing but meaning another. As when we see a friend green-skinned and hung over in the morning and say to him, “You look bright and chipper today.”

In that, we are both in on the joke. Often, though, an audience is split between those who get it and those who don’t. Irony is thus used frequently as a kind of shibboleth for a clique: Those who “get it” are in; those who don’t move to a retirement community in Florida.

Irony is also a literary trope, which means its expectations are linguistic and not experiential. Most Hollywood movies set up a form, and audiences know where the story is going. A gun flashed in an early scene will, by expectation, be used in a later scene. The surprise we wait for is the when.

But The Apostle never quite does this. Each time we spot an obvious plot development, the movie goes elsewhere, and where it goes is closer to what might happen in real life than what we would normally expect in a movie.

All setups are frustrated.

Unlike almost any mainstream Hollywood film, it had no “in joke” to be in on.

Instead, the story of the Apostle E.F. is given to us as an esthetic construct, something to apprehend and appreciate, to hold in our mind, whole, as we might hold in our hands a glass orb, rotating it and seeing it from all angles.

In its lack of irony, The Apostle is an odd fit for our cultural moment. The 20th Century was a century of irony; irony has been our lingua franca. But there are some indications that as we descend into the 21st, irony has begun to wear out its welcome. It is still pervasive, but often it seems to come by rote, as in so many sitcom pilots written from some formula. Irony is tired; it wants to put up its feet and rest. We expect the irony, but we don’t really believe in it anymore. It’s just the norm, which we are also too tired to give up.

This shift away from irony has happened before: It is clearest in the change from the 18th to 19th centuries, from the irony of Alexander Pope to the sincerity of William Wordsworth.

One has only to compare the mock epic tone of The Rape of the Lock with the straightforwardness, almost blandness, of “I wandered lonely as a cloud/ That floats on high o’er vales and hills,/ When all at once I saw a crowd,/ A host, of golden daffodils.”

A younger generation back then, tired of the artificiality of the older, sought to substitute authenticity for the artifice.

There were things that were important to be said, the younger generation thought, and to be said clearly and meaningfully. The century that followed Wordsworth was a century without irony — and almost, at times, it seemed without a sense of humor.

Eventually, the century gagged on its own sincerity, so that when the new one began, the page flipped back. Poets such as T.S. Eliot and Ezra Pound stoked their verses heavily with irony, never saying quite what they meant, always approaching their subject obliquely.

We no longer trusted the Great Truth spelled out in large, direct letters, and for good reason. Too many Great Truths turned out to be miserable lies. Colonialism, Imperialism, racism, purity, idealism. There have been many deaths.

This wasn’t true only in literature. Music turned from Tchaikovsky’s grand passions to Stravinsky’s tweaked noses, art from grand historical paintings to pasted bits of daily newspapers and deconstructed violins.

One has only to compare a historical straggler such as D.W. Griffith’s sentimental Way Down East with Ernst Lubitsch’s brassy Ninotchka. It is the same change. You can see the pendulum swing, saeculorum decursum, over and over.

Between the irony and the directness there is constant battle, for neither is sufficient. Each mode has both its strengths and weaknesses. Direct sentiment soon devolves into Victorian sentimentality, so that we laugh now at the mawkishness of much of it. But irony declines into mere cleverness, so that we admire an author’s wit, without much regard for his sense.

This has certainly been the case in Hollywood. It is rare to find a film in which actors behave the way any real people behave or feel the feelings of real people. Instead, they speak in catch phrases that ring with bell-like cleverness. The plots are artificial; their resolutions preposterous.

“Hasta la vista, baby!”

“Go ahead, punk, make my day!”

“Show me the money.”

“I have had it with these motherfucking snakes on this motherfucking plane!”

On television, it is even thicker. Seinfeld was a wonderfully clever sitcom, but it was, by its own admission, about “nothing.” All style, no substance.

Most sitcoms are the same, and most hourlong dramas are numbingly formulaic.

Yet, there is a hunger for substance. It shows up in mainstream movies like Forrest Gump, where the sincerity and lack of irony of its main character seem like a breath of life. The movie itself was mildly ironic, but the character was guileless. And what is more, his earnestness — that is, his “pure heart” — won him all his prizes. (I am not defending the film as a whole, only making a point about its underlying proposal of directness and sincerity — many people despise the film for this very reason.)

In that, the tone of the movie was completely at odds with its predecessor, Being There, where we were all in on the great in-joke, as the idiot gardener, Chance, fools all the supposedly smart stuffed shirts into finding profundity in his inanities.

And just as a clever century distrusts an earnest one, the pendulum swings back and we are beginning to be unsatisfied by the cleverness. The deeper Quentin Tarantino dives into genre film pastiche, the more irrelevant he becomes. His first films were about something — the deaths in Pulp Fiction, however clever in terms of plot, were real deaths with consequences; in Kill Bill, the deaths are just tin ducks in a shooting gallery. They carry no punch.

This great cultural sea change may be due, but it hasn’t become pervasive yet. Still, there are warning signs: Sincerity has also brought us political correctness; it has brought us New Age philosophy; it has brought us any part of a Tyler Perry movie that isn’t Madea.

For, while irony requires a modicum of intelligence, sincerity is democratic: Everyone is invited — no brain too small. It runs the gamut from genius to imbecility. Not every 19th century poet was Wordsworth; heck, even Wordsworth was only Wordsworth on a good day.

The watchword for irony is skepticism; for sincerity, credulity. Blind faith in alternative medicines, UFOs and astrology is only possible in a time when our irony is eroding.

Yet, irony doesn’t get off the hook so easily, either. There are reasons some people feel compelled to give it up as the new century reaches its teen years.

The first is that irony is words, not life. It is essentially linguistic. That is, its rules and habits are linguistic rules, not experiential rules.

With irony, as with a joke, you have to have the setup and punch line come in the right order, followed by the rim shot. Out of sequence, they fall flat and meaningless.

Real life has other demands, but with irony, we translate the experience of life into the language. Language is a kind of parallel universe, divorced from reality, but somehow accepted as its mirror: When we are laughing at a joke on a sitcom, we are laughing not at life, but at language.

It is at the core of what is called Modern Art that the process becomes the subject: The painter paints paintings about paint, the playwright constructs dialogue about speech, the sculptor shows us the raw surface of stone. Modernism has been about the tools it uses.

And that is why, at the end of the Modern century, the armor of irony that has protected our egos from the embarrassment of our sentiment has begun to fall off. We demand real experience.

When that woman yelled out her frustration at The Apostle, she was complaining that her linguistic expectations — the language of film we have all become accustomed to — were violated. Robert Duvall was doing something different.

But our culture now requires of all of us that we rise above our comfortable irony and attempt to see what is actually out there, floating in reality.

And deal with it.