Pointe du Raz, Brittany

Finistère. The end of the world. It is one of my favorite places, wherever it can be found.

La fin de la terre.

Today’s finistère is the western end of Brittany, at the Pointe du Raz. The rocks dip down to the surf, the ocean rises over them and washes off in a white foam. A mile out into the water, several large rocks stick up – large enough to call islands – and three lighthouses rise, each farther from the shore; the middle one seems to grow out of the surf itself, with hardly enough rock to form a foundation.

There is a permanent mist in the air, graying the far cliffs and dimming the contrast. The plateau above the rocks is a heath covered with thistle, nettle and teazel. Low grasses blow in the constant wind. Seabirds circle overhead and insects buzz in the undergrowth. Heath flowers, tiny and bright – blue, yellow, white – are so insignificant, you have to be looking for them to find them. But when you do, they burst in your eye like the sweet, acid first bite of a piece of fruit.

I love these places where the earth ends, where the rock weathers, the tideline is speckled white with barnacles and black with kelp. It doesn’t matter much whether it is here in Brittany, Land’s End in Cornwall or the Cape of Good Hope in South Africa, where the Atlantic and Indian oceans clash.

Though they don’t bear the name, the same rocks and tide can be had at Canada’s Gaspé Peninsula or Schoodic Point in Maine.

Schoodic Point, Maine

Cape Hatteras in North Carolina doesn’t have the rocks, but it has the storms, the pounding waves and a similar sense of being where the planet gives out.

That is the main point: that there is some place where the banality gives way to the immensity, where the office cubicles and mowed front lawns are forgotten behind us as we front a broad, tide-swept unknown.

The grayer the better, the masses of swirling whites of seafoam against the black shadows in the water and the gray clouds overhead swifting in the wind.

Here, the elements are butting heads like rutting rams. Some fights reach a conclusion, a winner is named, a peace concluded. But this is perpetual. The fight is the goal, not the means.

You stand, even in summer, with a brisk wind making your face numb with cold, and the horizon is nothing but the limit of your own perception.

Finistère. The end of the world, where the old maps say monsters must live. Where fishermen bob in boats while pulling in nets of sardines or mackerel, where the houses are plain except for their blue window shutters, where no trees grow, where the sky is never static, but always blown across with grizzle, mizzle and chapping cold.

Cape of Good Hope, South Africa

When we find ourselves repeating the patterns of suburban living, barely noticing the stars at night, or the sunrise in the morning, it is good to shake ourselves clean in a place where nothing matters except the climate, the weather, the rocks, the seawater and the air.

Lincoln seated

What we know best of Abraham Lincoln is his face. Gaunt and furrowed, with hollow cheeks, dark, shadowed eyes and a cowlick in his hair, as if he didn’t quite know what to do with a comb.

It may be partly that we know him from black-and-white photographs, but it is a gray face that already seems to be cut from stone. It is a monumental face.

It is also an ambiguous face. Perhaps because of the longer exposures needed to make those portrait photos in the 1860s, his expression is blank. We don’t have images of Lincoln smiling or talking. He is always sitting still, waiting for Mathew Brady to say, “OK, thank you, Mr. President. We are done.” And he can then relax, stretch his long legs and perhaps crack a joke. “Mr. Brady, it reminds me of the one about …”

But the static pose means we look at the face and read into it what we want: fatigue, care, wisdom, faith, despair, grief or distraction. Maybe just boredom.

Lincoln 1865

And it isn’t only in the photographs that Lincoln is more mirror than figure. Everything in his life is ambiguous, and lets us find the Lincoln that reflects the image of ourselves. What we know best is his face, but it tells us the least.

Nathaniel Hawthorne called it “this sallow, queer, sagacious visage, with the homely human sympathies that warmed it.”

The words are from Library of America’s Lincoln Anthology, a collection of writings about our 16th president edited by Lincoln scholar Harold Holzer. It contains a century and a half of writing, from the recollections of his contemporaries to recent ruminations and lucubrations by writers such as Gore Vidal and Garry Wills. Each has his own Lincoln.

In it, you can read the shifting versions of the man. For some he is the preserver of freedom, for others he is a bloody tyrant. Some saw a sober man of deep reflection, others a jokester who would jibe at the most inappropriate moments. Some saw him as a statesman, others as an opportunistic politician.

For some he was a pious Christian; for others, he was an atheist.

You can find proof of any of these views in the more than 10,000 books that have been written about him.

Did he prosecute the Civil War to end slavery or to preserve the Union? There is evidence for all these Lincolns, and more.

Recent books have suggested everything from a closeted gay Lincoln to one afflicted with one or another form of undiagnosed genetic neurological infirmity.

Perhaps nothing illustrates the problem better than Lincoln’s attitude toward race. His early writings can be hard to read nowadays, from a post-Obama perspective: “There is a physical difference between the white and black races which I believe will forever forbid the two races living together on terms of social and political equality. And I … am in favor of having the superior position assigned to the white race,” he said in a speech in 1858. He was an avid user of the N-word, hardly excused by the fact that almost everyone did back then.

Yet freed slave Frederick Douglass, who frequently held Lincoln’s feet to the fire over slavery, after meeting him in 1863 wrote in his autobiography that Lincoln was “the first great man that I talked with in the United States freely who in no single instance reminded me of the difference between himself and myself, of the difference of color.”

“He was big enough to be inconsistent,” wrote W.E.B. Du Bois in 1922. “Cruel, merciful; peace-loving, a fighter; despising Negroes and letting them fight and vote; protecting slavery and freeing slaves. He was a man – a big, inconsistent, brave man.”

There are three explanations for Lincoln’s inconsistencies.

The first and easiest is that he was a politician and said to people what they wanted to hear. This is certainly true in many cases.

The second is that his views changed over time, and that what he said in 1858 cannot accurately reflect what he believed in 1863. This is also true. Few presidents show so much growth in office as Lincoln.

But the third is more important: He was simply a complex man. He seemed able to hold opposing points of view in mind at the same moment.

Unlike many later presidents, who cling unwaveringly to their ideologies, Lincoln approached the world as a multifarious and varied place that a single point of view cannot encompass. This allowed him a pragmatic flexibility.

One of the lies of history is that it follows a script. We tend to see history as a story with a beginning, middle and end. We know how the Civil War turned out. But in the midst of it, no one knew how it would turn out, and Lincoln faced new problems every day and no single policy would work in every case.

And every day presented choices that could lead to an infinity of results: Behind every door, a dozen more waited to be opened.

“I claim not to have controlled events, but confess plainly that events have controlled me,” he said in 1864. His genius – his political and moral genius – is to have been a great surfer of events, turning this way or that to maintain his balance and herd those events toward the general goal he held firm to.

“My policy is to have no policy,” he once said.

It is what made him the right man for the job in 1860: An abolitionist could not be elected; a Southern sympathizer would only have prolonged the agony, like his predecessor, James Buchanan. Lincoln’s steadfastness of purpose was matched with a flexibility that many saw as unprincipled. Yet, that flexibility is what worked.

“The scars and foibles and contradictions of the Great do not diminish but enhance their worth,” Du Bois wrote.

“I love him not because he was perfect, but because he was not and yet triumphed.”

Patroclus and Achilles

Why should the first book written in the Western tradition still be the best book ever written in the Western tradition?

I’m talking, of course, of the Iliad, a book so beautiful, so profound, and so inclusive, as to remain unsurpassed by Tolstoy, Proust or even Dave Barry.

I try to reread it at least once a year, and in different translations. It never fails to delight me. More, it never fails to move me deeply.

For the ancient Greeks, Homer was as close as they had to a bible. It seemed to them everything one needed to know was included in the two books he is credited with writing, and the hymns he is supposed to have composed.

It reminds one of Caliph Omar, blamed by some for burning the great library of Alexandria, who supposedly said that the library was not needed, because everything we need to know is in the Koran, and if the books in the library contradict the Koran, they should be destroyed, and if they agree with the Koran, they are superfluous.

Clearly, not everything is in the Iliad: It is a little thin on women, for instance. The Odyssey does better by that count. But it is astonishing how much actually is included.

And what is also astonishing is how clear-eyed Homer is: How unblinking in the face of both good and evil. It is all there, and narrated with hardly a nod to one side or the other. This is the world that is, not the world as we should like it to be.

There is, it is true, a good deal of the Hellenic world view that is foreign to us today, but that hardly matters. The book seems modern anyway, even cinematic. (Not that any movies made of the Trojan war are anything but embarrassing — blame that on Hollywood, not on Homer.)

He writes like a movie camera: Written details play in our minds as if we were seeing them on a screen.

In Homer’s Iliad, when the Trojan warrior Hector has killed one of his Achaean (Greek) enemies, he “planted a heel against Patroclus’ chest, wrenched his bronze spear from the wound, kicked him over flat on his back.”

That’s cinematic.

Critic Roger Ebert talks about a movie cliché he calls the “fruit cart,” when a falling kung-fu fighter or a careening car knocks over a table or fruit cart and spills produce all over the screen.

Odysseus killing the suitors, by John Flaxman

In Homer’s Odyssey, the hero comes home to find his estate infested with villains. He kills them all, starting with the head bad-guy.

“Odysseus aimed and shot Antinous square in the throat with the arrow’s point stabbing clean through to the nape of the neck and out the other side. Antinous pitched to the side, his cup dropped from his grasp as the shaft sank home, and the man’s life blood came spurting from his nostrils in thick red jets. His foot jerked forward and kicked the table and food showered across the floor, bread and meat soaked in a swirl of bloody filth.”

Fruit cart!

It is especially in the area of graphic violence that Homer anticipates Hollywood.

There was a time in movies when the bad guy got shot, grabbed his chest and keeled over. In 1967, Bonnie and Clyde turned death by gunshot into a slow-motion ballet of bodies jerked like marionettes punctuated by squibs popping like bubble wrap.

Since then, Hollywood has upped the ante, and the ballet of graphic gore has gotten more sophisticated, more precise and more messy. No one can be shot nowadays without a shower of blood spattering the wall behind him like spray paint.

In just 20 lines of the Iliad, Homer kills off half a dozen heroes in bloody style. Here’s a sampling:

“Thrasymedes stabbed Antilochus right in the shoulder and cracked through the bony socket, shearing away the tendons. Then he wrenched the whole arm out and down thundered Antilochus and darkness blanked his eyes. …

“Peneleos hacked Lycon’s neck below the ear and the sword sank clean through, leaving Lycon’s head hanging on his body by only a flap of skin. The head swung wide and Lycon slumped to the ground. …

“Idomeneus skewered Erymas straight through the mouth, the spearpoint raking through, up under the brain to split his glistening skull, teeth shattered out, both eyes brimmed to the lids with a gush of red and both nostrils spurting, mouth gaping, blowing convulsive sprays of blood. He was a corpse as he hit the ground.”

Tarantino is playing catch up.

But it isn’t merely in gore that Homer is realistic. He describes everything from the food to the landscape as if he were a gobbling camera, eating up the full existence of life. And not, like some novelist, in different chapters, but in a single sentence he can telescope from the entire battlefield down to the iris of a bee’s eye, and then back out again in the space of five or 10 words. It leaves one not with the grand view and not with the microcosm, but with a clear sense that they co-exist in a single space, a single comprehension.

Something else that should be said is that Homer survives translation. There are many great poets who live so completely in their native soil, they cannot be shipped overseas without loss of savor. Goethe in his mother tongue is the great poet; Goethe in English seems like the bearer of bromides and platitudes. Horace cannot survive the journey from Latin to English without sounding rather like Polonius.

But Homer works whether in a straight interlinear or in Pope’s heroic couplets or in Stephen Mitchell’s newest colloquial English translation. The power of Homer is in his sweep, not in his preciosity.

Pope Iliad

It is what Longinus praised in his On the Sublime.

Homer even works in the quaint antique style of George Chapman. In fact, Chapman is a great treat for someone who loves the Queen’s English.

Outside the King James Bible, there is probably no English translation of anything more famous than Chapman’s Homer.

But it is famous for being praised in John Keats’ sonnet, rather than for itself. Hardly anyone alive has actually read Chapman.

Chapman issued his translations, first of the Iliad and then of the Odyssey, from 1598 through 1615, overlapping the publication of the King James Bible in 1611. Like that Bible, it is written in a knotty Elizabethan-Jacobean style, when English was first stretching its muscles and testing its power. It is profligate in its verbal extravagance.

Keats wrote that Chapman speaks out “loud and bold,” and he certainly does.

 

“The man, O Muse, inform, that many a way

Wound with his wisdom to his wished stay; 

That wandered wonderous far when he the town

Of sacred Troy had sacked and shivered down.”

 

I love that: “sacked and shivered down.”

Of course, you lose something to gain something. Chapman’s word inversions, to fit his meter, make the words sound as archaic as the more obscure parts of King James, and can interfere with understanding the sense. You keep having to stop and reparse the sentence to understand just what he means to say.

On the other hand, there is a strength and nobility to the stylized expression.

A modern translation, even one as wonderful as the recent version by Robert Fagles (which I recommend to first-time readers over Chapman), may be clearer and have its own felicities, but it doesn’t give us anything as palpable as that shivering city: “Sing to me of the man, Muse, the man of twists and turns driven time and again off course, once he had plundered the hallowed heights of Troy.”

The course between elevated style and easy comprehension has to be charted by each of Homer’s translators. Do you try to make him into an adventure novel, like W.H.D. Rouse, or do you look for the astonishing words that make poetry?

The opening of the Iliad has a line about how the anger of Achilles has caused the death of many men. An interlinear and literal translation says that the hero “prematurely sent many brave souls of heroes to Hades and made them prey to dogs and to all birds of prey.”

It is a grim image, of corpses littering the battlefield and being chewed on by animals.

Samuel Butler translates the same passage as “pushing brave men under the sod, feeding young men to dogs and to vultures.”

Alexander Pope, in one of the most enduring translations, gives that as:

 

“The souls of mighty chiefs untimely slain;

Whose limbs unburied on the naked shore,

Devouring dogs and hungry vultures tore.”

 

The text most often taught in colleges is Richmond Lattimore’s. He gives that same passage as:

 

“hurled in their multitudes to the house of Hades strong souls

of heroes, but gave their bodies to be the delicate feasting

of dogs, of all birds.”

 

The American poet Robert Lowell has it:

 

“threw so many huge souls into hell,

heroes who spilled their lives as food for dogs and darting birds.”

 

So you can see there is quite a wide latitude of possibility in translation, to make the sense clear and to make the imagery potent.

Here is Chapman’s opening:

Chapman's Iliad

To give a sense of how Chapman fits into all this, we should look at the opening of Book XI of the Odyssey.

It tells about Odysseus (Chapman uses the Latin version of the name, Ulysses) and his men setting off from the island of the witch-goddess Circe.

A.T. Murray’s nearly literal translation for the Loeb Library offers:

 

“But when we had come down to the ship and to the sea, first of all we drew the ship down to the bright sea, and set the mast and sail in the black ship, and took the sheep and put them aboard, and ourselves embarked, sorrowing, and shedding big tears.”

 

It is a passage translated by T. E. Lawrence — yes, that is Lawrence of Arabia — as:

 

“At length we were at the shore where we lay the ship. Promptly we launched her into the divine sea, stepped the mast, made sail and went: not forgetting the sheep, though our hearts were very low and big tears rained down from our eyes.”

 

Pope lets these words fly quick and smooth:

 

“Now to the shores we bend, a mournful train,

Climb the tall bark, and launch into the main:

At once the mast we rear, at once unbind 

The spacious sheet, and stretch it to the wind;

Then pale and pensive stand, with cares oppressed, 

And solemn horror saddens every breast.”

 

Pope has expanded the passage to make it fly gracefully. Chapman goes the other way, with his crabbed, organ-tone Elizabethan style:

 

“Arrived now at our ship, we launched, and set

Our mast up, put forth sail, and in did get

Our late-got Cattle. Up our sails, we went,

My wayward fellows mourning now the event.”

 

Over and over, in Chapman, you come across the gnarly masticated consonants that act as rocks in the clear stream of vowels. Reading him can be like walking barefoot on gravel.

But at its best, it makes you taste each word as you utter it. You cannot speed-read the poetry: You must measure each syllable.

There are parts where Chapman seems to be on autopilot, and where his poetry ventures into the banal, but at his best Chapman gives us a view of Homer as sublime, the version of Homer that Longinus praised, as being bigger and more awesome than our ordinary course of experience.

Oh, but what might have been. There is a translation of a passage of Homer that accomplishes what Chapman did, only more so, and with a sweeping poetic power that no one has ever matched.

One wishes, though, that Ezra Pound had finished a full translation of Homer. That cranky, crazy, fraudulent genius has given us the best, most noble, most poetic, and at the same time most comprehensible translation of parts of the Odyssey in his opening section of The Cantos.

It has the best of Chapman and the best of Pope, with a 20th-century irony and an Anglo-Saxon vocabulary that dances and sings. How I wish he hadn’t gone all loony and filled his Cantos with Chinese and economics, and had instead spent that same time giving us all of Homer.

 

“And then went down to the ship,

Set keel to breakers, forth on the godly sea, and

We set up mast and sail on that swart ship,

Bore sheep aboard her, and our bodies also

heavy with weeping.”

 

Well, if we can’t have a full Pound, Chapman will more than suffice.

Homer bust

 

George Gordon, Lord Byron

Forgiveness may be the most important human quality of all.

But I don’t mean forgiveness in the Christian sense. That has always seemed to me a kind of condescension: it posits forgiveness from a point of moral superiority. I am better than you; therefore, I forgive you.

The forgiveness I am talking about has its origins somewhere different. Perhaps it is in a kind of Buddhist point of view, that all life is suffering.

We all suffer. It is the foundation of life. Even when we live privileged Western lives of economic comfort, filled with closets of clothes, TVs and WiFi computer hookups, we all face the pain of losing someone we love, heartache in love, betrayals at work. Whether it is the death of a pet at an early age, or the death of a child, we all, as human beings, face emotional pain.

It is universally human.

But what else is universally human is that we cause suffering. We are on both the receiving and the giving end of suffering. We may not have wished to cause pain, but we have. We have caused the heartache of others, just as others have caused the heartache we have suffered.

This is also universal.

So, when we face the evils that others have caused us, we should approach that problem with humility: We forgive not because we are better, but because we aren’t.

It is only fair to recognize that the evils done to us are simply the result of our being human.

That doesn’t mean we have to put up with the sufferings: We can forgive the husband who beats us, but we also separate ourselves from the cause of suffering. Forgiveness does not mean being an idiot. It just means that the wife beater does what he does from the pain and ignorance of his own humanity. Something has caused him to be this way; we probably don’t know what, but we can be sure it is something.

We forgive, but we make sure we never put ourselves — and especially our children — in a position to be beaten again.

The mirror of this is the ability to forgive ourselves. We can recognize the evils we have caused, the pain we have meted out and we can forgive ourselves, because we also know we are not immune from being human.

That doesn’t mean we excuse our vileness; it means we rectify it to the best of our ability. We try always to be a better human being. But we cannot spend our lives reliving the hurt we have caused.

We are no better than the wife beater; we have caused suffering in this world, too, and the worst thing is, it cannot be avoided.

There is a trap, however, in recognizing our own culpability.

The worst thing you can do when recognizing the pain that you have caused in the world is to wallow in it, like Lord Byron, wearing the mark of Cain and reveling in your sense of your own evilness. This is a sentimentality.

The Byronic mode is essentially a way of claiming to be special in the world: It is a magnification of your internal mythology, that idealized sense of yourself.

We all live two lives. That is, we have at least two different selves.

The first is the public self, the self that lives among the now 7 billion others on the planet and interacts with them. That self is objectively infinitesimal in the scheme of things, but strives to live and work in that small section of humanity where you have daily interaction.

The second self is internal. It is the self that is the protagonist in the novel of your own life. You are writing this novel every day, and some chapters are more dramatic than others, but you find yourself mythologizing your self as the hero, or anti-hero. This self has an importance all out of proportion to its numerical importance in the world.

When this internal self takes on the Byronic mantle, it is asserting its uniqueness and its importance in the world, demanding that others take note of just how evil one is, just how powerful one is, and how much pain and suffering one has caused.

The Byronic self needs to acquaint itself with the public self: The other 7 billion people have hardly noticed you; your evilness is really just your common humanity, attached to a self-regarding sense of guilt.

One has to learn to forgive oneself, because not to do so is to brag.

We are not better than anyone else, we are not worse: We are human and to be human is to cause and suffer pain, and the only sanity to be had is to forgive others, forgive ourselves, and move on.

There is nothing more American than the Three Stooges throwing a pie in the face of a soprano warbling “Voices of Spring” at a soiree.

The opposition of high and low culture has been so ingrained in us that it may come as a shock to find out that high culture as an idea is a fairly recent invention. And an unfortunate one.

It is one more cultural polarization stuck in the nation’s gullet. Not only blue and red states, but also the slack-jawed triumph of pop culture and the watering down of art.

Symphonies and ballet companies struggle more than they ever have. Arts commissions fund safe populist and folk art rather than more difficult work. Cities even consider closing public libraries.

It didn’t have to be this way.

It wasn’t until the middle and late 19th century that Americans with newfound wealth went to Europe, bought paintings and sculptures to prove their good taste and formed symphony societies and opera guilds to promote a form of culture that set them apart from the hoi polloi.

It isn’t that high art didn’t exist before high culture, but that before the bluebloods took it over, high art was just one end of a continuum of arts and entertainments that spanned everything from the drawing room to the barroom.

Thomas Jefferson, for instance, played fugues on his spinet and Virginia reels on his fiddle. There was no cognitive dissonance in this. It was assumed that those who wanted a more complex experience, who wanted to face the difficult paradoxes of life, would naturally gravitate to Euripides or Shakespeare, and those who wanted simpler pleasures and diversion would read Smollett or Poor Richard’s Almanack.

And it would be perfectly reasonable for a music lover to listen to Handel in one mood and sing a bawdy ballad in another.

This was true in both America and Europe. We think now of Mozart as being for the educated elite, but when he was alive, organ grinders on street corners played his arias, in simplified form. “Non più andrai” could be struck up by a group of jolly topers at the public house. It was a good chune.

It was the same for the other classical composers of the time.

That all this changed, we can blame on the Victorians. And worse, on Victorians in America.

First, in England, Matthew Arnold wrote, in an influential essay titled “Culture and Anarchy” (1869), that culture is “contact with the best which has been thought and said in the world.” He separated all that was pure and holy, all that was serious and elevated, all that was morally improving, and put it in one pile, called “culture,” and relegated all that was trivial, simple, vulgar and fun to the pile marked “kitsch.”

Like all truly Victorian thinkers, Arnold wanted to improve the world.

What had been a continuum was reduced to a polarity: High Culture vs. Slob Culture.

It can hardly come as a surprise that “culture” became a synonym for snobbery.

In America, this general tendency to “improvement” — a buzzword of the 19th century — became mixed with a revivalist religious fervor and the many strains of utopian idealism. Art, like religion, was supposed to make you a better person.

Victorianism gave us the Metropolitan Opera, the civic symphony orchestra, the art museum and the Harvard “five-foot shelf.”

In part, all this moral rectitude was a cover for vicious social Darwinism. Those who had made a fortune could buy the respectability of good taste. They improved themselves, in the jargon of the time. They deserved to be rich.

But America has an opposing force: From the time of Andrew Jackson on, there has been a second stream of American culture — although it would never use that word — that mistrusted education, mistrusted privilege, mistrusted any elite.

These are the denizens of the writings of Artemus Ward and Finley Peter Dunne. They valued common sense over erudition, elbow grease over talent and simple pleasure over taste. They tended to think of the overly educated as effete, unmanly – “pointy-headed intellectuals.”

Until recently, these two strains coexisted in America, and acknowledged each other. Our intellectuals praised the common man, and although that common man never went to the opera, he recognized that his betters did go.

Many things went into the overthrow of this peaceful coexistence.

Certainly one irony is that the educated elite ceased to have any faith in their own superiority when they learned the wrong lesson from the civil-rights movement.

The social consciousness of the 1950s and ’60s did not come from the silent majority, for instance. It came from the elite in alliance with the oppressed. Both groups wished to replace the inequalities of society with social equality. Civil rights, social welfare, equality before the law, were all ideals that any educated and sensitive person could sign on to.

But, translated into “no one is better than anyone else,” there was no longer any reason to value Rembrandt over, say, Thomas Kinkade. Even the college-educated believed it was just a matter of personal preference.

In America, even the intellectuals are anti-intellectual.

That’s where we are now: Instead of the death of Victorianism bringing back the full spectrum of human endeavor, it has marginalized the high end and glorified the bottom end. Those few remaining High Enders lament in books and Commentary articles the end of civilization and loss of the canon of Western literature.

At the other end, those many fans of Adam Sandler don’t even know Plato ever lived. They no longer throw a pie in the face of the soprano: Sopranos are no longer important enough to be targets. They throw pies at each other, gleefully and pointlessly. And make that American Pie.

But it isn’t all gloom.

The decline of high art is a temporary reaction to its history and its usurpation by moneyed classes.

The fact is, the high end is not going to go away. It cannot. There are people — of all classes, races, genders and ethnicities — who require an art of greater complexity than popular culture usually affords them. Whether you compare Kenny G with Miles Davis or Danielle Steel with Virginia Woolf, it hardly matters. The simpler, less comprehensive art is always eventually cloying or monotonous. Popular music is annoyingly repetitious; mass-market fiction helplessly shallow; the standard television fare is “chewing gum for the mind.”

All of it can distract us for a time, and some of it is genuine fun, but after a while, to anyone with an active mind, it becomes a droning in the ear, like a mosquito in the bedroom. There is a hunger for something deeper, more complex, something that reflects the human experience, or attempts to.

Perhaps, when the boat is righted, there will be a continuum again, with high, low and everything in between, with no artificial boundaries between them.

In the long run, this is good news for those looking for such things. In the short run, it will do very little to help the foundering symphony orchestras of America’s midlevel cities.

Because when the hunger for complexity is filled in the future, there is no reason for it to fill the vessels of the past. Art music will not be symphonic music. It may very well be electronic. Poetry needn’t be sonnets; novels needn’t be narrative; visual arts will almost certainly not be oil paint.

We are already seeing certain manifestations of the hunger for less trivial art: the independent film movement, for instance, and the re-emerging popularity of poetry.

Regional theater has burgeoned in recent years, bringing higher-grade drama to parts of the nation that used to satisfy themselves with an annual Brigadoon put on by the high school.

Although television is usually the villain in this story, the boob tube increasingly offers us entertainment of complexity and depth. You cannot throw pies at The Sopranos — television has rarely offered something as rich. And when it comes to complexity, contemporary relevance and universal characters, you have The Simpsons, one of the best shows in the history of the medium. It includes both ends of the spectrum in one: low comedy and intellectual vigor.

Just like Shakespeare.


It’s time once again for men the size of wooly rhinos to face off in the annual blood sacrifice of our national religion. And, as the betting lines are straightened out, perhaps it might be a good idea to look at what racing handicappers call “past performance.”

It isn’t a perfect tool, but how a horse or jockey has done in the past is one good indication of how things might go in the future. The Super Bowl now has a long history to draw from, and if one thing can be said, it is that a team that was part of the original NFL has a better chance of winning than a later expansion team or a team from the original AFL.

Those of you old enough to remember the very first Super Bowl — then called the AFL-NFL World Championship Game — probably remember a certain swagger to the Green Bay Packers. They dispatched American Football League teams easily the first two years: Kansas City in 1967 (35-10) and Oakland in ’68 (33-14).

And the attitude of any NFL fan going into the third year, when Joe Namath and the New York Jets upset the Baltimore Colts 16-7, was that the upstart AFL just wasn’t up to snuff. It couldn’t hold its own against the “big boys” of the real professional football league, the NFL.

Now it’s 47 years down the road and, strangely enough, the cumulative results of the contest have shown those old fans were right. Teams from the original NFL have trounced the newer teams soundly. The numbers are hard to dispute.

We’re counting just the NFL franchises that were in existence before the first Super Bowl took place. That includes the Dallas Cowboys, Philadelphia Eagles, Washington Redskins, Cleveland Browns, New York Giants, St. Louis Cardinals and Pittsburgh Steelers of the Eastern Conference, and the Colts, Packers, Los Angeles Rams, San Francisco 49ers, Chicago Bears, Detroit Lions and Minnesota Vikings of the Western Conference.

Several teams switched cities. Three — Steelers, Colts and Browns — joined the old AFL teams in the new AFC when the leagues merged for the 1970 season.

None of that matters.

Over the game’s 46-year history, teams from the original NFL have won the big game more than twice as often as the upstarts. That’s 31 wins for original NFL teams, 15 for all the others.

Furthermore, because of conference switching, the Super Bowl has pitted two original teams against each other in 10 of those games. How many times have the AFL and expansion teams faced each other? Three.

There are 10 teams with five or more appearances, then three more tied at four appearances. Of those 13 teams, eight are original NFL teams. Of those, only the Vikings have a losing record.

Of the old AFL teams with four or more Super Bowl appearances, only the Raiders have a winning record.

Of the 10 teams with only a single Super Bowl appearance, only one comes from the core group of original teams — the Chicago/St. Louis/Phoenix/Arizona Cardinals.

This year, the outlook is clouded: San Francisco is an original NFL team, but the Ravens are from an original NFL city. So, in a way, no matter who wins, the old AFL cities will be left out in the cold once more. Let’s ring up a cheer for Bronko Nagurski, Johnny Unitas, Jim Brown and Vince Lombardi.


This is a parable about beauty in art:

A child is trapped in a burning building; a fireman pulls her out, risking his own life in the process. He is hailed in newspapers as a hero.

And so he is. But did he enter the fire because he wanted to become a hero? Or did he want to save a threatened child?

His motivation tells us a great deal about his quality as a human being.

Most of us would agree that the purpose of a heroic rescue is not the heroism, but the saving of a life. Heroism may result, but it is not the reason the action was taken. If it were, we would suspect the hero of being shallow and self-involved.

It is much the same with beauty. It is no more the goal of art than heroism is the goal of the rescue.

And those many lesser artists who create art with only beauty in mind will come up with shallow results.

Because the authentic goal of art is more complex, more difficult and as a result, more lasting. It is no less than the creation of reality — or at least our ability to comprehend our little corner of it, like trying to palm flat the starched corner of a picnic cloth in a tornado.

It is the recognition that simple language can never effectively encompass the contradictory layers, complexities and confusions of experience’s windstorm and that such things can only be approached through metaphor: We cannot say what it is, we can only take stabs at what it is like.

In this sense, the true artist endeavors not to be pretty, but to be true. And if he manages to engage truth, the result will appear beautiful, if not to us today, then to posterity.

So when we think of beauty contests or the colorful landscapes of calendar art, we naturally think of beauty as something trivial and frivolous. And when we look at yet another American Impressionist oil-daub of a well-to-do young debutante in a white dress and parasol standing on a sunny turn-of-the-century lawn, we cannot but recognize that it has more in common with Robin Leach than it has with Rembrandt’s Hundred Guilder Print, Euripides’ Medea or Bach’s B-minor Mass.

Hundred Guilder Print

Conversely, when we discuss what is most profound in the world, it always seems to possess great beauty.

Such beauty is a natural byproduct of the search for truth.

But, when we use the word “beauty,” we mean different things at different times.

The word can mean “pretty,” as when we talk about a pretty picture or a blush-cheeked cheerleader.

Pretty almost always means, at some level, “conventional.” Pretty is something most people can agree on. There is a predictability to it, ultimately, a monotony.

And when a man says a woman is beautiful, he is almost always making a judgment that is not esthetic, but hormonal.

Beauty also refers to that which we find peaceful and productive of pleasant feelings. A lot of people find burbling brooks and meditative flower arrangements beautiful in this sense. It is also the genesis of New Age music and George Winston.

sunset

Then, there is beauty that is the precise and craftsmanly use of materials. Artists recognize this when they see the delightful line of Picasso’s drawings or the expressive and tight control of a Dürer wood engraving.

Artists probably recognize this beauty more than most, but they are not alone. The wide popular appreciation of Salvador Dali owes greatly to the public perception of him as a magician of craftsmanship. The same is true of Andrew Wyeth.

Fourthly, beauty is sometimes reserved for the subject matter of a piece of art, where the audience pays little attention to the craftsmanship, but feels warm and fuzzy about the subject.

People respond to the religious sentiment rather than the art in most popular biblical art. Likewise, they respond to the scenery in an Ansel Adams photograph.

But all these are essentially lower orders of beauty. And much that is meretricious is deemed beautiful for a while, if it reinforces the prejudices of the age. What Victorians called beautiful, we largely snicker at now. But we shouldn’t be so smug, for our great grandchildren will undoubtedly giggle at our art and mores in just the same way.

The real thing doesn’t founder on the prejudices of the day. That is why we call it “classic.” That is why the best Rembrandt or Eakins speaks across the years and continues to move us.

The beauty of the art we recognize as great has a handful of special qualities that probably make it unsuitable for practical things, like magazine illustrations or political propaganda.

Ambiguity is one such quality. All great beauty is at some level equivocal: It can take no partisan stand because it holds all options open. It manages to allow of multiple, even contradictory interpretations because it is “large and contains multitudes.” Profound beauty makes no moral judgments, at least not on the rote level of dogma.

Real beauty must be ambiguous: like the smile of the Mona Lisa or the meaning of the white whale in Moby Dick.

That ambiguity is inextricably bound up with these other qualities of great beauty:

Metaphor — Not only what something is, but what else it is. A landscape is never only a piece of pretty scenery, but a metaphor for the structure of the human mind, or a metaphor for a nude torso, or a metaphor for the fecundity of nature. A still life usually tells us about death and transience. A portrait tells us about suffering and humanity. Moby Dick is a metaphor; so is the Creation of Adam on the Sistine ceiling.

Complexity — Not necessarily complication. Sometimes, a very simple gesture, in relation to the rest of the work, can be very complex. The complexity may be of character in a play, of metaphor in a painting or of a key relationship in a symphony.

But if a work of art is to mirror life in any meaningful way, it must reflect the complexity — even contradiction — of existence.

Memorability — The quality we say “haunts” us. In great art, we may often find ourselves saying: “I don’t know what it means, but I can’t get it out of my head.” The gift to create what is memorable in this sense is what we mean when we say an artist is “touched by the muse.”

Surprise — Great beauty is always fresh and it does unpredictable things. Why is there a little girl running through Rembrandt’s Night Watch? Why does Beethoven’s Fifth Symphony keep stopping?

Inevitability — But that surprise always forms part of the larger picture and we feel by the time we come to the end that it couldn’t be any other way. There is a “rightness” to every gesture, every musical modulation, every theatrical act.

Wholeness of form — This is almost mystical. The form that I’m talking about and the wholeness we recognize are almost certainly archetypes that are hidden deep in the psyche and the wholeness cannot be prescribed, but only recognized.

It is what makes us recognize when a play is over: The form is complete and whole. The same works in great paintings or music.

Ultimately, what you get from beauty is not a feeling of fuzzy complacency, but of transcendence, a feeling that there is something larger and more meaningful than the everyday, the quotidian. That life is part of some larger pattern, not necessarily religious — that within that pattern you share something with the Zulus in South Africa just as with your brother in Pittsburgh.

Or that there is a great mystery at the center of the universe, whether it is scientific or Hindu. That the simple everyday reality is somehow ennobled, though it may not include any concept of nobility.

And in the end, in a world of suffering, chaos and meaninglessness, the art, like the fireman, rescues us.


George and Barry

In America, anyone can grow up to become president. At least, that’s what we were taught in school.

The reality is a little more complicated. It turns out that if you want the office, it really, really helps to be related to someone who has already held the office.

Even Barack Obama is 11th cousin to both the Georges Bush, according to noted genealogist Gary Boyd Roberts, who has studied such things and written the book Ancestors of American Presidents.

Certainly, 11th cousin is a little too distant to be meaningful — at that distance, an amazing number of people are related to everyone. It turns out that if you carry it all back far enough, every American president except Rutherford B. Hayes is ultimately descended from Alfred the Great. But then, you might be, too. I don’t know why Hayes misses out.

But we don’t have to go that deep into genealogical minutiae to realize that having a relative in the Oval Office helps: Four presidents are father-son (the Bushes and the Adamses), and two more are grandfather-grandson (William Henry and Benjamin Harrison). At that level of connectedness alone, just under 14 percent of our chief executives are related to other chief executives. That is already statistically significant.

If you add the Roosevelts, that number jumps up to just under 20 percent. Nearly one in five of our presidents is closely related to another president.

But it doesn’t end there. James Madison was second cousin to Zachary Taylor. At the third-cousin level, Martin Van Buren claims the Roosevelts, John Adams adds Calvin Coolidge.

That raises the percentage to just a hair under 30 percent.

Beyond that, at the fourth-cousin level, you have to include John Tyler and the Harrisons, Ulysses Grant and the Roosevelts (and Van Buren), Zachary Taylor and the Roosevelts, James Garfield and the Bushes, Franklin Pierce and Herbert Hoover, Millard Fillmore and the Adamses, Calvin Coolidge and the Adamses, William Howard Taft and the Adamses. Everyone seems to be related to the Adamses. The presidency is in some respect an Adams Family thing.

If we count all the fourth cousins, that means that 43 percent of American presidents are related to other presidents.
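The running tally can be checked with a few lines of arithmetic. This is only a sketch using the essay’s own counts; the denominator of 44 presidencies (through Obama, with Cleveland counted twice) is an assumption about how the figures were computed.

```python
# Verifying the essay's running percentages from its own counts.
# Assumes a denominator of 44 presidencies through Obama.
TOTAL = 44

related = 4 + 2   # two father-son pairs (Adamses, Bushes) plus the two Harrisons
print(f"{related / TOTAL:.1%}")   # just under 14 percent

related += 2      # add Theodore and Franklin Roosevelt
print(f"{related / TOTAL:.1%}")   # nearly one in five

related += 4      # second/third cousins: Madison, Taylor, Van Buren, Coolidge
print(f"{related / TOTAL:.1%}")   # the essay's "hair under 30 percent"

related += 7      # new fourth cousins: Tyler, Grant, Garfield, Pierce,
                  # Hoover, Fillmore, Taft
print(f"{related / TOTAL:.1%}")   # 43 percent
```

The last figure, 19 of 44, does round to the 43 percent the essay cites.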

The myth tells us that the common man is king in America; the reality is that America has its own sort of aristocracy: Those in power tend to be born of those who held power before them. Kind of like a legacy admission to Yale or Harvard.

Beyond fourth cousin, the connections become admittedly more tenuous, but there are other notable relationships that might be mentioned: Hoover and Richard Nixon, both Quakers and distantly related; Nixon and Jimmy Carter; Nixon and Gerald Ford.

Ford is an instructive case: The simplest and folksiest of recent presidents is in some ways the best connected. He is distantly related to Franklin Roosevelt, Abraham Lincoln, Millard Fillmore, Garfield, Hayes, Taft and Coolidge. He is also related to George W. Bush.

George W. Bush, in turn, is related to Pierce, Garfield, Hoover, Fillmore, Taft, Grant and Nixon — and, of course, George Herbert Walker Bush — not exactly the most illustrious line of presidents, and several who have been considered at the bottom of the barrel.

Admittedly, many of these more distant relationships are slender, at best. But if you count everyone down to 10th cousin among this presidential inbreeding, you come up with the astonishing figure that 61 percent of our presidents are related to other presidents.

And that doesn’t count such things as: George Washington’s half aunt married James Madison’s half first-cousin, once removed, or that Woodrow Wilson’s second wife was the great-great-great-niece of Thomas Jefferson.

So, it’s not just who you know, but who you’re related to.


America is a nation of tax whiners.

It is one of our least attractive features. I understand complaining about tax money ill spent; I understand fretting over taxes spent on programs we disagree with. In such cases, one should petition to reform the wasteful spending or campaign for representatives who will repeal the programs. But complaining merely about taxes seems entirely beside the point.

After all, the very people who most whine about taxes are the same people who scream at the top of their lungs about American exceptionalism: “We’re Number One!”

But, if you live in a country club, you have to pay the dues.

Whine, whine, whine.

It is our unofficial national anthem.

We were founded on the principle of complaining about taxes, and the whining has never ceased, even though Americans pay less in tax than citizens of most other developed countries.

According to the Organization for Economic Cooperation and Development, the only developed nations that have less of a tax burden than the United States are Australia, Japan, Korea and Mexico.

Total tax revenue in the United States is a shade more than 29 percent of the Gross Domestic Product, 25 down the list from the world tax leader, Sweden, which pays more than 50 percent of its GDP in tax.

We are beaten out by all of Europe. The median for Europe on the list is about 35 percent of GDP, and the average is above 40 percent.

Many, of course, would say that Americans pay less tax precisely because of their chronic whining — to which we also owe our prosperity and our freedom. Whether you agree, it’s likely that seldom before in history has there been a people who expected so much — in terms of government service — for so little. And in recent years, America’s traditional anti-tax sentiment has increasingly blended into our resurgent demonization of government in general.

Today you cannot turn on a television newscast without hearing a politician or a protester complain that American taxes are unconscionable.

“Taxes are too high and government is charging more than it needs,” said President George W. Bush in his budget speech to the joint session of Congress. “The people of America have been overcharged.”

His answer at a time of two unfunded wars: tax cuts. Whoopee!

This has always been gospel in America. The fighting cry for independence in the 18th century was, “No taxation without representation,” although the protest often seemed more against taxation of any kind.

In 1776, in fact, the American colonists paid less per capita in taxes to the crown than mainland English citizens did. And they paid five times more tax in 1698 than they did in 1773, the year of the Boston Tea Party.

It is ironic that the most famous act of tax rebellion in our history actually protested the elimination of a tax. How many of our current Tea Party activists know that?

The colonists had paid a tax on tea for years, but in 1773 the British Parliament allowed the British-owned East India Co. to sell its tea in the colonies tax free, making its tea cheaper than the American-imported product and essentially creating a tea monopoly.

There were other taxes that colonists found intolerable even when the amount of money collected was nominal. The Sugar Act of 1764 and the Stamp Act of 1765 added fuel to the pyre of anti-tax sentiment.

The true call, it seems, was then, as now, for representation without taxation.

But even after independence, when taxation came with representation, the first serious threat to the new nation came in the form of a tax revolt — the Whiskey Rebellion of 1794, in which farmers of western Pennsylvania rioted against excise tax collectors. President George Washington had to lead the Army one last time to quell the revolt.

Four years later, when Congress enacted the Federal Property Tax to pay for the expansion of the military in anticipation of a feared war with France, John Fries began what is known as the Fries Rebellion in opposition to the tax. Fries was tried and convicted of treason, though he was pardoned by President John Adams in 1800, not long before Adams left office.

The first American income tax was floated soon after to pay for the War of 1812, but the war ended before any tax money was collected, so it died a-borning.

It was reanimated during the Civil War; an income tax was collected from 1862 to 1872 although even then tax rebellion was afoot in the form of widespread tax evasion.

More to the point, there is an undercurrent of American historical thought that believes taxes were the primary cause of the Civil War.

Abraham Lincoln had promised the South that if elected he would not interfere with slavery. But he also promised in his inaugural address that he would enforce the collection of excise taxes even if the South attempted to secede. Those taxes were highly unpopular in the South as they favored Northern industry.

An income tax was tried again in 1893 under President Grover Cleveland. The primary income of the federal government had always been tariffs on the import of foreign goods, but Cleveland ran on the platform of reducing tariffs, which had restricted free trade. To make up for the lost revenue, he asked for an income tax on corporate earnings. The following year, Congress passed such a tax, expanded to include personal income.

The Supreme Court would have none of it and struck down the tax as unconstitutional.

The issue was Section 9 of Article I of the Constitution, which said, “No capitation, or other direct, Tax shall be laid, unless in proportion to the census or enumeration herein before directed to be taken.”

Which meant that the collection of direct taxes — as opposed to indirect taxes such as sales taxes — must be made in proportion to the populations of the various states. This made a simple income tax nearly impossible.

But as populist sentiment arose at the turn of the century, many saw an income tax as a way of getting money from the rich.

Theodore Roosevelt advocated a graduated tax on inheritance in 1906. In 1908, he called for Congress to enact a progressive income tax.

But the Constitution still stood in the way.

So, an amendment was proposed, which came into law in 1913 as the 16th Amendment, authorizing an unapportioned income tax. By the way, the Senate voted for the amendment 77 to 0, and the House of Representatives followed, voting 318 to 14. It was hardly a squeaker.

The first income tax under the new amendment gave a $4,000 exemption to families (perhaps equivalent to a $40,000 exemption today) and then charged 1 percent on the first $20,000 above that, 2 percent at $50,000 and a maximum rate of 7 percent on incomes above $500,000.
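As a sketch of how that first schedule worked: the essay names the $4,000 exemption and the 1, 2 and 7 percent steps; the intermediate brackets below (3 through 6 percent) are filled in from the historical 1913 schedule and should be treated as an assumption.

```python
# Sketch of the 1913 income tax arithmetic described above.
# The 1%, 2% and 7% steps come from the text; the middle brackets
# are assumed from the historical schedule.
BRACKETS = [
    (20_000, 0.01),
    (50_000, 0.02),
    (75_000, 0.03),
    (100_000, 0.04),
    (250_000, 0.05),
    (500_000, 0.06),
    (float("inf"), 0.07),
]

def tax_1913(income, exemption=4_000):
    """Progressive tax owed on income above the family exemption."""
    taxable = max(0, income - exemption)
    owed, lower = 0.0, 0
    for upper, rate in BRACKETS:
        if taxable <= lower:
            break
        owed += (min(taxable, upper) - lower) * rate
        lower = upper
    return owed

# A $25,000 family income: $21,000 taxable -> 1% of the first $20,000
# plus 2% of the remaining $1,000 = $220.
print(tax_1913(25_000))
```

Even a half-million-dollar income owed only about 5 percent of its total under these rates, a reminder of how modest the first schedule was.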

World War II made the big difference, spreading the tax burden into the middle class. Before the war, about 15 percent of the people paid all of the income tax. After the war, 80 percent paid it.

That is when the federal government first started payroll withholding. During the 1930s, federal individual income taxes never topped 1.4 percent of the Gross National Product. During 1990, that number was 8.77 percent.

The income tax remains the single most contentious tax we pay. And it is the center of most of modern whining.

“When the 16th Amendment became law in 1913,” wrote Robert Ringer in his book Restoring the American Dream, “an important step was taken in laying the groundwork for the destruction of the spirit that had made America the freest, strongest and most prosperous country in history.”

It might be noted that it wasn’t until after the income tax that America, in fact, rose above the level of a Third World nation and became the strongest, most prosperous country in the world.

But the complaints continue.

There was a local tax revolt in Chicago in the 1930s during the height of the Depression and another in California in the 1970s.

The latter revolt still reverberates today, culminating most recently in the Taxpayers Bill of Rights passed overwhelmingly by Congress and signed into law by President Clinton in 1998.

But there was an edge to that 1970s movement, championed by Howard Jarvis, among others, that began to question not just tax but the legitimacy of government.

It resulted in the passage of Proposition 13 in 1978, which limited the state’s ability to increase property taxes. Jarvis was an unlikely revolutionary; he looked more like a jowly retiree bearing photographs of his grandchildren, but he had a mission and a message:

“Tax, tax, tax, spend, spend, spend; elect and elect and elect, is bankrupting we the American people and the time has come to stop it.”

Jarvis

Implicit in his message was a growing mistrust of government in general.

“Proposition 13 in California was an assault not simply on taxes but on government as we know it,” tax historian Elliott Brownlee has said. “It was really the beginning of an anti-government crusade that has continued.”

More extreme elements of this sentiment thrive all over the Internet, in scores of screed-filled Web sites about the evils of tax, government and a one-world conspiracy. One describes taxes as the “economic rape of America.”

“Tax is theft,” it says, “legalized robbery, crime” — raising the question of how something legal can be a crime. Tax is called parasitism, cannibalism, cancer and, alternately, a Mafia protection racket.

Such ranting is the equivalent, amplified and larded with aggressive hype, of the pamphleteering of Tom Paine and others more than 200 years ago. Appealing to the emotions and an unrefined sense of personal freedom, with little sense of practical reality or the interconnectedness of society, they are the screams of our national id.

The founding fathers, it could be said, created the Constitution as a kind of superego to that id, to help Adam Smith’s famous “invisible hand” bring collective benefit out of the selfishness of the individual.

Certainly, as April 15 spins around each year, we all grow anxious: No one likes paying. And if we could run our government on less money, we’d all breathe easier. But taxes, in and of themselves, are at the very least a necessary evil. America would hardly maintain itself if no one paid teachers or built roads. We should decide what we want from government and argue over that, rather than whine about having to pay anything at all.

Colbert painting

Jean Baptiste Colbert, finance minister to King Louis XIV of France in the 17th century, once famously said, “The art of taxation consists in so plucking the goose as to obtain the largest possible amount of feathers with the smallest possible amount of hissing.”

On this count, America may have the world’s lowest threshold of pain.

A look back

Tax receipts can be found among the oldest artifacts of human civilization. Wrapped in a pottery ball from the fourth millennium B.C. discovered in the Near East are the records of a tax having been paid.

The earliest taxes, though, probably came in the form of work extracted. People would be required to work for the state for a given period of time each year. They provided the labor to build roads or pyramids or fill the ranks of armies during war. The military draft was a late remnant of such taxation.

Before money, when tax was exacted, it came in the form of crops and cattle.

In ancient China, one fifth of a farmer’s crops was taken as a “flat rate” tax. A poem from the Chou Dynasty complained about big government: “Big Rat, Big Rat, do not gobble your millet.”

But by the time of the Roman Empire, tax was often monetary. Under Julius Caesar, for instance, a 1 percent sales tax was introduced. And at an early date, a 5 percent inheritance tax was created — later raised to 10 percent — although applied on only what was left after bequests to wife and children.

Roman taxes at first relied on “tax farming” — that is, hiring private enterprise to collect the taxes. These were the publicans mentioned in the Bible, who grew so corrupt that Caesar Augustus outlawed the practice, putting civil servants in charge of gathering the money.

The first income tax was created in 1799 in England to raise money to fight Napoleon. It was repealed in 1816.

In the United States, the first income tax came in 1862 to help underwrite the Civil War, 50 years after an aborted attempt to help finance the War of 1812.

In one of those periodically surreal pronouncements from Washington, D.C., the tax commissioner said, “The people of this country have accepted it with cheerfulness.”

A more realistic assessment of how happy people were can be found in 1870 — the year of highest compliance with that first income tax — when, in a nation of 38 million people, only 276,000 filed returns.

Papa E and bull

Earl Thaddeus Steele was a man who could think sideways. In fact, he invented the art.

He was my wife’s grandfather, and he was never completely socialized. He kept rattlesnakes under the house. He fished and hunted for a living, and although he sometimes held a job, if he thought the fish were biting, he left the job site for the fishing pond without a single wince of conscience.

He also believed that traffic signs and stoplights were there to ”instruct those who didn’t know how to drive.” Because he knew what he was doing, he ignored them.

But it is for his inventive method of problem solving that I remember him here. Thad Steele never met a problem he couldn’t solve, and in a way no one else might have thought of.

When the corner post of his back porch cracked, and the porch roof sagged on one side, he lassoed a nearby sapling and pulled it down under the porch eave, using the natural springiness of the tree to keep the roof from falling.

”It was still that way the last time I saw it,” my wife says.

Once, when a neighbor used a shortcut across his lawn, eventually wearing a path in the grass, Thad Steele knew how to stop her. He never would have confronted her directly; that wasn’t his style.

No, he dug a person-size hole, hip deep, in the center of the path, mixed up a load of pudding-quality mud to fill it, used carpet tacks to secure a cloth across the opening and sprinkled it with dirt and grass as camouflage.

The next day, the woman stepped into the hole and sank up to her waist in brown goo. She stopped shortcutting across his yard.

I especially like the touch of the carpet tacks.

There are many examples of Thad Steele’s peculiar approach to life.

His friend Hub Hawkins had a stutter. Thad had said many times that ”if Hub were frightened enough, he’d talk as plain as the next man.” One day, while Thad and Hub’s cousin Dewey were sitting on the front porch, they saw Hub walking toward them on the railroad tracks.

Thad leveled the pistol he always carried — as all real men in Madison, N.C., did in those days — and fired several times at Hub’s feet, making him dance. Hub yelled back, ”D-D-Dod d-damn you, D-Dad D-D-Deele!”

That was one solution that didn’t take.

But sideways thinking often did work. Thad Steele kept five blue Dodges in his backyard so he always had parts for the blue Dodge he drove.

When his wife and her sister argued violently over who owned a beautiful pitcher and bowl that had belonged to their mother, Thad Steele took the crockery down to the creek and broke them on the rocks. It stopped the squabble.

”He pierced straight through to the reality, to the heart of the matter,” my wife says. ”He was not about social acceptance. He thought it silly.”

She remembers once, when she was about five years old, her grandfather was asked to baby-sit her. For Thad Steele, baby-sitting meant the child adapting to him, not the other way around.

And so, because he had intended to go hunting that day, the child went hunting with him.

He had seen a bunch of flying squirrels in the woods near the house, and so the three of them — Thad Steele, his gun and the five-year-old girl — went out shooting.

After he had shot three of them, he wrapped them in a blanket and handed them to the child, telling her, ”Take care of them like babies.”

There are perhaps more appropriate ways to deal with a child, but the old man knew who he was dealing with.

”He could have said, ‘Watch these while I go down the path looking for more,’ but he didn’t. He turned them into baby dolls for me. He understood me and made me feel like a million dollars. I was their mother because of what he said to me.”

My wife has learned her lessons from Thad Steele well. She is also a master of sideways thinking. She is an artist, after all.

When she was young and poor, recently divorced, she couldn’t afford her automobile inspection, so she painted the inspection sticker on the windshield of the car and used it for the full year.

As an adult and an art teacher in Arizona, she was given an art room with carpet – not the best kind of floor for a room full of second-graders with jars of tempera paint. And when the inevitable spill happened and she couldn’t scrub out the stain, she realized: ”I’m an art teacher; I can mix that color.”

So she matched the color of the original carpet and painted over the stain.

Another time, in another school, after a month of working with clay, dried gray powder was ground into the tile floor. When she asked the janitor to mop the floor so that clouds of clay dust wouldn’t choke the kids, the janitor replied, ”Mopping is not in my job description,” and ignored her.

So, as a master of sideways thinking, she and the kids filled up a dozen or so 10-gallon buckets with tap water and then, in one grand cascade, poured it all out onto the floor. She sent one of the kids to the janitor with the message: ”Help! The room’s flooded!” And the janitor came and mopped up the mess.

”The floor was the cleanest it had ever been,” she notes.

Sideways thinking helps get things done when ordinary thinking is stymied.

I recommend it to Congress.