Removing Statues and Planting Trees: Charlottesville and Beyond

This is not an essay denouncing the violence in Charlottesville, or our president’s disgraceful response to it. Those things are self-evident and, to the extent they’re not, others have spoken about them with more eloquence and insight than I could. I want to talk instead about why Charlottesville happened, why it’s likely to happen again and how we might come through this struggle in a better place than where we began.

First, a little background.

This country has never truly reckoned with the consequences of the Civil War and its aftermath—especially, I think, the aftermath, the true, horrid scale of which I believe remains largely unknown to white Americans; it certainly was to me for most of my life. Jim Crow was more than just laws that kept black people away from the ballot box, a disgraceful enough thing in itself. It was a collusion of culture, politics and the legal system to recreate slavery in all but name, exploiting blacks for their labor while depriving them of everything they were entitled to as newly recognized American citizens. This was done not only through “literacy tests,” grandfather clauses and whites-only drinking fountains but through the convict lease system, consciously racist zoning laws and, of course, vigilante terror and murder, to name just the most obvious things.

The Robert E. Lee statue that was the ostensible cause of all this is a relic of this system. Most of the Civil War statuary dotting courthouses and public squares in Southern towns was not made during the war or in its immediate wake. These figures were erected from the 1900s through the 1920s as totems of Jim Crow, meant to remind both blacks and whites of the level each group occupied (low and high, respectively) on the social pyramid. They are subtle instruments of terror wrought in bronze.

As these monuments began to spring up, there came with them what we now know as the myth of the Lost Cause, the revisionist belief that the Civil War was brought about not by slaveholders determined to protect their human property, but by an overzealous, self-righteous and hypocritical Northern government bent on overrunning the states’ rights guaranteed in the Constitution in order to clear the path for a smothering, all-encompassing Federal authority. The Confederate States of America, a would-be oligarchy founded on racial and religious bigotry and with the explicit goal of expanding slavery throughout the Americas, was recast as a noble-but-doomed final stand in the defense of a now-vanished American ideal.

This historical lie, aided and abetted by whites in the North as much as the South, is why the Stars and Bars remain for many Americans (who often do not consider themselves racist to the slightest degree) a symbol of proud American defiance and self-determination, rather than an emblem of hatred on par with the Nazi swastika. It explains why people wearing the insignia of a defeated insurrectionist movement that sought to leave the Union outright now regard themselves, without irony, as the true patriots and lovers of America. By extension, statues of Lee, Forrest, Longstreet et al., gilded in the patina of the Lost Cause, serve a similar purpose in valorizing this deracinated Confederacy. Civil War statues and the Confederate battle flag are not racist, we are told—they’re our heritage, and no one should be made to feel ashamed of his heritage.

To be sure, the Klan-saluting, torch-wielding crowds protesting in Charlottesville seemed very much motivated by racism, however much they speciously invoked their “heritage.” But these monuments haven’t endured this long solely because of the small minority of Americans who think and act like the demonstrators at Charlottesville. They endure because as a nation we have failed to honestly confront what they stand for—because after all this time, we refuse to agree on what the Civil War meant, or engage with what its post-Reconstruction aftermath set out to achieve.

Until now.

Remember that what prompted this whole appalling display was that the Robert E. Lee statue in question was set to be removed from its privileged place, because the leaders of Charlottesville could no longer countenance the insidious ideals that had led to its creation. The neo-Confederates protesting this were lashing out in fear, and their fear is fully justified. The lie upon which they have constructed their pride and sense of worth is being challenged, not just by far-off liberal interlopers but by their own friends and neighbors. People both in the South and beyond it are seeing the myth of the Lost Cause as just that—a fable designed to protect the privileged by obscuring the brutality on which that privilege was founded.

After more than 100 years, the cracks are indisputably beginning to show. New Orleans removed a statue of Lee from public view a few months ago. The Republican governor of Maryland just announced his intention to remove a statue of the Supreme Court justice who wrote the Dred Scott decision. The mayor of Lexington, Kentucky announced—ahead of schedule, spurred on by the violence in Charlottesville—that he intended to have two Confederate statues relocated from the former city courthouse, now a visitor’s center. More cities and states are sure to follow.

Expect more outbreaks like Charlottesville as the weight of public opinion turns slowly, inexorably against these background furnishings of Jim Crow. It will undoubtedly get worse before it gets better, and it’s perfectly appropriate to be outraged and sickened at these spasms of racist violence, and at the president who responds to them with cowardly equivocation. But the fact remains that this conversation has been long overdue, and having it was never going to be easy. Do you remember when well-meaning politicians used to talk wistfully about the nation “having a dialogue” about race in America? This is what that looks like.

There is a saying among the Chinese: “The best time to plant a tree was 20 years ago. The second-best time is now.” It is tempting—and futile—to wonder what might have been had Reconstruction continued as Lincoln had envisioned it, and Jim Crow and the Lost Cause had shriveled and died before ever having the chance to bloom. But it is not 120 years ago or 20 years ago, and today we are tasked with planting trees in some very hard, stubborn earth. The work will be long and brutal and not all of us may live to see it bear fruit. Yet when it is done—when those seeds have finally taken root and resisted all efforts to pull them free—we can look back at our efforts and our sacrifices and know that they were given in pursuit of a worthy goal, one which, once achieved, will leave our nation better than it was before.

Originally published on Medium.

24 Cigarettes and One Pipe: Hammett and Chandler

When I was a writing student in college, I came across a how-to manual called The Essence of Fiction, by Malcolm McConnell. It was not like most other writing books I had read before or have read since. My professor, to whom I showed it, was mildly appalled at its strict focus on the mechanics of story construction, and indeed, The Essence of Fiction has no clever exercises à la John Gardner’s The Art of Fiction, nor does it inspire you to live a life devoted to creativity à la Natalie Goldberg’s excellent Wild Mind. Essence is plain and direct and even, to my old teacher’s point, rather crude, but one of its precepts has stuck with me over the years: the rule against “cigarette action.”

Cigarette action is McConnell’s term for the meaningless physical business a writer will assign a character in order to pace a scene. When writing a dialogue scene, you can’t simply follow one speech with another and then another: it gets fatiguing to read, and the scene gradually loses its sense of place, its physicality. (Not that that stopped Elmore Leonard.) So writers solve this by having their characters do … something. Get up and look out the window. Check themselves out in the mirror. Change positions on the couch. And, of course, light cigarettes.

Did You Ever Have to Remake Up Your Mind?

Or, How to Convert an Atheist in Seven Extremely Difficult Steps

Faith, defined a little too simply, is a belief one holds without evidence. Perhaps that definition sounds somewhat derogatory or appears to contain an implied rebuke. But people of all stripes have beliefs they cling to for no intellectually defensible reason, whether they be common superstitions (“Crime is more prevalent during the full moon” — it isn’t), personal idiosyncrasies (“Something good always happens to me when I wear my lucky sweater”) or even moral or philosophical precepts (“If I make a point of being trusting and kind, others will be encouraged to follow my example”). Most beliefs of this sort are quite harmless, a few are beneficial and the rest are a small price to pay for the freedom to be occasionally irrational. I think it would be a terribly dull world if everyone had a solid empirical basis for everything they did. Besides, I’d probably have to stop buying lottery tickets, and I like having something to fuel my daydreams.

The snag is that a belief held without evidence is also extremely resistant to change. Christopher Hitchens once said that anything that is claimed without evidence can be dismissed without evidence. That’s an intellectually justifiable position, but not a very satisfying one, at least not if you find yourself wrangling with someone whose judgment you otherwise respect about an issue you can’t agree on. Faith beliefs are felt in the gut; they accord with our sense of how the world operates and are the result of influences we are mostly unaware of, from our parents and families to the media messages we’re exposed to every day. Though I defend recreational irrationality, I don’t hold it as justification for never changing your mind. Resistance to evidence is usually rooted in fear: fear of admitting you may be wrong and feeling stupid, fear of having your worldview attacked, fear of having to start at square one in determining just what it is you believe. This kind of fear is unhealthy and ought to be stood up to, at least once in a while. So occasionally I undertake the mental exercise of determining what it would take to change my mind on an issue I care deeply about. Today’s issue: religion.

I am an atheist, and I am an atheist of a particular stripe: I do not believe in a god or gods. That is not the same as saying “there is no god.” The latter is a statement about the nature of reality, the former about one’s own knowledge and the limits thereof; another way of saying it might be “I have seen no evidence of a god.” This distinction is sometimes called “soft atheism” versus “hard atheism” (neither of which is to be confused with agnosticism, an oft-misused word for the position that god’s existence or non-existence cannot be known by human means). In practical terms, there is not much daylight between the two positions, and holders of either belief/nonbelief would be indistinguishable in how they lived their lives. The only difference is that one has come to a conclusion and the other hasn’t. In the spirit of jiggling a knife into that small chink in the armor of certainty, and in keeping with Carl Sagan’s dictum that “extraordinary claims require extraordinary evidence,” here are the conditions I would require to renounce my atheism and adopt a belief in god.

They Live

You’re a drifter — down on your luck, roaming from town to town with a bedroll and a tool chest strapped to your back. Everywhere around you, other people seem to be getting the breaks — although, admittedly, many more seem to be just as up against it as you are. You find a job as a scab laborer on a construction site, and a squatters’ village that at least offers a hot meal and a place to sleep. Despite all this, you don’t let it get you down. You still believe firmly in the lessons you learned as a kid: that the world is fundamentally a fair place, that people will treat you well if you treat them well, and that working hard and playing by the rules will one day get you to a place of comfort and security; maybe not the mansion on the hill, but not the squatters’ camp either. America still works, you tell yourself, and that gives you the strength to pick yourself up and keep trying.

Then one day you put on a pair of sunglasses and see things you never saw before, and your world goes to shit.

John Carpenter’s They Live looked unflinchingly at the underside of Ronald Reagan’s Morning in America. While Gordon Gekko was rhapsodizing about the goodness of greed, migrant worker George Nada trawled through a stunted shadow economy that grew like a fungus on America’s underbelly. They Live presents an America that seems decent enough to justify George’s faith: the squatters’ camp where he finds shelter runs on compassion and good old American hard work, a true expression of the generosity we hold as one of our core values. The problem, as it turns out, is the ultimate viper in the garden: the elite feeding on America’s underclass are actually aliens in human form, hopscotching rapaciously across the galaxy like a cross between Gordon Gekko and Galactus. Even more heartbreaking is when George discovers why he was able to maintain his faith in the American dream while it fell apart around him. The aliens have submerged the culture in subliminal messages, with every surface blaring a mute clarion of stasis and conformity. Thanks to a pair of sunglasses invented by the revolutionaries fighting the aliens, George walks through L.A. and finally sees, in literal black and white, the new guiding principles of America. SLEEP 8 HOURS A DAY. MARRY AND REPRODUCE. WATCH T.V. STAY ASLEEP. CONFORM. OBEY.

What makes They Live resonate so much for me, a decade after I first saw it and well after it was first released, is what it reveals about paranoia and the comforts of conspiracy. While the film bears the trappings of a sci-fi-based horror movie, its central conceit — that American society is being undermined by alien invaders — is actually more comforting than frightening, because it supports the premise that people are too fundamentally decent to create the kind of society depicted in They Live. Suddenly, we didn’t do it — it was done to us. This preserves our ideas of our own goodness while offering a tantalizing promise of redemption. An alien menace is a menace that can be fought and destroyed; what came from outside can be sent back outside. Sure, defeating a technologically advanced alien race is not going to be a walk in the park. But if there’s one thing we know how to do as humans, it’s kill those who are different from us. Whether the solution proved to be sunglasses, computer viruses or red anti-alien virus powder, we’d find a way. If, however, the problem turns out to be us — if we, not alien invaders, made the world around us, with all its greed and its waste and its callousness — then we’re probably screwed.

Truth and Beauty: Tender Is the Night

While traveling in Spain I finally read Scott Fitzgerald’s Tender Is the Night. It seemed a nice “continental” choice for a trip to Europe.

I have a soft spot for Scott (whom I occasionally call by his first name). Raymond Chandler felt that Fitzgerald just missed being a great writer, and I can see his point: an awful lot of Fitzgerald’s work is either not quite formed (his first two novels, which honestly I have been so far unable to finish) or commercial and vaguely hacky (much of his short fiction, although many of his stories are beautiful and completely honest). Someone once said Fitzgerald is a writer best discovered when young, and as a no-longer-quite-young person, I think that’s true. He has a young person’s longing to be swept up and away, a young person’s ideals, a young person’s eagerness to admire — even to worship — and to mold himself in a beautiful and noble image.

Yet while I am no longer able to look at life quite as breathlessly as his characters do, I sympathize with, and even admire, their determination to live in a kind of refined and rarefied grace. I am nearly the age Fitzgerald was when he died, and I marvel at how strong his idealist streak remained through years that tried him severely. I can’t remember where I read it, but I recall he once described Tender Is the Night as a “testament of faith.” Partly it was simply faith in himself, in his ability to persevere while living with a mad wife, deepening debts and dwindling inspiration. And partly it was faith that the beautiful illusion was still worth cherishing, worth nurturing, worth bringing, however improbably, toward reality. Beauty is truth, as Keats said and Fitzgerald believed, and it’s no coincidence that a Keats verse inspired the novel’s title.

The iPad and the Dog that Didn’t Bark. (And the Dog that Barked too Soon.)

The product Apple revealed yesterday was largely what most people expected. Called the iPad (well, that name probably wasn’t expected), it is slim and elegant, engineered with meticulous care to do a few things well: deliver the internet, display movies and photographs, play music and serve as an electronic reading device. The latter capability was revealed about halfway through Steve Jobs’ launch presentation, not quite an afterthought but lacking the marquee position of an A-list feature. As Jobs remarked several years ago when dismissing Amazon’s Kindle, people don’t read anymore; certainly they don’t buy books the way they buy music, movies and TV shows. Perhaps this justified the middling prominence of the iBooks application and its accompanying online bookstore, which aims (like the Kindle) to do for reading what iTunes and the iPod have done for music. And perhaps that explains why one of the day’s most significant announcements was made as little more than an aside. “We are also,” said Jobs, not sounding very excited, “very excited about textbooks as well.”

Perhaps Jobs soft-pedaled this announcement because he knew it wasn’t a surprise at all. The night before the iPad launch, McGraw-Hill CEO Terry McGraw spilled many of Steve Jobs’ beans in an interview with CNBC, breezily confirming that Apple was announcing a tablet computer running the iPhone OS, for which McGraw-Hill was collaborating with Apple to provide educational content. It would not have been out of character for Jobs to cut McGraw-Hill from his presentation, assuming the company had ever been included — Jobs famously dropped graphics chip vendor ATI from a keynote after it revealed upcoming Mac models before he could announce them himself. And it prompts a mordant chuckle to imagine the look on Jobs’ face as he watched McGraw blithely steal his thunder. But I give Jobs the benefit of the doubt. It is likely that Apple’s negotiations with textbook publishers are still in progress, and that Apple will formally tout the iPad as an education tool at a later date. Because this arrangement is a very big deal — one that could potentially have a huge impact on both parties.