Normandy and Charlottesville

I’ve never been one for skydiving or red convertibles, the cliché signs of a midlife crisis. My confrontation with mortality tends instead to lead to a less dangerous bucket list, one that’s focused primarily on places I have yet to visit—the Grand Canyon, Stonehenge, northern Norway for a front-row seat at the aurora borealis.

I recently checked off one of the spots near the top of my list: Normandy. My father fought in World War II and arrived in France a few days after D-Day (thus significantly increasing the chances that I would exist to be writing this today). He never spoke much about the war, so I got a better idea about what battle is like from Steven Spielberg than from him. In 1945, when it was all over, he was just 25 years old, and already battered by more trauma than I hope to know in a lifetime. I wouldn’t want to relive that much, either.

He’s been gone for a while now, but I wish I could tell him what it was like to step onto Omaha Beach 73 years after he did. The most remarkable thing is that it looks like a beach—just a beach. I don’t know exactly what I was expecting—angels hovering over the water, perhaps, or a John Williams score blaring from the clouds. Instead, I found a surprisingly narrow strip of sand, with children playing on it.

There’s a monument, of course, Allied flags still waving from tall poles, and a modern sculpture that looks like wings emerging from the sand. But what moved me most was the expanse of the place. Beyond the sand, there are grassy fields, a road, storefronts, all on a flat stretch of land far from the hills upon which the Germans roosted on that fateful, fatal day. That openness, and the placidity of the place, gave me a shudder. The only phrase that came to mind was another cliché, both inadequate and inevitable: sitting ducks. When the hatches opened and the Allied soldiers emerged into hellfire, there was nowhere to hide.

But ducks don’t know they’re sitting. Those men knew exactly what was about to happen. And still they did it. They did it simply because it had to be done, because there was no living without freedom. They did it not for themselves—they all must have feared they’d be dead within minutes—but for us.

Later, we visited the American Cemetery, a beautiful, cliffside lawn where nearly 10,000 people are buried. It’s a sea of white crosses and Stars of David, all facing westward, toward home.

I walked slowly, as reverently as an atheist can, through the rows of graves. And I found myself lightly touching each marker I passed and murmuring, Thank you. Thank you for putting principle over personal safety. Thank you for standing up to tyranny. Thank you for making the world a better place for the rest of us.

We were on vacation, and—especially these days—vacation means an escape from the news. I didn’t watch any television that week, barely checked the headlines on the rare occasions when I bothered to log in to Wi-Fi. But I carried the news with me. I carried the trauma that has haunted me since November 2016. I carried the knowledge that the country I was visiting, the one that had capitulated to Hitler, had soundly rejected protofascism in its own recent election, while ours—the country that had saved it—had gone the other way. And I carried the memory of Charlottesville, where a woman had been killed fighting against the same kind of hatred that had started the war this hallowed ground commemorates. Just days before, fascists and white supremacists had marched through the streets of Charlottesville, torches in hand, proclaiming hatred for anyone who didn’t look like them. And, as I came to learn while I was across the sea, the president of the United States had refused to condemn them.

The markers in that cemetery are a tribute to the brave Americans who stood up to barbarians and sacrificed their lives for freedom. Those people did not die so that the president of the United States could pander to Nazis. They demonstrated bravery in the face of death, not cowardice in the face of poll numbers. If our current commander-in-chief set foot on that sacred spot overlooking Omaha Beach, the very ground would tremble in protest.



Is Historical Perspective Gone with the Wind?


When a friend of mine was halfway through her English Ph.D. program at Berkeley, she told me she could no longer read Shakespeare because he was “too sexist.”

Yes, that Shakespeare. The one who gave us characters like Viola, Juliet, Cleopatra, and Cordelia. But, despite the inherent strength and wisdom of such characters, Shakespeare also portrayed many of his women as powerless over their own destinies. In the 16th century. Imagine!

I was reminded of my friend when I read that a Memphis movie theater was canceling its annual tradition of showing another politically incorrect classic, Gone with the Wind. The film’s romantic image of the Old South—embodied in part by the happy-to-be-a-slave Mammy (played by Hattie McDaniel in the first role ever to win an Oscar for an African-American actor)—has apparently met its match in our current climate.

I get it. If, god forbid, Gone with the Wind is your only source of information about the Civil War, then you are woefully and dangerously underinformed. And given the sad state of American education, it’s quite possible that a 1939 movie is indeed the dominant image a lot of people have about one of the most pivotal and horrifying periods in American history. For such people, America’s original sin is likely to seem more like an episode of Leave It to Beaver.

We are living in dangerous, ugly times, where racists feel empowered to show us their faces, now that their president has told them they can safely leave the pointy white masks in the back of the closet. And if you can’t trust that your audience has appropriate context, it’s probably best not to show the film at all. After all, this theater is in the heart of the South. There are probably Confederate flags emblazoned on cars in the parking lot.

That said, there’s another message here, a message I’m much less comfortable with. Censorship always gives me the creeps, especially when censors let the alleged politics of art eclipse the art itself.

It’s one thing to remove statues of Confederate leaders: Robert E. Lee and Jefferson Davis were guilty of treason; they have no business standing on a pedestal anywhere. The Germans have no problem remembering their history despite the fact that no likeness of Adolf Hitler is to be found in any public space.

Gone with the Wind, however, is a work of art—a film classic that, adjusted for inflation, remains the highest-grossing movie of all time. For its era, it is a marvel of filmmaking—in terms of design, cinematography, direction, and, perhaps most of all, the iconic performance of Vivien Leigh as Scarlett O’Hara.

Let’s also bear in mind that there’s more than a little satire in the character of Scarlett, whose “fiddle-dee-dee” approach to life is judged quite harshly by the filmmakers, as it was by her creator, Margaret Mitchell. The main arc of the plot is Scarlett’s evolution from shallow narcissist to empathetic human, as her ignorance is shattered along with the immoral institutions that created and buttressed it.

In his announcement about the cancellation, the Orpheum Theater Group’s president stated, “As an organization whose stated mission is to ‘entertain, educate and enlighten the communities it serves,’ the Orpheum cannot show a film that is insensitive to a large segment of its local population.”

Hmm. In what way are education and enlightenment served by censorship? This is the same thinking that leads college campuses to issue trigger warnings and forbid discussion of any ideas that don’t adhere to politically correct notions. If anything, censorship adds insult to injury, a paternalistic approach that suggests that the censor knows better than the people he is aiming to protect. The plantation parallels are a bit ironic.

There’s no question that Gone with the Wind got it wrong, very wrong. The ugliness of slavery wasn’t at the top of Margaret Mitchell’s list of plot points. Slavery was little more than a backdrop for Scarlett’s tale. There’s no question that the film is insensitive to the suffering that lay at the core of the alleged glories of the Old South. The title, of course, refers nostalgically to the antebellum South, and the story bathes it in a naïve romanticism that may have been common in its day but grates in our own. Gone with the Wind was made 70 years after the war it depicts. Now, 80 years after that, we’re still suffering the ramifications of that wretched legacy, through a long history that makes Scarlett’s ignorance look pathetic by comparison. In the age of Ferguson, in an era when the KKK marches hoodless through the streets, Scarlett O’Hara should be seen exactly for what she is: an artifact of a bygone era.

Perhaps the appropriate way to handle her story is to put it on a double bill with 12 Years a Slave, to give the lie to the old Hollywood notion of the joys of forced labor. The latter, another Oscar winner, tells the story of slavery from the perspective of its victims, and it does so without sugarcoating anything. It’s a far more historically accurate and culturally sensitive film. But they are both excellent works of art. Sadly, 12 Years a Slave could not have been made in 1939. That, too, is a historical reality.

Art is a product of its time. Even the classics can’t escape the era in which they were made. We could disparage Hitchcock’s Rope as homophobic, My Fair Lady as sexist. Or we could accept them all for what they are and use our own critical judgment to put them in perspective. Shutting down the conversation before it’s even begun serves no one.

Dude the Obscure

This is the height of paradox: writing a blog about my fear of attention. Blogs, after all, were created so that everyone who wants attention can get it.

But the truth is that I have a very ambivalent attitude toward being noticed. I seem to be an introvert trapped in an extrovert’s body. Somewhere inside, it feels arrogant to expect attention—even to want it. That doesn’t stop me, of course, from doing just that.

I’m sure my feelings aren’t unique. The phenomenon seems particularly common among artists, since creativity and introversion seem to go together like paint and canvas. Ironically, the main drive of an artist is self-expression. But what’s the point of expressing yourself without the benefit of an audience? If you sing an aria in the forest and no one’s there to hear you, does it make a sound?

The discomfort hits most powerfully when I’m about to publish a book. I was hoping that by the third time (the proverbial charm), I’d be used to it. I was hoping that, by now, I could concentrate on the excitement, unaccompanied by this all-too-familiar dread.

Alas, that was not to be. I am no less terrified now than I was when my first novel came out, more than 10 years ago. Will readers understand my intention? Will anyone be turned off? Will, god forbid, I receive a bad review?

Of course, the answer is Yes, and Yes, and Yes. Someone won’t get it. Someone will be pissed off. Someone will disparage it. So what? Does that mean you shouldn’t try, that you shouldn’t let your work be seen by the world? Legend has it that Emily Dickinson wrote her poems on grocery bags and stuck them in a drawer. They are now among the finest artifacts of American literature. Apparently, obscurity goes only so far.

Publishing a novel brings up a couple of different worries. First, the personal exposure: there’s something of the author in every book. And even if it’s not autobiographical, you fear that people will read it that way, that they’ll attribute every action of every character to you personally. Second is simply the fact that by putting your stuff out there, you’re inviting criticism of the work itself.

So in other words, there are two options: people can use the novel as an opportunity to judge your talent or your life. No wonder I’m anxious.

There’s little consolation in the fact that Channeling Morgan is the least autobiographical of my novels. I’m not a ghostwriter, like the protagonist (though we do share a certain discomfort with being the heroes of our own lives). My exposure to drag queens and movie stars is minimal (i.e., a few of the former and none of the latter). But, as in all my work, there are pieces of me everywhere. You have to empathize with your characters to some degree. The crucial question, then, is: will people guess right as to which detail is which?

As for the stuff that’s not me, the perennial concern is whether I captured it correctly. Did I describe the drag world well enough? Did I offend poets or actors with my satirical renditions of various archetypes? Is the story universal enough? The plot is intended to be a bit far-fetched, the characters a bit over the top, but that doesn’t excuse me from an obligation to a certain degree of verisimilitude.

Most of all, of course, I want people to like it. Recently, an article in the New York Times caught my eye with the title “Popular People Live Longer.” Loneliness, apparently, is second only to smoking in the list of behavioral conditions that cause premature death. Words to live by.

So I guess I’ll take the bet. I’ll put my work out there, lay my heart and soul on display, and hope that the occasional word of praise will add a day or so to my lifespan.

Hamlet Debates Watching Donald Trump’s Speech

To watch or not to watch, that is the question.

Whether ’tis safer for the sanity and the blood pressure to ignore

The bigotry and nonsense of this outrageous circus,

Or to take arms against a sea of crass stupidity

And by laughing, end it. To cry, to weep–

No more–and by guffawing to say we end

This absurd union of rednecks and greed-mad narcissists,

And the million certifiable lunatics the GOP is heir to.

‘Tis a consummation secularly to be wished. To cry, to sleep–

To sleep, perchance to dream, and find that this nightmare

Is nothing more than the fear and small-handedness

That tyrants are made on.

May the Myth Be with You

Star Wars is a summer movie if ever there was one. Or seven. Or nine. And yet, there’s something terribly appropriate about the fact that, this holiday season, The Force Awakens is all anyone can talk about. A December opening isn’t just about Oscar lust and box office. The winter solstice has been a time of mythic meaning from the pagans, through the Christians, to the Jedi.

When the Star Wars logo appeared on the screen at today’s matinee, and the plot summary in its familiar yellow font receded slowly into star-strewn space, I felt a catch in my throat. It’s the same reaction I have on Christmas Eve when I hear the story of the wise men, even though I’ve been a confirmed atheist for a long time in what seems like a galaxy far, far away.

In neither case is my reaction because the story is factual. In both cases, it’s because the story is true.

No spoilers, I promise. I’ll refer to only one aspect of the plot, and that’s revealed in those yellow words at the opening, when we’re told that Luke Skywalker has disappeared. Later, when Luke is mentioned, one of the characters dismissively says, “Luke Skywalker? I thought he was a myth.”

The line reminded me of the opening of another modern myth—Ayn Rand’s Atlas Shrugged—the sentence that serves as a catchphrase for hopelessness throughout the first part of the book: “Who is John Galt?” Well, after a few hundred pages, the world finds out who John Galt is, and hopeless is the last word to describe him.

When people say things like this—“I thought he was a myth”—the suggestion is that a myth is somehow less powerful than a flesh-and-blood person. In truth, men and women act upon their time in history. Myths live forever. The stories of the fictitious Odysseus still resonate for the world. Jesus was a man who became a myth, and as I write, people the world over are celebrating his birthday, or an approximation thereof.

Myths resonate so deeply in part because of their simplicity. There are a few common elements that each mythology seems to recycle. One such element is the absent hero, the hero who’s retreated from the world and is somehow coaxed back into it, toward his greatest triumph. Jesus wanders in the wilderness, John Galt creates a hidden community in the mountains. And Luke Skywalker … well, he does what Luke Skywalker does. (I promised: no spoilers.)

And then there’s the central concept that gives each myth its unifying structure, its raison d’être. I’d forgotten how cool the concept of the Force is, the idea that there’s an energy coursing through the universe that has both light and dark components. In Star Wars, it’s an almost tangible thing, external to people but still running through them. In religion, it’s pretty much the same—grace emanating from a god, granted to individuals. And in both cases, it’s a metaphor for free will. Light and dark, good and evil, lie before us all the time, and every moment of the day we choose one or the other. On some days, mostly good; on other days, not so much. In our own ways, we are all members of the Skywalker clan, tempted by warring forces within our own minds.

My favorite of all myths is the Nibelungenlied, as manifested in Wagner’s transcendent Ring cycle. There’s a force in that story, too, a force that can be used for good or for evil. And in the end, there is redemption. The most beautiful music I know is the leitmotif that pours through the orchestra at the end, when Brünnhilde sacrifices herself to save the world. The theme is known as “redemption through love.”

There are two more movies before we reach the end of the Star Wars saga, before we find out what action saves that particular galaxy. But we know that something will. There’s a huge surprise near the end of episode 7, but the final moment of the film is completely predictable. That’s what happens in myth. And we wouldn’t want it any other way.


On Rediscovering Madame Bovary

(Warning: Spoilers abound.)

In my freshman year, I comped for the undergraduate literary magazine. (Comping: that’s Harvardspeak for “trying out.”) As I recall, the process had two major phases: first, we had to read submissions to the magazine and review them—pointing out flaws, recommending rewrites. I imagine I did pretty well at that, because it got me to phase two—the interview.

That interview lives in my memory as one of the most intimidating moments of my life—more so than any job interview I’ve had since. I walked into a dimly lit room to face a tribunal of what in memory seems like a dozen people who tossed questions at me, designed to gauge my literary tastes and my ability to articulate them for the magazine. One editor, a blonde girl dressed all in white, asked what my favorite book was. The answer came quickly: Madame Bovary. But then came the hardest question of all.

“Why?” she asked. “Tell us what makes it your favorite.”

A simple enough question, but I was flummoxed. It was true that I had loved the book. It was also true, however, that I believed the mere fact that I had read it would impress them. It struck me as a rather mature choice for an 18-year-old. But I stumbled over the why. I have no memory now of what I said, but I do know that it was inarticulate, unimpressive, and not enough to get me into the club.

Of course, I hadn’t yet learned how to explicate a text, how to explain how a particular work of literature makes its mark on a reader. That was what I was about to learn in class, over years spent reading and discussing the work of writers from Geoffrey Chaucer to Virginia Woolf. But at that point, still adjusting to college and figuring out where the libraries were, I did not yet have the tools.

There was, of course, another reason for my inarticulateness. I may have had a visceral reaction to the themes of Madame Bovary and an appreciation for Flaubert’s artistry, but I had no life experience to prepare me for the content. A person enraptured by fantasies of love, prey to the danger of romantic notions in an all-too-practical world, was as yet a completely foreign concept to me.

Now, more than 30 years later, I have finally reread Flaubert’s masterpiece, in the wonderful recent translation by American writer Lydia Davis. And this time was completely different. To see Emma’s story through the eyes of experience is to discover something almost entirely new.

Madame Bovary is the story of a naïve woman whose understanding of love is shaped entirely by novels, and when I first read it, so was mine. At the time of first reading, I grasped Flaubert’s point—that romance and reality are very distinct, and that a failure to appreciate the difference can be disastrous. But my understanding of even that theme was purely intellectual. And, perhaps unsurprisingly, it didn’t protect me from later making mistakes very similar to Emma’s, albeit with far less tragic results.

In fact, despite what I should have learned from Flaubert, I spent most of my twenties and thirties in pursuit of romance. I cursed my loneliness, decried the imperfection of my lovers, and despaired of ever finding happiness. Along the way, I fell hard for a married man, another whose mental illness brought out every codependent nerve in my body, and a series of others who were unable to commit to a third date, let alone a white picket fence. All the while I seldom stopped to ask myself whether the problem lay less in the reality I was finding than in the expectations I brought to the process.

Flaubert famously said of his heroine, “Madame Bovary, c’est moi.” And now I know what he meant. I have been Emma Bovary, again and again. I have had my share of Charles Bovarys, men whose love for me couldn’t compete with the imaginary Mr. Right I kept stowed away in the back of my head. In my youth, I even shared Emma’s tendency to go into debt, thinking that pretty things (oh, the money spent on those “colourful” Alexander Julian shirts that were my fashion obsession in the 80s!) would compensate for what was missing inside.

In rereading the book all these years later, I found that my sympathy for the heroine had expanded into empathy: I could actually relate to her experience. Flaubert is judging her, of course, using her as an example of the dangers of not just romantic notions but commercialism and the forced conformity of provincial life. But he also demonstrates compassion for her. Emma’s mistakes are an indictment of society as much as of the woman herself. She is taught to depend upon men for everything, but she is never taught how to calibrate her dreams in relation to her circumstances. Most tragically, she comes to believe that love—that is, the romantic notion of love promulgated by fiction—can cure all ills, can in fact give her the identity she seems unable to grasp on her own. Because she is so desperate to find that soul-stirring love (essentially, nothing more than a mix of romance and lust), she never discovers the deeper love that actually can cure all ills: the kind of love that comes only with time, because only time can teach you patience, and only prolonged attachment can teach you true compassion.

Emma is disappointed when marriage doesn’t bring her immediate pleasure and constant excitement. She is disappointed when Rodolphe proves unable to translate his lust into commitment. She is disappointed that Léon is unable to give her the luxurious life she craves. Emma’s life is a series of disappointments because her dreams get in the way. Despite my compassion for her, I can also say that she is rather narcissistic: the men in her life amount to little more than vehicles for her own pleasure.

The first time I read the book, I’m sure I was rooting for Rodolphe, the swarthy one. Or Léon, the serious, sensitive one. I’m sure I wasn’t rooting for Charles, the steadfast, earnest one.

Oh, what a difference a few decades make.

I can’t help thinking now that Charles does indeed love Emma in a mature, almost noble way—ironically, just the way she needs to be loved, the way anyone needs to be loved, really. His great flaw is that he is blind to her impractical desires and therefore can’t prevent her from giving heedlessly in to them. It never occurs to him that the comfortable life he offers her might not be enough to satisfy her. He never really understands just what she wants from life—nor does he really ask the question. That is his own tragic flaw, the source of his downfall.

Flaubert brilliantly draws a convincing portrait of Emma, far more psychologically nuanced than one would expect for a novel published in 1857. To my mind, the psychological depth of this novel would not occur in English literature until more than 30 years later, with the late novels of Thomas Hardy. And no doubt, this psychological realism is the true reason Flaubert was charged with obscenity: Madame Bovary not only deals with sexual desire, but does so sympathetically and credibly. It was one thing to depict a woman’s sexuality if you condemned her in the process, but to show compassion for her was, at the time, no doubt unforgivable.

Although Emma does die, of course—and it is one of the most agonizing deaths in all of literature—her fate cannot be read as the author’s attempt to punish her behavior. On the contrary, at the end, Emma finally understands that her life itself was the real agony. She isn’t suffering now because she did something wrong; she suffered all along, because she lived her life in terms of desire rather than reality.

I don’t remember the language of the translation I read 30 years ago well enough to tell you whether Davis’s is better, but it is certainly very, very good. The novel flows in beautiful English, but with a cadence and syntax that reveal its French roots.

Emma Bovary never had the chance to learn that time and experience are the greatest of all teachers. So the rest of us should be all the more grateful that we do. I know I’ll never again be so tongue-tied when someone asks why I love this novel so much.

In my senior year of college, with more experience as an English major under my belt, I comped for the Crimson, the campus newspaper. This time I succeeded, and got my first byline writing film reviews—which, I recently discovered, are now available online. I’m not quite sure I’m ready to reread those.

Happy Birthday to the American People

When I told a friend I was reading The American People, he immediately shut down the conversation with a dismissive, “I hate Larry Kramer!”

That’s what you get with Larry Kramer. His anger turns people off. His conviction that he knows what’s right turns people off. Despite the fact that he usually is right, and his anger is justified. In fact, his anger and his arrogance are two of the main reasons that AIDS is no longer the plague it once was. But that’s another story. Or is it?

For Kramer, the line between fact and fiction has always been fairly thin. He has admitted that the reason he could write The Normal Heart so quickly was that he was really just writing about what had actually happened. So when it comes to The American People, his long-awaited tome (clocking in at just under 800 pages—so far), are we talking about fiction or history? The answer, quite clearly, is a little of both.

The volume begins with a series of epigraphs, including this rather telling one from Joseph Conrad: “Fiction is history, human history, or it is nothing.” For Kramer, too, art must have a social agenda. It must say something about our time, even be a call to action. I don’t know what he would say about the concept of art for art’s sake, but it has always seemed to me that even that dictum is a social statement. And it is well to remember that one of its greatest proponents, Oscar Wilde, always had a message up his subversive sleeve.

How fitting that I should finish this book in the week between Pride and the 4th of July—holidays that both honor, albeit in very different ways, revolution.

Kramer’s history begins in the pre-colonial Everglades and, at the end of Volume 1, brings us to the 1950s. Along the way we meet many of the major figures of American history, several identified by name, others disguised by pseudonyms. (The latter technique seems largely reserved for more modern figures, perhaps only because the living and the heirs of the recently deceased are more likely to be litigious.) Thus, we have George Washington as George Washington, but Ronald Reagan as Peter Ruester.

Aside from the historian narrators who quibble among themselves about their own versions of the truth, the only character that remains constant throughout the book is the Underlying Condition. This is one of Kramer’s most intriguing tropes. In a literal sense, the UC, as he calls it, is a virus that will eventually evolve into HIV. We see it mutating its way through America, changing its tactics and getting stronger with each generation of hosts. But in a metaphoric sense, it’s a great deal more than that: the Underlying Condition is the dark side of the American dream—the combination of hatred and greed that has followed us through history, the combination that has arguably made our history. As I suspect we will see in Volume 2, AIDS is the ultimate manifestation of that moral virus, the result of a culture that survives by targeting and destroying its own marginal elements.

The first 100 or so pages of the book are a bit hard to get through—lots of competing voices (scientists, historians, and various talking heads), lots of descriptions of violent and ugly episodes, and lots of shit—literal shit. But even in those early sections there are glimmers of brilliance. Indeed, the book is full of inspired moments and several extended passages that are quite insightful and even beautiful. I particularly like the depiction of ordinary people (as opposed to, say, Alexander Hamilton), such as the Jewish family that becomes the center of the story in the second half—allowing Kramer to demonstrate the effects of history on relatable human beings.

Much has been made about how Kramer deals with the major historical figures, most of whom he depicts as gay or at least participating in homosexual activity: Washington, Franklin, Hamilton, Lincoln, the list goes on. For many of them—e.g., Lincoln and J. Edgar Hoover, of course—there’s nothing surprising here. For others, one wonders how much Kramer’s imagination transformed his stated research. But even that isn’t the point. History doesn’t tell us much about people’s sex lives, other than whether they were married. So why, as Kramer has said elsewhere, presume they’re straight? Why not assume they were gay? By doing so, he provides another through-line, but also a commentary—that homosexuality is natural and therefore deserving of neither shame nor denial. He seems to be daring the reader to argue with him, but you have the sense that he has reams of research to back up everything (much of it actually cited in the text, while other titles and authors are clearly made up). History belongs to the historian, right? And in an age when memoir, with its avowed verisimilitude despite the fact that the human memory plays havoc with literally everything, is the new novel, why shouldn’t fiction be the new history?

And therein lies the genius of the book. Kramer plays with genre, calling into question our ordinary definitions of truth (and, of course, the pesky problem that fact and truth are seldom the same thing). At the same time, he creates a phantasmagoric landscape, complete with a talking virus. The novelist he most reminds me of is Salman Rushdie, and the book this one has the most in common with, I’d say, is The Satanic Verses. And we all know what kind of trouble that got Rushdie into.

In this book, as in every other phase of his life, Larry Kramer is fearless. He is 80 years old and has been living with HIV for decades. He founded both GMHC and ACT-UP, taking on everyone from Anthony Fauci to Ed Koch and Ronald Reagan. He is a force of nature, and a voice that not only demands to be heard, but needs to be heard.

The second volume promises to blow the lid off. If people are upset with what Kramer says about George Washington, just wait until we see what he does to Ronald Reagan.