What Do Superheroes and Zombies Have to Do With the End of the World?

    Peter Biskind on Pop Culture's Obsession with How It All Ends

    The end of days has been just around the corner since the beginning of recorded history. Still, it seems that today the professional doomsayers are working overtime. The dire warnings that come at us at every moment from every quarter have become a staple of global culture. Not a minute goes by without a new alert. Billy Graham, who popularized evangelicalism in the postwar era, prophesied the apocalypse way back in 1950, saying, “We may have another year, maybe two years.” But then, he added, “It’s all going to be over.” More recently, according to astrologer Jeane Dixon, the end will come between 2020 and 2037; World Bible Society president F. Kenton Beshore expects the Second Coming in 2020 at the earliest, 2028 at the latest; and in his book The Cassandra Prophecy, Ian Gurney favors 2023, while 2033 is supposed to mark the 2,000th anniversary of the Crucifixion, when something or other of an unpleasant nature is supposed to occur. The Raelians, who worship UFOs, beg to differ, anticipating extinction in 2035 and pinning their hopes on an alien invasion to make it a reality.

    There’s no question that apocalypse fever has hit epidemic proportions. The “imagination of disaster” (to borrow Susan Sontag’s term) has historically accompanied periods of uncertainty and transformation, from the fall of the Roman Empire to the present, but now it is fueled by the unprecedented acceleration of change. Moreover, in 2001, 9/11’s twin towering infernos gave Americans a taste of what the end of the world might be like. With exemplary British understatement, Paul Greengrass, director of United 93 (2006), explained, “After the dramatic fears of 9/11 . . . I think all these stories are speaking to a sense of a future that is less assured.” According to Robert Kirkman, auteur of the comic that serves as the basis for The Walking Dead (2010- ), “Apocalyptic storytelling is appealing when people have apocalyptic thoughts. With the global economic problems and everything else, a lot of people feel we’re heading into dark times.”

    Putting aside the migrant crisis, growing poverty, and famine with all its ancillary consequences, such as drug-resistant plagues, just when we thought the curtain had dropped on the Cold War, the bomb, which was the odds-on favorite to end life as we know it during the 1950s, has made a comeback in the guise of dirty bombs fashioned from enriched uranium purloined from the great powers, not to mention the spread of nuclear weapons technology to Israel, India, Pakistan, Iran, and now North Korea. The Bulletin of the Atomic Scientists’ Doomsday Clock is ticking louder than ever. In fact, the hands were moved ahead in early 2015 to three minutes to midnight, coupled with the warning, “The probability of global catastrophe is very high.” In 2017, after the presidential election, the hands were moved ahead another 30 seconds, to two and a half minutes to midnight, and then again in early 2018 to two minutes before midnight, the closest the clock has been to the apocalyptic hour since the height of the Cold War in 1953.

    Climate change provides us with something even more intractable than terrorism or the likelihood of nuclear war to worry about, with climatologists, not kooks, predicting a “new Dark Age.” After 2017’s and 2018’s serial hurricanes, furious forest fires followed by mudslides in Southern California, a total eclipse of the sun, and the bellicose exchanges between Trump and Kim Jong-un, even the late, lamented Christopher Hitchens might be excused for consulting the Gospel of Luke, wherein it is written: “And there will be signs in the sun and moon and stars, and on earth distress of nations in perplexity . . .”

    Almost by definition, science-fiction, fantasy, and horror narratives anticipate the possible, no matter how unlikely. The blizzard of apocalyptic shows gives us a glimpse of what the end might be like—thought experiments that provoke us to think about the unthinkable, dress rehearsals for a show we hope will never open.

    Although the danger of a world-ending event is real enough, more often than not, the apocalypse is in the eye of the beholder. In the 2016 presidential election, both candidates appropriated the language of the end times. To Hillary Clinton, the apocalypse was manifest in the person of Donald Trump. She told her supporters, “I’m the last thing standing between you and the apocalypse.” To Trump, the apocalypse had already happened, namely, the two-term Obama presidency, enabled by then secretary of state Clinton. He called his Democratic opponent “the devil,” adding that all Hillary Clinton had brought to the world was “death, destruction, terrorism and weakness.” And it wasn’t so long ago that former Speaker of the House (and “moderate”) John Boehner referred to Obamacare as “Armageddon.” In other words, each of the ideological tendencies mentioned earlier bends the apocalypse to its own purposes; it is inflected center, left, and right.

    The ideology of the bipartisan coalition that comprised the postwar mainstream was called “pluralism.” In an influential book called The Vital Center (1949), historian Arthur M. Schlesinger Jr. theorized a “third force” composed of “democratic socialists” and “liberal capitalists” intended to navigate a middle way between democracy’s two enemies, communism and fascism, thus avoiding the bloodbath that engulfed Europe and ensuring America’s leadership of the “free world.” Schlesinger was prescribing a foreign policy, not endorsing an approach to postwar domestic governance, and he went out of his way to ridicule centrism, but he helped organize Americans for Democratic Action, which lobbied for just such a middle way at home, and his account of the third force perfectly described the coalition of Cold War Democrats and East Coast Republicans that ran postwar America.

    Liberal intellectuals like Schlesinger, Daniel Bell, Seymour Martin Lipset, and others agreed that American society was a democratic, “open society” made up of people of many colors, ethnicities, and religions. It was a melting pot in which cultural differences disappeared in a soup of assimilation.

    None of these groups could get the upper hand, because power was dispersed among them. Robert A. Dahl, the so-called dean of American political scientists, described pluralism as a “polyarchy,” meaning that power is distributed among many competing centers of authority, and therefore no one group is strong enough to dominate the others. Contending factions are forced to compromise with one another. In other words, if everyone is powerful, no one is powerful, and power, in effect, doesn’t exist at all in America, only in its totalitarian enemies.


    In the world according to pluralism, consensus is based on abstract values shared by all. These were the principles that emerged from the Enlightenment, when the philosophes pitted them against primitive, parochial loyalties to clan or place. “By including fraternity, or the ‘brotherhood of man,’ among their ideals along with liberty and equality,” writes philosopher Peter Singer, “the leaders of the French Revolution neatly conveyed the Enlightenment idea of extending to all mankind the concern that we ordinarily feel only for our kin.”

    Later, Darwinists provided an evolutionary basis for the same morality. Natural selection strongly suggested that humans prevailed over other life-forms because they evolved beyond the dog-eat-dog competitiveness that characterized the state of nature. Cooperation, or sociability, not self-interest, is the key to the survival of the species. Or rather, sociability is in the self-interest of the human species. The relatively new discipline of evolutionary ethics was christened by Edward O. Wilson in his landmark 1975 study Sociobiology, the goal of which was to remove morality from the purview of theologians and philosophers and claim it for biology.

    Contra Republicans’ confidence that Adam Smith’s invisible hand would ensure prosperity for all, liberal Democrats feared that it would pick the pockets of the poor and deliver wealth into the pockets of the rich. Capitalism was a given, but it had to be managed. Therefore, in pluralist poker, the public interest, which is identified with abstract moral principles, supersedes private interest, which is merely selfish. Those who put their own private interests ahead of the public interest are punished for it. They are bound by the imperatives dictated by social groups of escalating generality. The rights of the individual are trumped by the claims of kinship, that is, the blood ties that bind the nuclear family. These in turn give way to the extended family, then the tribe or ethnic clan, then the region, and then the nation. Beyond the nation, as the ripples spread outward, we have gender, the species, and even the universe.

    Citizens of the vital center were expected to play by the rules of pluralism, which favored pragmatism, compromise, tolerance, democratic decision making, and the rule of law. The East Coast Republicans and Cold War Democrats agreed that the bigger, more capacious and inclusive the tent, the more stable the consensus and the fewer the number of crazies lobbing grenades from beyond the perimeter. Tolerance, therefore, was not only a virtue in itself, it was a strategy that ensured stability, and more, survival.

    Pluralists worshipped at the altar of progress, and science was the tip of its spear. After all, science had just given us the A-bomb that ended World War II, followed by the H-bomb that seemed to guarantee America’s security for the foreseeable future, then the Salk vaccine that erased the terrifying scourge of polio, and finally the double helix that unlocked the secrets of the genetic code—all in quick succession. Meanwhile, a revolutionary pesticide, DDT, kept the little buggers at bay, enabling us and countries like us to become breadbaskets to the world.

    Science’s dominion over nature reduced the natural world to fodder for our culture, our civilization. Nature tamed meant mountains leveled for their coal, rivers harnessed for electric power, forests cut for their lumber. Ever since the Industrial Revolution, nature has been the enemy of culture; it gets in the way of new homes, malls, golf courses, and interstate highways. Nature is no more than development waiting to happen. In the mainstream, therefore, disasters—earthquakes, floods, hurricanes, and other catastrophes—are often rendered as nature’s revenge against culture, nature run amok.

    Science and technology were the engines of progress. We were assured that the future was going to be better than the present, just as the present was better than the past. The postwar faith in progress was symbolized by the 1964 World’s Fair, a futuristic extravaganza that beguiled visitors with sugarplums of endless improvement, like the space program, which electrified the imagination while also promising, on a more mundane level, victory over the Soviets in the space race.

    George R.R. Martin, who wrote A Song of Ice and Fire, the series of books on which Game of Thrones is based, remembers, “When I was a kid in the 50’s, and even into the 60’s, everyone thought life was getting better and better. You’d visit the Carousel of Progress at the World’s Fair, and you could see all the amazing things the future had in store for us: robots and flying cars, and so on. Life would be great.”

    The Fair pops up with surprising regularity in today’s shows, a totem of sorts for the technological utopia that never happened. Iron Man 2 (2010) contains a scene in which Tony Stark’s father, launching a Stark Industries expo modeled on the 1964 Fair, extols progress: “Technology holds infinite possibilities for mankind and will one day rid society of all its ills.” In Brad Bird’s Tomorrowland (2015), we actually visit the Fair, and are dazzled by its wonders.


    In the wake of the Fair, Albert Einstein became the posthumous poster boy for an era in which it seemed like science had all the answers. His likeness—twinkly eyes staring out from under wisps of white hair flying wildly in all directions—was blown up to poster size and took its place alongside Humphrey Bogart, Marilyn Monroe, and Marlon Brando on dorm room walls across America. Beloved mad scientists like Baron Victor von Frankenstein, who got us all in trouble by messing with God’s works, discovered there were no longer jobs for them in mainstream shows and were relegated to the extremes, where they were still welcome.

    Schlesinger was a great fan of Reinhold Niebuhr, the influential postwar theologian whose pessimistic Christianity (original sin, etc.) led him to step on the brake of the express train of progress. From Niebuhr’s perspective, millennialism was a dangerous illusion, the province of totalitarians like Hitler and Stalin. Rather, tolerance of conflict, not the utopian promise of conflict resolved, was the key to a functioning democracy. The momentum of progress was so great, however, that most Americans ignored Schlesinger’s misgivings. Cold War Democrats, standing on the graves of Soviet and Nazi messianism, assured us that utopianism was alive and well in the USA, and that our country was on its way to achieving the classless society that Marx had promised but the Soviet Union had so dramatically failed to deliver.

    Capitalism with a friendly face, softened by those New Deal safety nets that the right ritually denounced as “socialist,” had enabled us to reconcile the contradictions and transcend the divisions that had torn Europe to pieces. What we already had was more than we could hope for anywhere else. Americans could eat their cake and have it too. They lived in a both/and paradise that was realized in the here-and-now. Exuding the confidence of the home team, they were convinced that the grass could never be greener. Things were going their way. Contrary to Marx’s dire predictions, capitalism was delivering the goods, so much so that left-liberal economist John Kenneth Galbraith could entitle a book The Affluent Society. Consumers were swamped by a torrent of cars, washing machines, air conditioners, and TV sets. Workers enjoyed a living wage that enabled them to step up to a brand-new Ford and a house in Levittown—that is, as movies like 2017’s Suburbicon and Mudbound remind us, if they were white.

    So confident were the Cold War intellectuals that in 1960, sociologist Daniel Bell proclaimed “the end of ideology” in a book of the same title. Thirty years later, it looked like events had proved him right. Buoyed by the triumphalism that attended the fall of the Berlin Wall in 1989 and the end of the Soviet Union in 1991, Bell’s intellectual heir, political scientist Francis Fukuyama, peering through rose-colored glasses, visualized not only the end of ideology but the “end of history.” He wrote, “What we are witnessing is not just the . . . passing of a particular period of postwar history, but the end of history as such: that is, the end point of mankind’s ideological evolution and the universalization of western liberal democracy as the final form of human government.” Famous last words, as they say.

    Prisoners of progress, of the American belief in improvement without end, in short, of postwar utopianism, mainstream shows have become fewer and farther between, but they have by no means disappeared. They are full of characters bursting with optimism, satisfied with the present, and filled with faith in the future. If they are fearful, it is the past that scares them, King Kong, not R2-D2.

    Mainstream narratives still cling to the belief that the system works. They use national emergencies, a slightly milder version of the apocalypse feared by the extremes, to showcase the federal government flexing its muscles and saving the day. After all, big government was widely credited with rescuing the American economy after unbridled, unregulated capitalism had plunged the country into the Great Depression of the 1930s and then mobilizing the nation to defeat the Axis powers in World War II.

    Even in a center-right film like Independence Day, soldiers and scientists work together, while the president informs the world, “We can’t be consumed by our petty differences anymore. We will be united in our common interests.” Indeed, the president of the United States forges a global coalition that includes, among others, the British, Japanese, Arabs, and Africans.

    There is no apocalyptic event, as such, in The Martian (2015), but it too is a showcase for mainstream values. Matt Damon, a member of a Mars mission, is stranded on the red planet when his fellow astronauts, hastily departing to avoid a vicious storm, inadvertently leave him behind. He manages to survive, living on the potatoes he somehow cultivates in the meager Martian soil, until he’s rescued. In other words, the American pioneer spirit is very much alive and well, able to domesticate nature, no matter how inhospitable.

    Here, the National Aeronautics and Space Administration (NASA) is a thriving government agency that launches regular voyages to Mars. The picture cuts back and forth among its far-flung facilities—Cape Canaveral in Florida, the Johnson Space Center in Houston, and the Jet Propulsion Laboratory in Pasadena. NASA has a vast reach—but maybe not quite so vast as it used to be. In the old days of American hegemony, it would have scrambled to send another rocket to Mars to retrieve Damon, no big deal. But in the current atmosphere of cutbacks, NASA needs help, and which country should it turn to but China, which has coincidentally helped save the film business with its robust grosses. China rides to the rescue, sending a rocket “booster” to save Damon. The film extends the reach of pluralism to include the world, an expression of the globalism now derided by the extreme right. And like a good centrist show, The Martian reconciles opposites: Damon’s individualism with big government’s emphasis on cooperation. And note that the Martian of the title doesn’t refer to a bug-eyed monster, but rather to an American—not Them, but Us.

    In 2016’s Hidden Figures, a drama that takes place at the height of the Cold War, the apocalypse is the prospect of a Soviet weapon in space, not so far-fetched in that the Soviet Union was the first nation to launch a satellite, and then to put a man into orbit. The national emergency is the Soviets’ lead in the space race, along with the so-called (and illusory) missile gap, dramatically rendered in the film by the hysteria of NASA bigwigs, augmented by documentary footage of President John F. Kennedy soberly rallying Americans to catch up to the Soviets. NASA’s efforts are in trouble, however, until its white male scientists learn that African Americans, who happen to be female, are not only people too, but scientists and mathematicians with skills equal to, if not superior to, their own. As in The Martian, the mission succeeds by extending tolerance to those formerly considered Other, inhabitants of the badlands beyond the boundaries of the vital center. Racial and gender inequality is overcome, NASA is integrated, and an American astronaut successfully orbits the Earth.

    If ever a country, or in this case a continent, could use a good dose of pluralism, it is Westeros and its Seven Kingdoms in Game of Thrones, which is rent by family feuds and endless dynastic wars. Westeros is dominated by more than a half-dozen “houses.” Most of these houses have at least one vendetta going, and some have two or three: the Lannisters against the Starks, the Targaryens against the Lannisters, the Tyrells against the Starks, and so on.

    It’s not until Season 7 that this system is tested, when a common enemy emerges—the White Walkers—representing an existential threat to all the houses. We know what pluralists think of blood ties and the parochial self-interest that motivates each of the clans, so it comes as no surprise when Jaime Lannister tries to convince his sister, ice queen Cersei, to rise above primitive tribalism by explaining what’s at stake in the coming conflict with the wights. “This isn’t about noble houses,” he says. “This is about the living and the dead.”

    Jon Snow, King in the North, and ostensibly the bastard son of Ned Stark, the late patriarch of House Stark, gets it. He is the only ruler who has seen the wights in the flesh, so to speak, and he knows that no house can go up against them alone. As Sansa Stark says to her sister Arya, “When the snows fall, and the white winds blow, the lone wolf dies, but the pack survives.”

    Snow plunges into feverish coalition-building, defying the tribalism that is destroying the Seven Kingdoms. We’ve seen alliances made and unmade before, but this time he’s trying to fashion the ultimate alliance against the ultimate enemy. Approaching the imperious Daenerys Targaryen, Snow declares, “I must put my trust in you. A stranger. Because I know it is the best chance for my people. For all our people.” She gets it too, but when Snow then tries to make common cause with Cersei, he discovers that Cersei doesn’t. She refuses to forgive and forget her feud with the Starks. Mocking him, she says, “So we should settle our differences and live together in harmony for the rest of our days?” Echoing the president in Independence Day, Snow explains, “This isn’t about ‘in harmony.’ It’s just about living.” In other words, for all its forays into magic and fantasy, for all its dragons, witches, zombies, spells, priests, and priestesses, Game of Thrones, by the end of Season 7, turns out to be about America, circa 2017. It’s a mainstream show adhering to centrist values.

    Despite its trust in science, the postwar center made room for faith, so long as it cleaved to the tenets of pluralism. There was an uptick of religion during the 1950s, characterized by an interdenominational spirit that mirrored the political consensus. According to student of religion Ross Douthat, a “convergence [was] taking place towards a kind of Christian center.” Despite the claims of Catholics, Jews, Protestants, and what-have-you to exclusive access to God’s ear, they coexisted peacefully. It didn’t hurt that the spectacle of multifaith civil rights workers in the South praying, singing, and swaying while being beaten senseless by white deputies wielding clubs and cattle prods was beamed into American living rooms every night on the news. It sent the same message that Jon Snow was trying to deliver to Cersei: Sectarianism is passé, and therefore self-defeating.


    Generally speaking, excepting the run of biblical spectacles in the 1950s, religion was conspicuous by its absence from postwar movies. Even as late as 2014, the crop of high-profile biblical epics—both Noah and Exodus: Gods and Kings—flopped. The Hollywood community is generally a godless bunch, and, indicative of the lowly place religion has traditionally occupied in show business, as of that year, over the previous two decades only seven Oscar winners had thanked God in their acceptance speeches, while 30 credited their good fortune to Harvey Weinstein.

    Today, that has changed: shows of the ecumenical center, with their interfaith themes, reflect conservative mainstream values of family and religion, like the ones that pervaded ABC’s hit show Lost, which first aired in September 2004. The show began with a bang, that is, a plane crash. Oceanic Airlines Flight 815, bound for Los Angeles from Sydney, breaks up high in the air over the Pacific Ocean. The passengers, along with their ephemera—baggage, coats, books, magazines, soda cans, thumb-size vodka bottles—are sucked out of the plane like so many matchsticks and unceremoniously dumped onto sand beaches blanched salt white by the blazing sun. The laws of gravity would have predicted that they all should have died, but Lost doesn’t have much use for the laws of gravity. Miraculously, tumbling 35,000 feet to the ground doesn’t so much as muss their hair.

    Lost ended, six seasons and 121 episodes later, in 2010. The intervening shows are devoted to a post-apocalypse purgatory that consists of an array of peekaboo glimpses of now-you-see-them-now-you-don’t monsters, sudden violence, torture, and dizzying disruptions of time and space until the survivors wind up in church to discover that they have been dead all along—perhaps—and only now do these ghostly infrequent travelers, dressed in their Sunday best, qualify for a nonstop flight to heaven, first-class, with no blackout dates.

    Looking around this church in the show’s finale, we can’t help but notice that although Christian iconography predominates, it is an ecumenical smorgasbord displaying the sacred totems of other religions as well—Judaism, Islam, Buddhism, and so on. After all, this is the church of network television.

    The feel-good finale is so banal that it became a touchstone for how not to end a series, provoking George R.R. Martin to an inspired bit of vituperation: “Even as early as the second season and certainly the third season, I started saying, how the hell are they going to pull all of this together?” He added, “And then when I reached the end . . . they hadn’t pulled it altogether [sic], in fact, they left a big turd on my doorstep.”

    Inclusion and diversity were central to pluralism, but firm boundaries were still important because they marked the line between those who qualified for membership in the center and those who did not and were thus excluded. Although such ideological cleansing seemed to fly in the face of pluralism, centrists found a friend in Karl Popper, who furnished them with a work-around in an influential book first published in 1945 called The Open Society and Its Enemies, in which he wrote, “If we extend unlimited tolerance even to those who are intolerant . . . then the tolerant will be destroyed, and tolerance with them.”

    To the center, Popper’s “intolerant” were the “Geronimos” who turned their backs on the pluralist welcome wagon, thereby condemning themselves to the so-called lunatic fringe, a group for whom the world looked very different from the way it appeared to the center.

    __________________________________

    From The Sky Is Falling: How Vampires, Zombies, Androids, and Superheroes Made America Great for Extremism. Courtesy of The New Press. Copyright © 2018 by Peter Biskind.

    Peter Biskind
    Peter Biskind is a contributing editor to Vanity Fair, a writer for Esquire, and the author of the classic bestsellers Easy Riders, Raging Bulls and Down and Dirty Pictures. He lives in upstate New York. His latest book, The Sky Is Falling, is available from The New Press.




