Can Our Capacity for Empathy Actually Save Us From Ourselves?
Andrew Keen on Our Dangerous Game with AI
We always want what we can’t have. A Keen On guest last month told me that she wants artificial intelligence that can deliver more human empathy. But only we humans have empathy. Understanding and sharing the feelings of others is our superpower. And in tomorrow’s age of ubiquitous smart machines, it’s the one thing that will distinguish us from our robot cousins.
The idea of empathy as a “superpower” was articulated on Keen On last week by Toby Walsh, one of the world’s foremost experts on artificial intelligence and the author of Machines Behaving Badly: The Morality of AI. If our superpower is, indeed, empathy, Walsh asks, then why the hell are we trying to teach computers to be empathetic?
The answer to the question of why we are trying to outsource empathy is, of course, money. The holy grail of artificial intelligence is mimicking humans by creating algorithms that can understand and share the feelings of others. There’s serious money, trillions and trillions of dollars, in mimicry. But like other technological holy grails, from social networks to cryptocurrency, this quest to invent non-human empathy probably isn’t going to end well.
Toby Walsh believes that by 2062 we will live in what he calls a “world that AI made.” You can read Walsh’s nonfiction vision of 2062. Or you can read Klara and the Sun, Kazuo Ishiguro’s fictional version of a world in which we’ve tried to make machines empathetic. The eeriest thing about Ishiguro’s 2021 techno-dystopia is its familiarity. That supposedly futuristic world which “AI made” isn’t far away. Walsh’s 2062, I fear, will happen well before 2062.
But it isn’t inevitable. Just as tech entrepreneurs are chasing the holy grail of empathetic algorithms, so too have smart humanist critics of Big Tech been revisiting the idea of empathy. One frequent Keen On guest, Sherry Turkle, for example, has been warning about the perils of pretend empathy for a while now. Another Silicon Valley humanist, Jaron Lanier, suggested to me that entrepreneurs should be making their customers wealthy by enabling human empathy rather than trying to mimic it.
And it’s not just tech critics like Lanier and Turkle who are revisiting the idea of empathy. The Pulitzer Prize-winning science writer Ed Yong has a new bestselling book, An Immense World, in which he introduces us to how animals perceive and imagine their worlds. Yong’s enlightening work should be read as a form of radical empathy with other creatures. Animals can help us humans develop empathy, Yong explained last week on Keen On.
Like Toby Walsh, Ed Yong argues that empathy is our superpower. Empathy toward other species and toward nature, Yong believes, is the only way out of our current ecological predicament, while empathy toward other humans is the most effective way to confront the other man-made injustices of our age. Jackie Higgins, another prescient writer about the sensory world of other species, came on Keen On last year to make equally resonant arguments.
It’s not coincidental that Jackie Higgins’s and Ed Yong’s work is appearing in an AI age in which entrepreneurs are trying to outsource our species-being to machines. If the holy grail of tech entrepreneurs is mimicking human empathy, then our great quest is reminding ourselves of who we are at a time when machines can effectively mimic us. We raised ourselves above nature and now risk destroying not just the planet and other species but ourselves as well. The humanist alternative is a return to nature to remind ourselves of our superpower.
Nor is it coincidental that we are now recognizing that the very worst 21st-century people (humans behaving appallingly badly, so to speak) are entirely lacking in empathy. Earlier this week, I talked to Ken Auletta, the author of Hollywood Ending, about what the Harvey Weinstein story tells us about the broader American culture.
The problem with Harvey Weinstein, Auletta reminded me, was that he always wanted what he couldn’t have. It was his narcotic. Maybe this was the cause, or maybe it was the effect, of Weinstein’s profound lack of empathy for all other human beings, especially women. Either way, Weinstein’s criminal narcissism and egoism capture the absolute antithesis of human empathy.
Like Weinstein, we want what we can’t have: technology that mimics human empathy. His catastrophic Hollywood Ending might, I fear, also be our ending unless we recognize and celebrate our superpower. The clock is ticking. It’s almost 2062.