When Loneliness Leads to Sex Robots: A Study in Teledildonics
Dianne Araral on A.I., Robo-Shaming, and Our Lonely World
In Spike Jonze’s cinematic masterpiece Her, a lonely writer meets his ex-wife to finalize their divorce when she discovers that he has been dating his artificially intelligent operating system. Against the pastel idyll of their quiet brunch, Rooney Mara’s once-demure character lashes out in acerbic disbelief: “You always wanted a wife without dealing with the challenges of anything real. Well, I’m glad you found someone. It’s perfect.”
Amid our mass-scale cultural paradigm shift, there is no shortage of ways to map intimacy onto tech: chatbots, robot dogs, Japanese girlfriend simulators. The incredibly sexy term for the study of the relationship between love, sex, and robotics is “teledildonics,” and within it sits the “android”: an anthropomorphic automaton offering human companionship. Though they sit squarely in the uncanny valley, prominent and controversial, androids are growing in consumer use: according to a 2017 YouGov survey, half of American adults surveyed expect that having sex with robots will be common practice within 50 years.
Androids have sparked enough cultural and moral anxiety that the anthropologist Dr. Kathleen Richardson launched a 2015 campaign calling for their blanket ban. She cites, rightly, their objectification and degradation of the human (usually female) body; their tendency to codify racism and sexism; and their inability to teach the navigation of boundaries and consent.
Two years later, a once-unlikely coalition of sex therapists and roboticists came together in the Foundation for Responsible Robotics’ 2017 report, “Our Sexual Future with Robots,” which cautioned against the dangers of over-dependence on robot companions, particularly among disaffected young men. Indeed, suggestions to mollify incels with sex dolls have been met with swift criticism by writers from Salon to the New Statesman, who rightly note that sex dolls merely exacerbate a heady mix of loneliness and male aggression rather than nourish healthy communication and respect.
That we regard androids with wariness comes as no surprise. Sold to private consumers, these automatons are a product rushing to the bottom line, and to the bottom of our brainstems, pandering unchecked to the basest of biases and desires. They lack the shortcomings that keep us humble: the limits that teach us patience, forgiveness, and all else essential to the human condition. This much is obvious.
Yet what’s woefully missing from the conversation is not why, but how, to deal with the uses and abuses of androids. Alongside tech thickly texturing daily life, a loneliness epidemic unprecedented in scale means we cannot simply write off android users as errant or rare. A 2015 meta-analysis by Dr. Julianne Holt-Lunstad, professor of psychology at Brigham Young University, found that loneliness is now a global epidemic, particularly in developed countries. The chronically lonely are more vulnerable to diabetes and high blood pressure, and loneliness is a better predictor of early death than obesity.
It’s also little wonder that solitary confinement is considered by leading human rights organizations to be “cruel and unusual” punishment: it isolates us not only from others, but from our very own sense of self, wreaking destructive and sometimes lasting havoc on our well-being. Isolation is an issue acute, urgent, and very, very real.
So while the caveats are well-documented, androids still hold massive potential for mitigating a critical public health problem. As machine learning, natural-language processing, and facial recognition all blossom in complexity, we are not far off from A.I. that deftly grasps basic human emotions. Automatons are uniquely positioned to be a stopgap solution while we grapple with the problem of individuation, now acute and piercing, under late-stage modernity. But what, then, does the requisite work of collective cultural reckoning look like in practice?
The first step is recognizing that talk of human-companion androids is still cast in the language of extremes, as either a panacea or a death knell. Worse still, particularly for lonely young men, android use is cast in the language of shame and guilt: a transgression to keep secret, shared only with fellow users behind the anonymity of an online forum.
Shame is the natural outgrowth of a society ill-equipped to grasp the collision course between isolation and tech advancement. It’s also a powerful silencing tool—and therefore deeply counterproductive.
In her therapeutic tour de force Nanette, Hannah Gadsby unpacks the bitter, lasting effect of soaking a person in shame for their perceived sexual deviance. Shame is a sharp blade that burrows deeper and deeper with time, keeping us from reaching out to others and tackling loneliness at its roots. It also allows the problem to fester, spawning resentful, incel-esque underground communities with little recourse save their androids, the very token of their transgression.
The second step, then, is locating the antithesis of shame in “harm reduction,” a key public-health philosophy. Much as with drug addiction, users grappling with loneliness lack the support systems that sustain a strong sense of self and holistic well-being, driving some to find a crutch: projecting affection onto a bot. Unlike drug addiction, however, the particular chimera of android-cum-loneliness overwhelmingly remains the subject of blunt derision and ostracism, of sheer scorn and suspicion.
Incorporating harm-reduction sensitivity into press coverage and bioethics conferences reframes the problem, shifting it away from blunt moralizing and toward a complex public health issue. Of course, having a “relationship” with an android is nowhere near as damaging as drug addiction or the other public health problems typically addressed under a harm-reduction framework. However, as androids grow in private consumer use, it is helpful to approach them in the same spirit: accepting that undesirable crutches are a living reality for many (with loneliness now a global epidemic); understanding android use or dependency as complicated, borne of multiple factors (lack of self-esteem, poor work-life balance, etc.); and establishing rich community support networks of, and for, users.
Most importantly, harm reduction calls for the non-judgmental, non-coercive provision of services and resources that encourage users to affirm themselves as the primary agents of their own growth and healing.
Androids are at once a stand-in and a stepping stone, merely one element of a holistic personal framework comprising coping mechanisms and support. These automatons are a sorely needed intervention offering novel intimacies. They are also poor surrogates for what Rainer Maria Rilke calls “the highest task of a bond between two people”: to guard the independence of the other. Guarding the independence of one another: that work is up to us.