Could Dating Apps Help Mitigate Racial Bias in Dating—Instead of Exacerbating It?
Orly Lobel on the Possibilities of the Algorithms That Help Us Find Love
Technology has a powerful ability to shed new light on old problems. Tech can expose the silent assumptions embedded in our systems, among them our persistent reluctance to address prejudice in our most intimate choices. We’ve passed policies against discrimination in employment, consumer markets, housing markets, schools, and banks, but our choices of whom to love (as opposed to, for example, whom to hire) are, it seems, considered too private to regulate.
The ways that race and ethnicity are presented and weighed on dating platforms offer a particularly illuminating dichotomy: between allowing autonomy, choice, and identity to play out in our digital sexual encounters, and propelling a more inclusive and equal trajectory.
For as long as we can remember, people have dated in racially discriminatory patterns. Evidence suggests, however, that online dating is increasing rates of interracial marriage. A study by researchers at the University of Vienna and the Center for European Research examined the relationship between online dating and the rise in interracial marriages over the last fifty years, finding marked increases in the percentage of new marriages that were interracial in the years following the introduction of dating websites (circa 1995), the surge in popularity of online dating platforms (2006), and, specifically, the creation of Tinder (2012).
Though it’s possible that some of this increase reflects changes in population composition, the rate of interracial marriage has outpaced the growth of minorities as a percentage of the overall population. Interracial marriages among Black Americans jumped from 5 percent in 1980 to 18 percent in 2015, yet Black Americans’ share of the population held steady at roughly 12 percent throughout that period. Of course, social norms and our online behaviors are entangled, and we should never assume that correlation signifies causation, but these positive trajectories are worth investigating further. Technology can nudge change, but lasting change has to come from social norms. And we must also recognize that race remains salient, and significantly impacts matches, on many dating sites.
As with other types of platforms and choice architecture, there is no neutral design. The design of dating apps reflects normative choices, including about whether race plays a role in human choices of matches as well as in AI selection of them. A study released in 2018 by OkCupid confirms that there is abundant racial bias in how matches are made. According to OkCupid cofounder Christian Rudder, “When you’re looking at how two American strangers behave in a romantic context, race is the ultimate confounding factor.” The study found that Black women and Asian men are the least likely to receive messages or responses on dating apps, and that white men and white women are reluctant to date outside their race. Black men and women are ten times more likely to message white people than white people are to message Black people.
Some evidence shows that gay men are the most likely to exclude partners based on racial preference. Indeed, there is an ongoing debate in queer theory about the impact of sites like Grindr, Hornet, and Scruff on “gay male cruising” culture, and about whether algorithmic sorting reinforces class and race hierarchies in the gay community.
Critics worry that these apps in particular commodify sexual relations, treat humans as part of a “meat market,” objectify partners to be consumed and disposed of, and deepen classifications along identity lines. Others respond that digital spaces allowing one to be treated as an object have some advantages—as queer theorists have described it, they preserve a gap between oneself and one’s potential partner, “thwarting the desire to know, speak for, and act in the interest of others—a tendency that may appear altruistic but has annihilative ends.” Queer theorist Tom Roach explores how Grindr and other male-to-male dating/hookup apps can help reimagine a radical post-pandemic subjectivity—a queer sociability—in which participants are formally interchangeable avatar-objects (“virtual fungibility,” as he terms it).
While these debates provocatively push us to rethink what has traditionally been conceptualized as sordid, selfish dating behavior, the challenge of preventing racial exclusions remains salient and pervasive. Importantly, racial preference is found to be strongest during the initial choice to initiate a match on a dating app. Once people are given the opportunity to interact and are shown a broader choice set, their preferences shift toward more openness and inclusivity.
This presents a technology design opportunity for a dating equality machine. Nobel laureate Gary Becker identified the two principles of the marriage market as satisfying preferences and maintaining competition. Each person, Becker reasoned, competes to find the best mate, subject to market conditions. Marriage contributes to one’s health, well-being, financial security, and happiness. When marriage markets are segregated, inequality is replicated over generations. We tend to view dating as the last haven of completely personal choice. But digital design has the power to either mitigate or intensify problematic patterns of exclusion.
Selecting romantic partners has never been an entirely rational process. Dating apps provide an opportunity to look beyond the confined dating pools that dominated our social lives in the past, opening spaces that were previously unavailable, spaces where people of diverse backgrounds can meet and mate. A dating app can decide whether or not to allow search and sorting by race or ability. A friend of mine who teaches queer theory suggested a thought experiment: hide all indications of gender, sex, or sexuality on a dating platform altogether. The user would still be able to make choices based on what they see in people’s profiles, but the platform would not facilitate decisions along lines of sex, gender, or sexuality, making the entire dating pool visible to all.
*
If you think about it, blinding identity is what we’ve seen as a possible design solution in the employment context, too. After all, we can classify ourselves according to so many qualities other than our biology: our hobbies, our professional and personal experiences, our politics, our profession, and whatever else makes us unique. We also know that algorithms are likely to discover that some of those other qualities are still correlated with race or gender or religion, though far less so than with direct filtering.
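To see the point concretely, consider a toy, entirely synthetic illustration (the profiles, fields, and numbers below are invented for this sketch, not drawn from any real platform): even with the protected attribute hidden, a visible field that correlates with it can reconstruct it most of the time.

```python
# Entirely synthetic profiles. The platform has blinded "group" (the
# protected attribute), but a correlated visible field, "neighborhood,"
# remains searchable.
profiles = [
    {"neighborhood": "north", "group": "A"},
    {"neighborhood": "north", "group": "A"},
    {"neighborhood": "north", "group": "A"},
    {"neighborhood": "south", "group": "B"},
    {"neighborhood": "south", "group": "B"},
    {"neighborhood": "north", "group": "B"},
]

# Guess the hidden group from the visible field alone, by majority vote.
majority = {"north": "A", "south": "B"}
hits = sum(majority[p["neighborhood"]] == p["group"] for p in profiles)
print(f"{hits}/{len(profiles)} groups recovered")  # 5/6: the blind leaks
```

Direct filtering would recover the group six times out of six; the proxy recovers it five times out of six. That gap is the sense in which blinding reduces, without eliminating, the correlation.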
The Japanese gay dating app 9Monsters categorizes all users as one of nine types of “monsters” based on a variety of personality traits and passions, recategorizing and rejecting our traditional offline categories. This move to shape preferences by design while maintaining user autonomy is often a good approach to overcoming biases. Beyond dating apps, TrenchcoatX, a porn site founded by pornography stars Stoya and Kayden Kross, has taken up a similar attempt to reject racial classifications by removing racial tags, making racial categories unsearchable.
Only a small number of the leading dating apps have anti-discrimination policies. The platforms that do have anti-discrimination policies make it harder for users to act on racial and other biases, all the while maintaining freedom of choice. Platforms can choose to prevent filtering of profiles based on race. They can further prohibit explicit statements of racial preferences such as “No Latinas please” or “I only date Caucasians,” as the sketch below illustrates. Yes, racial bias will continue to seep in, but at least explicit exclusions of entire groups would not be aided by the app.
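As a hedged sketch only (the pattern list, function, and field names below are hypothetical illustrations of mine, not any platform’s actual policy code), the two levers might look like this:

```python
import re

# Lever 1: race is simply absent from the searchable fields.
SEARCHABLE_FIELDS = {"age", "distance", "height", "education"}

# Lever 2: flag explicit group exclusions in profile text for review.
# Illustrative patterns only; a real moderation pipeline would need far
# more nuance and human review.
EXCLUSION_PATTERNS = [
    r"\bno\s+(latin[ao]s?|blacks?|asians?|whites?)\b",
    r"\bonly\s+date\s+\w+",
]

def flags_explicit_exclusion(text: str) -> bool:
    """Return True if the profile text states a blanket group exclusion."""
    lowered = text.lower()
    return any(re.search(pattern, lowered) for pattern in EXCLUSION_PATTERNS)

print(flags_explicit_exclusion("No Latinas please"))       # True -> held for review
print(flags_explicit_exclusion("I only date Caucasians"))  # True
print(flags_explicit_exclusion("I love hiking and dogs"))  # False
```

Note the asymmetry the paragraph above describes: the design blocks categorical exclusion as a platform feature while leaving each individual swipe untouched.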
In the summer of 2020, as companies were responding to Black Lives Matter protests, Grindr removed its “ethnicity filter” and launched a campaign against discriminatory behavior on the platform, with the motto “Kindness Is Our Preference.” Before summer 2020, Grindr’s ethnicity filter had allowed paying users to see only those results matching the ethnicities of their choosing. Scruff, another gay dating app, also announced that it would remove race-based filters from its platform.
Most dating apps, though, such as Hinge, OkCupid, and eHarmony, do continue to allow users to search by ethnicity, in addition to other classifications, from height to education and everything in between. eHarmony’s U.K. website has “lifestyle” category options like Asian, Bangladeshi, Black, Chinese, Christian, people over sixty, single parents, et cetera. Its American version has a Hispanic dating platform, and its Australian site has an “ethnic dating” option. Such racial categorizations have been justified as a way for minorities to find prospects within their communities. A spokesperson for Match.com defended the use of race filters as giving users “the ability to find others that have similar values, cultural upbringings and experiences that can enhance their dating experience.”
It’s indeed important to note that there may be inadvertent costs or harms when filter categories are removed. For example, a traditionally marginalized group whose members have difficulty finding each other and forming a community offline may benefit from the ability to screen for their shared identity. Having grown up Jewish and Israeli, I am all too familiar with the not-so-subtle messaging from parents to children about marrying within their faith.
Moreover, the expansion of dating opportunities outside of one’s community and ethnicity may itself be happening in patterned ways—for example, in gendered ways or along socioeconomic class lines—resulting in some people within the minority community having fewer options than before. These are tensions we should be discussing openly, and technology is pushing us to have these conversations. The beauty of technology is that it can mitigate the tensions between competing values we hold dear. As a behavioral researcher, I’ve studied extensively how the presentation of information and the decision-making environment shape our preferences in subtle ways.
With the scale of connectivity and innovations in user interfaces, we have an opportunity to challenge historical preferences based on race, religion, caste, social standing, and other criteria without taking away the freedom to choose one’s partner. We can think of incremental shifts: expanding the dating pool, diversifying initial matches, and building on insights from research into user behavior on apps, such as the finding that people’s preferences tend to be less rigid after initial connections.
Recall that, beyond direct individual choices, when an algorithm is programmed to optimally satisfy preferences in a multisided app, it might use racial indicators to create matches even when it is merely predicting statistical preferences from the past behavior of other users. We don’t know enough about how dating apps automate the matching process because platforms almost universally keep their data and algorithms confidential.
Moreover, because algorithms now learn autonomously from massive amounts of data, often the software engineers who programmed them don’t even understand why the neural nets they built have arrived at certain outcomes. But from what we do know, the algorithmic matching on many dating platforms likely takes race into account. For example, reporter Amanda Chicago Lewis found that when she specified on the dating app Coffee Meets Bagel that she was willing to date males from any race, she received exclusively Asian men’s profiles.
The algorithm may have detected a dearth of women willing to date Asian men and optimized accordingly, showing Asian men’s profiles to the users who had expressed willingness to see them. Other reporters discovered that when users did not specify ethnic preferences, apps still tended to show them partners of their own race.
MonsterMatch is a game that simulates a dating app, designed to expose the inherent biases that fuel matching algorithms. Users create a monster character and profile, start swiping right or left on other monsters’ accounts, then chat and date. The more playing time is logged, the more the game learns a user’s “monster preferences.” Say a user in the game, a werewolf, swipes “yes” on a zombie and then “no” on a vampire. From then on, when a new user also swipes “yes” on that zombie, the algorithm may assume the new user likewise dislikes vampires and thus withhold vampire profiles from them. MonsterMatch is a creative example of efforts to educate people on how dating apps really work, and how their swipes may affect not only their future matches but others’ too—and fuel racial bias.
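For readers curious about the mechanics, here is a minimal sketch, in Python, of the user-to-user collaborative filtering that MonsterMatch dramatizes. The function, data, and scoring rule are hypothetical simplifications of mine, not MonsterMatch’s actual code or any real dating app’s algorithm.

```python
def score_candidate(swipes, me, candidate):
    """Predict whether `me` will like `candidate` by polling users who
    have agreed with `me` on at least one past swipe."""
    similar = [
        other for other in swipes
        if other != me and any(
            swipes[other].get(profile) == liked
            for profile, liked in swipes[me].items()
        )
    ]
    votes = [swipes[u][candidate] for u in similar if candidate in swipes[u]]
    # Average the similar users' verdicts; 0.5 means "no signal yet."
    return sum(votes) / len(votes) if votes else 0.5

# Swipe history: True = swiped "yes", False = swiped "no".
swipes = {
    "werewolf": {"zombie": True, "vampire": False},
    "new_user": {"zombie": True},  # agrees with the werewolf on the zombie...
}

# ...so the new user inherits the werewolf's "no" on the vampire:
print(score_candidate(swipes, "new_user", "vampire"))  # 0.0 -> vampire withheld
```

This is the inheritance the game makes visible: if the users who share one of my “yes” swipes systematically swipe “no” on some group, the algorithm can withhold that group from me before I ever make a choice of my own.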
Designing better algorithms means asking whether preferences in love matching are a type of discrimination we need to tackle as a society. Ideals about beauty are shaped by the wallpaper of our environment. Ideas about talent are shaped by socially contingent formulations of merit. Preferences are malleable in every field and decision, and love is no different. Technology has an immense impact not only on what we see and whom we meet, but also on how we feel. If a dating app tells a user that someone is found to be compatible with her, the chances that the user will select this person increase.
Understanding that preferences are shaped and sustained by everything around us, past and present, helps move the debate forward. Limiting love markets means limiting some people’s access to all the benefits that flow from romantic partnership: status, income, health, education, social and professional networks, community impact, and more. Technology can help us find a better balance among the principles we value.
By default, users are impacted by an app’s power of suggestion. Digital design in dating, as in other spheres of life, can encourage diversity and inclusion while respecting users’ autonomy to choose. Love may be blind, but technology is not.
____________________________________
This article has been excerpted from The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future by Orly Lobel. Copyright © 2022. Available from PublicAffairs, an imprint of Perseus Books, LLC, a subsidiary of Hachette Book Group, Inc.