It’s no secret that racial biases factor into swiping choices on dating apps ― even in 2018, people feel bold enough to write things like “no blacks” and “no Asians” on their profiles. But a new study suggests the apps themselves might reinforce those prejudices.
“People may have no idea that a matching algorithm is limiting their matches by something like race since apps are often very vague about how their algorithms work,” said Jessie Taft, a research coordinator at Cornell Tech and co-author of the study.
To conduct the study, the researchers downloaded the 25 top-grossing dating apps in the iOS App Store as of fall 2017, including Tinder, OkCupid, Hinge, Grindr and some lesser-known apps like Meetville and Coffee Meets Bagel.
Then they logged in and looked for functionality and design features that could affect users’ discriminatory behavior toward other users. This included things like the apps’ terms of service, their sorting, filtering and matching algorithms and how users are presented to each other. (Do they get pictures or bios? Can you sort matches according to different categories?)
They found that most apps employ algorithms that cater to users’ past personal preferences and even the matching history of people who are similar to them demographically.
So, for instance, if a user had matched with white users repeatedly in the past, the algorithm was more likely to suggest more white people as “good matches” moving forward.
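The feedback loop described above can be sketched in a few lines. The function below is a hypothetical illustration of how a recommender that simply repeats past preferences narrows future results; it is not the actual algorithm of any app in the study:

```python
from collections import Counter

def rank_candidates(past_likes, candidates):
    """Rank candidate profiles by how often their attribute appears
    in the user's past likes. Attributes common in the user's history
    float to the top, so the history reinforces itself over time."""
    counts = Counter(past_likes)
    return sorted(candidates, key=lambda attr: counts[attr], reverse=True)

# A user whose match history skews toward one group...
history = ["white", "white", "white", "asian"]
pool = ["black", "asian", "white", "latino"]

# ...sees that group promoted to the top of future suggestions,
# which makes similar matches more likely: a feedback loop.
print(rank_candidates(history, pool))  # ['white', 'asian', 'black', 'latino']
```

Even this toy version shows the problem the researchers flag: profiles outside the user's historical pattern sink in the ranking regardless of whether the user ever expressed an explicit preference.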
When apps encourage users to act on quick impressions and filter other people out, serendipity is lost, the researchers say.
“Users who may not have a preference for race or ethnicity in their partner may find their matching results artificially limited by an algorithm that’s calculated to repeat ‘good’ past matches without considering what ‘good’ future matches might be,” Taft told HuffPost.
Data released by the apps themselves supports the research. In 2014, OkCupid released a study showing that Asian men and African-American women got fewer matches than members of other races, while white men and Asian women were consistently seen as more desirable on dating sites.
“We don’t want to stop users from dating the people they want to date; we want to ensure that minority users aren’t excluded, abused, or stereotyped as a result of those choices,” the researchers wrote in the study.
While many of us have “types” we’re drawn to, it’s worth examining whether a lack of exposure, as well as stereotypes and cultural expectations, is influencing our preferences. (For instance, women may exclude Asian men in their searches because the group has long been portrayed as effeminate or asexual in film and on television.)
Given how widely used apps are ― one study suggested more than one third of U.S. marriages begin with online dating ― developers have a rare opportunity to encourage people to move beyond racial and sexual stereotypes rather than entrench them, Taft said.
“The problem with ‘giving users what they want,’ as the apps claim they do, is that more often than not the users who are getting what they want are the ones who are being discriminatory, not the ones who are being discriminated against,” the researcher said.
Even small tweaks could make the experience more beneficial to users across the board.
“The solutions that we propose in the paper ― adding community guidelines and educational materials, rethinking sorting and filtering categories and changing up algorithms ― can make outcomes better for marginalized users without interfering in anyone’s right to choose a partner,” Taft added.
Some apps are already making progress. Grindr, a gay dating app with a troubled history of allowing racist behavior, recently announced a “zero-tolerance” policy toward racially tinged, hateful language. It’s even considering removing options that allow users to filter potential dates by age and race.
“Any language that is intended to openly discriminate against characters and traits, like infamously, ‘No fats, no femmes, no Asians’ ... that isn’t going to be tolerated any more,” Landen Zumwalt, Grindr’s head of communications, told Reuters in September.
It’s a clear step in the right direction, Taft said.
“Educating all users about stigma and discrimination faced by minority users, and even requesting a non-discrimination commitment before using the app, can make everyone more aware of the impact of their swipes,” Taft said.
It can also help singles reevaluate their preferences, the researcher said.
“You may think you’re only into one specific type of person, but understanding that preferences are fluid and shaped by culture can help us look beyond individual differences.”