When Your iPhone Reinforces Sexism

Tech is created by humans, and it’s vulnerable to our biases and stereotypes.
If you insult Siri, she might tell you off. But she'll keep doing stuff for you — because she's programmed to be subservient.
Tom McCarten

If you turn on your iPhone and say, “Siri, you’re a bitch,” the digital assistant might respond, “There’s no need for that.” If you call her a crude word for female genitalia, her reply is genteel: “Your language!” But if you then ask her to set an appointment for you or start a text message, she’ll happily comply, as if the derogatory statements never occurred.

Some of the most popular artificial intelligence apps — Apple’s Siri, Amazon’s Alexa and Microsoft’s Cortana, to name a few — have endured well-documented barrages of verbal abuse from users. In the United States, these assistants are programmed to sound feminine by default, and they’re generally obedient to a multitude of simple commands. And because they’re trapped in our electronic gadgets, they have a limited number of ways to respond when users engage in trash talk or speech that would be considered harassment if it were directed at a human woman.

Apple doesn’t talk much about whether it has observed troubling interactions between Siri and her users. But other prominent companies have been more open about how people treat their female-voiced AI apps.

Amazon acknowledged such interactions last year, quietly rolling out a disengage mode for Alexa so it wouldn’t submit to sexist taunts. The feature is supposed to shut down sexist conversations, though it doesn’t really explain to users why their language triggered that response.

In 2016, Microsoft said that a “good chunk” of people want to know about Cortana’s sex life ― and that the company has programmed the app to “get mad” if users are “particularly assholeish.” Cortana might tell off the user by saying, “Well, that’s not going to get us anywhere,” but sometimes she’ll just perform a semi-related web search instead of responding to something lewd.

The Android-only voice assistant Robin fields a significant number of interactions that are “clearly sexually explicit,” Ilya Eckstein, the founder and CEO of the now-defunct Robin Labs, which developed the app, told The Times of London in 2016. “People want to flirt, they want to dream about a subservient girlfriend, or even a sexual slave,” he told Quartz.

The company Audioburst acquired Robin in 2017, and although Eckstein is no longer in the picture, the app still offers a baffling range of responses when confronted with explicit questions. Sometimes it scolds you; other times it acts flirty or even tells a bawdy joke.

Tech companies haven’t made much progress in addressing the problematic power dynamics inherent in having what amounts to a female servant living in your phone. Instead, programmers will say that you can change the voices to sound male or that digital assistants aren’t really supposed to be thought of as gendered.

But the reality is that these AI voice programs sound like women, and their names sound feminine. Plus, consumers have been found to prefer female voices for the kinds of tasks that apps like Siri are designed to do: answer our questions about the weather, play music for us, send text messages while we’re driving and so on. It’s a preference that may be hardwired into the human brain, researchers have found.

“It’s much easier to find a female voice that everyone likes than a male voice that everyone likes,” Stanford University professor Clifford Nass, the author of The Man Who Lied to His Laptop: What Machines Teach Us About Human Relationships, told CNN in 2011. “It’s a well-established phenomenon that the human brain is developed to like female voices.”

However, just because a woman’s voice is preferred doesn’t mean the listener holds her in high regard. “Female voices are seen, on average, as less intelligent than male voices,” Nass told HuffPost in 2013, shortly before his death.

Some critics believe the notorious bro culture of Silicon Valley is to blame for its weak response to the sexist power imbalance that female voice assistants create. Consider the fact that the humans who make AI tech are mostly guys: In 2017, a survey found that men accounted for 85.5 percent of software developers in the U.S.

“Giving digital assistants human personalities has a long tradition, with things like Clippy the paperclip,” said Kate Devlin, a senior lecturer in the computing department of Goldsmiths College, part of the University of London. “They’re meant to help the user form a better relationship with the machine so they can be guided through the system. I think giving them humanlike traits is not necessarily a bad thing, but the gender aspect is particularly insidious.”

“It’s like Silicon Valley is just trying to re-create their moms,” she added, nodding not only to the soothing way in which female-voiced apps like Siri and Alexa speak but also to the tasks they perform.

Digital assistants help us with things like errands and phone calls ― stuff that’s considered women’s work, according to Miriam Sweeney, a feminist researcher and an assistant professor at the University of Alabama. “Service work, domestic labor, health care, office assistants — these are all industries which are heavily feminized and also often low paid and low status, with precarious work conditions,” she told the Australian Broadcasting Corp. last year. “We’re seeing the same with digital labor online.”

There are potentially big ramifications because of the popularity of AI voice apps. Half a billion devices have Siri on them. Amazon said it sold tens of millions of Alexa devices during the 2017 holiday season. And Microsoft said Cortana has nearly 150 million monthly active users. Investment in AI startups has grown sixfold since 2000, and a Forrester Research report last year predicted that about 9 percent of jobs will become automated by the end of 2018. The roles that artificial intelligence and automation play in our lives are large and growing.

Because AI is created by humans, it’s vulnerable to the biases and stereotypes people hold. Some worry that, as AI and automation become more important in our daily lives, these biases will become more ingrained in technology.

“As AIs are built, they learn from their environment,” said Tabitha Goldstaub, a co-founder of the AI education platform CognitionX and an advocate for diversifying the tech industry. “So if we don’t have women in that environment or in the data sets AIs read, we will definitely end up with machines that are misogynistic.”

“Because AI is created by humans, it’s vulnerable to the biases and stereotypes people hold.”

Apple didn’t respond to requests for comment about Siri’s interactions with users. It gives English speakers the option to make Siri’s voice sound male, and in some languages the app defaults to a male voice.

English-speaking users of Alexa can change her gender as well, if they know how to navigate the settings. An Amazon spokesperson said, “When we developed Alexa’s personality, we wanted her to have a lot of attributes we value at Amazon, like being smart, helpful and humble while having some fun too. These attributes aren’t specific to any one gender — rather, traits we value in all people.”

A Microsoft representative said engineers “thought long and hard about gender and how we want to portray Cortana” before settling on a female persona because “research found there is a certain warmth to a female voice that is associated with helpfulness.” The company doesn’t offer a male voice option, though it’s looking into the possibility.

Audioburst, whose AI tech has been integrated into Alexa, says it’s revamping Robin to make it less like an assistant and more like a podcast-playing and news-fetching service that users control with their voice. Soon Robin won’t respond to explicit or rude questions because it won’t chit-chat with users at all, said Assaf Gad, the company’s vice president of marketing. “Once the updates are complete, none of the type of interactions in question will be possible,” he said. Robin does not have a male voice option, he added.

Goldstaub isn’t buying tech companies’ arguments that they tried to create these apps as responsibly as possible. She said female-voiced assistants are a good example of male-dominated industries not thinking through the possible effects of their products.

“There isn’t enough conscious thinking happening during the making,” she said. She argued that limited diversity in the workplace can lead to a narrow way of thinking and stop products from being properly scrutinized.

“I like to think women would be designing differently,” Devlin said.

So what can tech companies do to discourage the harassment of voice apps? Amazon says it has programmed Alexa to be more of a feminist than before. (If you ask her about it, she’ll explain what the word “feminism” means.) But even the app’s disengage mode leaves users to ponder why Alexa won’t continue a certain conversation; the app won’t tell users why their behavior is problematic. Leah Fessler at Quartz proposed a more in-your-face solution:

In an ideal world, such disengagement would help condition the user to understand that sexual harassment is unacceptable and disrespectful. In response to “you’re a slut,” Alexa or Siri would say something like, “That sounds like sexual harassment. Sexual harassment is not acceptable under any circumstances, and is often rooted in sexism.” She could then provide the customer with resources to help them more deeply understand sexual harassment, how to curb it, and how to respectfully ask for consent.

Goldstaub believes we should go further and not humanize digital assistants at all.

“I feel that as AI becomes more prevalent in everyday life, we need to be very clear about what is human intelligence and what is artificial,” she said. “My concern is that if machines resemble humans too much, we might forget these systems don’t have human abilities of empathy or reasoning. And I think that can become quite dangerous.”

For more content and to be part of the “This New World” community, join our Facebook Group.

HuffPost’s “This New World” series is funded by Partners for a New Economy and the Kendeda Fund. All content is editorially independent, with no influence or input from the foundations. If you have a tip or idea for the series, send an email to thisnewworld@huffpost.com.
