[Photo: A woman tries to use the 'Siri' voice-activated assistant software built into the Apple iPhone 4S, March 13, 2012, in Washington, DC. AFP/Getty Images/Karen Bleier]

Do popular voice assistants like Apple’s Siri, Amazon’s Alexa and Microsoft’s Cortana reflect a sexist culture? An international body believes they do.

The United Nations has released a new report that questions why today’s voice assistants are female by default and points out the negative effects this has on women and on culture at large.

The report, titled “I’d Blush If I Could” after a now-removed response Siri gave when addressed with a sexist insult meant to demean a woman, argues that today’s voice assistants, whether Siri, Cortana, Alexa or the unnamed but obviously female Google Assistant, reveal how men view women.

Gender Bias

The report pointed out that these assistants were deliberately given female voices and “personalities” by male developers. While most of these companies say they chose female voices simply because women are generally perceived as “nurturing” and “caring,” other factors suggest gender bias.

First, although the tech companies say customers prefer female assistants, that preference is also profitable: a voice that attracts more consumers earns more revenue, so the choice serves the companies’ commercial interests as much as user demand.

Second, the report pointed out a few instances in which female-voiced assistants, not necessarily on smartphones, were replaced with male-voiced ones. In those cases, the female voice was issuing commands or instructions rather than taking orders or offering help.

In short, they were replaced because they were exercising “authority,” not giving “assistance.”

Third, these female-voiced assistants seem programmed to show subservience regardless of how the owner of the device talks to them. They respond to wake words like “hey” whether spoken gently or harshly, and they were designed to remain “humble” and “helpful” no matter what is said to them.

Attack on Women

These factors, along with many others, including users commanding a voice assistant to speak in a sexually suggestive way, flirting with an assistant, and naming assistants after women, one of whom the U.N. describes as a “synthetic… sensuous unclothed woman,” all point to female voice assistants being used in ways that amount to an affront to women.

The U.N.’s study found that the assistants’ passive responses to queries of a sexual nature, “especially in the face of explicit abuse, reinforces sexist tropes.” These responses, the U.N. said, can “intensify rape culture by presenting indirect ambiguity as a valid response to harassment.”

The U.N. report discusses the matter in greater detail; interested readers can consult the full document.

[Photo: Siri was introduced on the iPhone 4S, but the product’s core functionality has not evolved much beyond its original features. Oli Scarff/Getty Images]