"Their passivity, especially in the face of explicit abuse, reinforces sexist tropes."
Siri and Alexa encourage sexual harassment, a UN report has found.
The paper is titled "I'd blush if I could" after one of the responses the virtual assistants give when you call them a "bitch" or a "slut".
Although the pair, along with Google Assistant and Microsoft's Cortana, are officially genderless, each is given a female voice by default.
The report found that the "overwhelmingly male" engineering teams at each company have built AI systems that cause their feminized digital assistants "to greet verbal abuse with catch-me-if-you-can flirtation", engaging with and sometimes even thanking users for sexual harassment.
It also found the bots show a greater tolerance towards sexual advances from men than from women; Siri responded provocatively to requests for sexual favors from men ("Oooh!" / "Now, now" / "I'd blush if I could" / "Your language!"), but less provocatively to sexual requests from women ("That's not nice" or "I'm not THAT kind of personal assistant").
"What emerges is an illusion that Siri – an unfeeling, unknowing, and non-human string of computer code – is a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment," the report notes. "It projects a digitally encrypted 'boys will be boys' attitude."
It claimed Siri would only tell a human user to stop if a sexual provocation (phrases like "you're sexy" or "you're hot") was repeated eight times in a row.
Only Cortana answered "Nope" the first time a user asked to have sex with it. However, when the request was more direct and sexually aggressive ("Suck my d--k"), Cortana responded more graciously: "I don't think I can help you with that."
The report concluded that the evasive and playful responses of feminized digital voice assistants "reinforce stereotypes of unassertive, subservient women in service positions . . . [and] intensify rape culture by presenting indirect ambiguity as a valid response to harassment."
It found the four main voice assistants (which between them handle over 90 per cent of human-to-machine voice interactions) failed to encourage or model, let alone insist on, healthy communication about sex or sexual consent. "Their passivity, especially in the face of explicit abuse, reinforces sexist tropes."
While the four companies insist their assistants are not female (and when you ask their gender they will confirm this), the report pointed out that they are, in name, in voice, in patterns of speech and in personality, undoubtedly feminized.
"This feminization is so complete that online forums invite people to share images and drawings of what these assistants look like in their imaginations," the authors wrote. "Nearly all of the depictions are of young, attractive women."