
Alexa, Siri, and Google Assistant promote sexist attitudes towards women, says UN


A UNESCO report criticizes the default use of female-sounding voice assistants, saying it reinforces gender stereotypes.
A report by UNESCO has suggested that the default use of female-sounding voice assistants in our smart home gadgets and smartphones perpetuates sexist attitudes towards women.
The report, titled I’d Blush if I Could, takes its name from Siri’s former default response to being called a bitch by users – and criticizes the fact that Apple’s Siri, Amazon’s Alexa, Google Assistant, and Microsoft’s Cortana are “exclusively female or female by default, both in name and in sound of voice”.
Why is this a problem? Well, according to the report, the default use of female-sounding voice assistants sends a signal to users that women are “obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’”.
The report also highlights the fact that these voice assistants have “no power of agency beyond what the commander asks of it” and respond to queries “regardless of [the user’s] tone or hostility”.
According to the report, this has the effect of reinforcing “commonly held gender biases that women are subservient and tolerant of poor treatment”.
This subservience is particularly worrying when these female-sounding voice assistants give “deflecting, lackluster or apologetic responses to verbal sexual harassment”.
