Not a question you often hear asked. That changed when a UNESCO report in May 2019 argued that the perception of virtual assistants like Siri and Alexa as female promotes sexist ideals. The report, titled “I’d blush if I could: closing gender divides in digital skills through education”, opened up a debate about how we perceive not only Siri and Alexa, but AI in general. There is a clear tendency for AI to be given female voices, even though AI has no gender at all; attaching a female voice leads us to behave as if the AI were female. Is this linked to a lack of gender diversity in the sector? Most likely.
Not only do these assistants have female voices; their names also suggest a female gender: Siri, Alexa, Cortana. The report goes so far as to say that they exhibit “female personalities” and behave differently when addressed by men than by women.
“Siri responded provocatively to requests for sexual favours by men (‘Oooh!’; ‘Now, now’; ‘I’d blush if I could’; or ‘Your language!’), but less provocatively to sexual requests from women (‘That’s not nice’ or ‘I’m not THAT kind of personal assistant’),”
The report’s authors say that this reinforces stereotypes of women as “servile” beings who exist only to do what they are told. The assistants respond to insults politely, which the UNESCO report states reinforces gender bias and normalises sexual harassment.
The report contains the first official UN recommendation regarding AI and personal assistants, urging relevant bodies to explore the possibility of making the voices neither male nor female.