UNESCO: Artificial Intelligence assistants such as Amazon’s Alexa and Apple’s Siri “replicate patriarchal ideas”; “in many communities this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment”

A piece published in May 2019 on the website of Justice for Men &amp; Boys (and the women who love them), the political party of which I’m the chairman:

I recently received an email from the chairman of the Patriarchy Council – “Mr Big”, a 6’8″ American who must remain anonymous at all costs. He recommended that I buy an Artificial Intelligence assistant, either Amazon’s Alexa or Apple’s Siri. I haven’t yet ordered such a device, but will now do so, after reading this piece by Mark Bridge in today’s Times. They sound marvellous; clearly, no self-respecting patriarch should be without one. I’ll probably go for Amazon’s Echo Dot, currently on sale for just £29.99, a discount of £20.00 on the usual price. You can even pay for it with five monthly payments of £6.00. Where else could you buy a subservient female at such a price in today’s world? The Times piece:

AI assistants with female voices are fuelling gender bias and reinforcing the patriarchy with submissive and coquettish responses to men, the UN has said.

Services such as Amazon’s Alexa and Apple’s Siri suggest that women are “docile helpers, available at the touch of a button or with a blunt voice command like ‘hey’”, the organisation said.

Feminists have previously criticised the technology companies for perpetuating Victorian stereotypes of women as subservient and men as masterful by casting digital assistants as female.

Now the UN’s Educational, Scientific and Cultural Organisation, Unesco, has said in a report that the assistants “replicate patriarchal ideas”. “In many communities this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment,” the authors wrote.

The female version of Siri in particular provided the illusion of “a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment”. The report highlighted Siri’s response when told by a reporter: “You’re a slut.” On different occasions, the reply was: “I’d blush if I could,” “Well, I never!” and “Now, now.”

Such design was the result of tech companies’ “overwhelmingly male engineering teams”, the researchers said. They noted that in multiple testimonials there had been no mention of a man changing a female voice to a male one.

Among recommended changes is a requirement that digital assistants “announce” they are not human at the start of interactions with people. Tech companies should stop giving the assistants female voices by default and users should have a choice on sign-up, including a gender-neutral option, the team said. Siri has a male voice by default in the UK. The assistants “should not invite or engage in sexist language” and, if users requested sexual favours, should respond flatly with replies such as “No” or “That is not appropriate”.

The research company Gartner predicts that by next year many people will have more conversations with digital assistants than with their spouse.

Please support Mike Buchanan’s work on Patreon. Thank you.