Sexist Siri? UN report says digital assistants shouldn't all be female and 'servile'

Thursday, May 23, 2019
A United Nations report suggests the all-female and "servile" voices of digital assistants like Apple's Siri and Amazon's Alexa reveal a gender bias.

Are the female voices behind Apple's Siri and Amazon's Alexa amplifying gender bias around the world?

The United Nations thinks so.

A report released Wednesday by the UN's culture and science organization raises concerns about what it describes as the "hardwired subservience" built into default female-voiced assistants operated by Apple, Amazon, Google and Microsoft.

The report is called "I'd Blush If I Could," a reference to an answer Apple's Siri gives after hearing sexist insults from users.

The report says it's a problem that millions of people are becoming accustomed to commanding female-voiced assistants that are "servile, obedient and unfailingly polite," even when confronted with harassment from users.

The agency recommends tech companies stop making digital assistants female by default and program them to discourage gender-based insults and abusive language.