How smart speaker AIs such as Alexa and Siri reinforce gender bias
Unesco urges tech firms to offer gender-neutral versions of their voice assistants
Smart speakers powered by artificial intelligence (AI) voice assistants that sound female are reinforcing gender bias, according to a new UN report.
Research by Unesco (United Nations Educational, Scientific and Cultural Organisation) found that AI assistants such as Amazon’s Alexa and Apple’s Siri perpetuate the idea that women should be “subservient and tolerant of poor treatment”, because the systems are “obliging and eager to please”, The Daily Telegraph reports.
The report, called “I’d blush if I could” in reference to a phrase uttered by Siri following a sexual comment, says tech companies that make their voice assistants female by default are suggesting that women are “docile helpers” who can be “available at the touch of a button”, the newspaper adds.
The agency also accuses tech companies of failing to “build in proper safeguards against hostile, abusive and gendered language”, reports The Verge.
Instead, most AIs respond to aggressive comments with a “sly joke”, the tech news site notes. If asked to make a sandwich, for example, Siri says: “I can’t. I don’t have any condiments.”
“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” says the Unesco report.
What has other research found?
The Unesco report cites a host of studies, including research by US-based tech firm Robin Labs that suggests at least 5% of interactions with voice assistants are “unambiguously sexually explicit”.
And the company, which develops digital assistants, believes the figure is likely to be “much higher due to difficulties detecting sexually suggestive speech”, The Guardian reports.
The UN agency also points to a study by research firm Gartner, which predicted that by 2020 people would be having more conversations with the voice assistants in their smart speakers than with their spouses.
Voice assistants already manage an estimated one billion tasks per month, ranging from playing songs to contacting the emergency services.
Although some systems allow users to change the gender of their voice assistant, the majority activate “obviously female voices” by default, the BBC reports.
The Unesco report concludes that this apparent gender bias “warrants urgent attention”.
How could tech companies tackle the issue?
Unesco argues that firms should be required to make their voice assistants “announce” that they are not human when they interact with people, reports The Sunday Times.
The agency also suggests that users should be given the opportunity to select the gender of their voice assistant when they get a new device and that a gender-neutral option should be available, the newspaper adds.
In addition, tech firms should program voice assistants to condemn verbal abuse or sexual harassment with replies such as “no” or “that is not appropriate”, Unesco says.
Tech companies have yet to respond to the study.