How smart speaker AIs such as Alexa and Siri reinforce gender bias
Unesco urges tech firms to offer gender-neutral versions of their voice assistants

Smart speakers powered by artificial intelligence (AI) voice assistants that sound female are reinforcing gender bias, according to a new UN report.
Research by Unesco (United Nations Educational, Scientific and Cultural Organisation) found that AI assistants such as Amazon’s Alexa and Apple’s Siri perpetuate the idea that women should be “subservient and tolerant of poor treatment”, because the systems are “obliging and eager to please”, The Daily Telegraph reports.
The report - called “I’d blush if I could”, in reference to a phrase uttered by Siri following a sexual comment - says tech companies that make their voice assistants female by default are suggesting that women are “docile helpers” who can be “available at the touch of a button”, the newspaper adds.
The agency also accuses tech companies of failing to “build in proper safeguards against hostile, abusive and gendered language”, reports The Verge.
Instead, most AIs respond to aggressive comments with a “sly joke”, the tech news site notes. If asked to make a sandwich, for example, Siri says: “I can’t. I don’t have any condiments.”
“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” says the Unesco report.
What has other research found?
The Unesco report cites a host of studies, including research by US-based tech firm Robin Labs that suggests at least 5% of interactions with voice assistants are “unambiguously sexually explicit”.
And the company, which develops digital assistants, believes the figure is likely to be “much higher due to difficulties detecting sexually suggestive speech”, The Guardian reports.
The UN agency also points to a study by research firm Gartner, which predicts that by 2020 people will be having more conversations with their smart speaker's voice assistant than with their spouses.
Voice assistants already manage an estimated one billion tasks per month, ranging from playing songs to contacting the emergency services.
Although some systems allow users to change the gender of their voice assistant, the majority activate “obviously female voices” by default, the BBC reports.
The Unesco report concludes that this apparent gender bias “warrants urgent attention”.
How could tech companies tackle the issue?
Unesco argues that firms should be required to make their voice assistants “announce” that they are not human when they interact with people, reports The Sunday Times.
The agency also suggests that users should be given the opportunity to select the gender of their voice assistant when they get a new device and that a gender-neutral option should be available, the newspaper adds.
In addition, tech firms should program voice assistants to condemn verbal abuse or sexual harassment with replies such as “no” or “that is not appropriate”, Unesco says.
Tech companies have yet to respond to the study.