How smart speaker AIs such as Alexa and Siri reinforce gender bias
Unesco urges tech firms to offer gender-neutral versions of their voice assistants
Smart speakers powered by artificial intelligence (AI) voice assistants that sound female are reinforcing gender bias, according to a new UN report.
Research by Unesco (United Nations Educational, Scientific and Cultural Organisation) found that AI assistants such as Amazon’s Alexa and Apple’s Siri perpetuate the idea that women should be “subservient and tolerant of poor treatment”, because the systems are “obliging and eager to please”, The Daily Telegraph reports.
The report - called “I’d blush if I could”, in reference to a phrase uttered by Siri following a sexual comment - says tech companies that make their voice assistants female by default are suggesting that women are “docile helpers” who can be “available at the touch of a button”, the newspaper adds.
The agency also accuses tech companies of failing to “build in proper safeguards against hostile, abusive and gendered language”, reports The Verge.
Instead, most AIs respond to aggressive comments with a “sly joke”, the tech news site notes. If asked to make a sandwich, for example, Siri says: “I can’t. I don’t have any condiments.”
“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” says the Unesco report.
What has other research found?
The Unesco report cites a host of studies, including research by US-based tech firm Robin Labs that suggests at least 5% of interactions with voice assistants are “unambiguously sexually explicit”.
And the company, which develops digital assistants, believes the figure is likely to be “much higher due to difficulties detecting sexually suggestive speech”, The Guardian reports.
The UN agency also points to a study by research firm Gartner, which predicted that by 2020 some people would be having more conversations with the voice assistant in their smart speaker than with their spouses.
Voice assistants already manage an estimated one billion tasks per month, ranging from playing songs to contacting the emergency services.
Although some systems allow users to change the gender of their voice assistant, the majority activate “obviously female voices” by default, the BBC reports.
The Unesco report concludes that this apparent gender bias “warrants urgent attention”.
How could tech companies tackle the issue?
Unesco argues that firms should be required to make their voice assistants “announce” that they are not human when they interact with people, reports The Sunday Times.
The agency also suggests that users should be given the opportunity to select the gender of their voice assistant when they get a new device and that a gender-neutral option should be available, the newspaper adds.
In addition, tech firms should program voice assistants to condemn verbal abuse or sexual harassment with replies such as “no” or “that is not appropriate”, Unesco says.
Tech companies have yet to respond to the study.