Siri voice recordings: Apple fires ‘hundreds’ of contractors over eavesdropping
Third-party workers were sent audio files containing ‘confidential information’
Apple has fired “hundreds” of third-party employees after discovering they were eavesdropping on audio recordings of Siri users.
The sackings come after an internal review revealed that contractors were listening in on conversations that contained “confidential information”, The Guardian reports.
According to “multiple” former contractors, Apple regularly sent them recordings of “accidental activations” – instances where a user inadvertently wakes their Siri-enabled device – for evaluation, the newspaper says.
The Week
More than 300 employees from a third-party facility in Cork, Ireland, had their contracts ended early, the paper notes, with more workers across Europe being sent home as well. Staff were given one week’s notice on 2 August, the day Apple elected to suspend voice-recording reviews.
“As a result of our review, we realise we have not been fully living up to our high ideals, and for that we apologise,” an Apple spokesperson said.
Why were contractors able to access recordings?
Apple, along with a host of other major tech firms, employs moderators to assess the content of user voice recordings, a practice commonly referred to as “grading”, The Independent says.
According to TechRadar, third-party workers contracted by Apple were instructed to review the audio recordings of Siri users to evaluate the voice assistant’s accuracy.
The recordings evaluated by contractors were those caused by users inadvertently activating their Siri-powered device. As a result, third-party employees were exposed to private conversations, drug deals and people having sex.
The programme was suspended earlier in August following an internal review, the BBC says. Around 0.2% of all Siri audio recordings were evaluated by human moderators.
While Apple apologised for the incident, it said it intends to resume the programme “later this fall when software updates are released to our users”.
Can you stop Siri from listening?
Yes. Apple’s forthcoming software update will include a setting that lets users opt out of sharing their audio recordings.
Another option is to deactivate Siri altogether. On an iPhone, open the Settings app, select “Siri &amp; Search”, and switch the “Listen for ‘Hey Siri’” toggle to the off position.
For those still keen on using Siri, Apple says it will no longer keep audio recordings by default, the Guardian reports. It will, however, retain “automatically generated transcripts” of Siri commands.
The company has also assured its users that only official Apple employees will be allowed to listen to audio recordings, while conversations logged through inadvertent activations will be deleted.

