Siri voice recordings: Apple fires ‘hundreds’ of contractors over eavesdropping
Third-party workers were sent audio files containing ‘confidential information’
Apple has fired “hundreds” of third-party employees after discovering they were eavesdropping on audio recordings of Siri users.
The sackings come after an internal review revealed that contractors were listening in on conversations that contained “confidential information”, The Guardian reports.
According to “multiple” former contractors, they were regularly sent “accidental activations” – where a user inadvertently wakes their Siri-enabled device – by Apple for evaluation, the newspaper says.
More than 300 employees from a third-party facility in Cork, Ireland, had their contracts ended early, the paper notes, with more workers across Europe being sent home as well. Staff were given one week’s notice on 2 August, the day Apple elected to suspend voice-recording reviews.
“As a result of our review, we realise we have not been fully living up to our high ideals, and for that we apologise,” an Apple spokesperson said.
Why were contractors able to access recordings?
Apple, along with a host of other major tech firms, employs moderators to assess the content of user voice recordings, a practice commonly referred to as “grading”, The Independent says.
According to TechRadar, third-party workers contracted by Apple were instructed to review the audio recordings of Siri users to evaluate the voice assistant’s accuracy.
The recordings evaluated by contractors were those caused by users inadvertently activating their Siri-powered device. As a result, third-party employees were exposed to private conversations, drug deals and people having sex.
The programme was suspended earlier in August following an internal review, the BBC says. Around 0.2% of all Siri audio recordings were evaluated by human moderators.
While Apple apologised for the incident, it said it intends to resume the programme “later this fall when software updates are released to our users”.
Can you stop Siri from listening?
Yes. Apple’s aforementioned software update will include a setting that lets users opt out of sharing their audio recordings.
Another option is to deactivate Siri altogether. On an iPhone, open the Settings app and go to the “Siri &amp; Search” section, then toggle the “Listen for ‘Hey Siri’” switch to the off position.
For those still keen on using Siri, Apple says it will no longer keep audio recordings by default, the Guardian reports. It will, however, retain “automatically generated transcripts” of Siri commands.
The company has also assured its users that only official Apple employees will be allowed to listen to audio recordings, while conversations logged through inadvertent activations will be deleted.