Good news! After being exposed, Apple claims its contractors will stop listening in on your intimate moments, at least for a while.
According to The Verge, “Apple has said that it will temporarily suspend its practice of using human contractors to grade snippets of Siri voice recordings for accuracy.”
This came after a revelation from The Guardian in which an anonymous whistleblower claimed that workers “regularly hear confidential medical information, drug deals, and recordings of couples having sex” as a routine part of the job.
Apple claims that the data “is used to help Siri and dictation … understand you better and recognise what you say.”
Apple explained to The Guardian at the time, “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID.” However, small snippets, reportedly less than 1% of daily Siri usage, are recorded and used for grading.
The issue is that Siri is not exclusively activated by users saying, “Hey Siri.” The anonymous contractor said, “The sound of a zip, Siri often hears as a trigger.” The Guardian’s coverage elaborated, “The service can also be activated in other ways. For instance, if an Apple Watch detects it has been raised and then hears speech, Siri is automatically activated.”
The whistleblower explained,
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
While Siri runs on most Apple devices, most famously the iPhone, the Apple Watch and HomePod account for the majority of accidental recordings.
“You can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”
As The Guardian makes clear, while Google and Amazon offer opt-out options, Apple lacks an equivalent privacy protection beyond turning Siri off entirely. “That’s a particularly bad look,” The Verge quips, “given that Apple has built so much of its reputation on selling itself as the privacy company that defends your data in ways that Google and Amazon don’t.”