Apple is one of the biggest tech companies of our generation. After all, it offers some of the best phones, laptops, tablets, and more. On top of that, the company takes particular pride in its super-secure ecosystem, which promises to keep your privacy safe. So it's more than a little ironic that a company built on privacy actually listens to private customer interactions. You read that right: Apple apparently hires real people to listen to Siri voice recordings.
Apple Listens to Siri Recordings?
The Guardian reported that Apple really does hire people to listen to Siri recordings. Apple claims this isn't for any malicious reason; rather, it's for "improving the accuracy and quality of the voice assistant".
The hired contractors come across all kinds of recordings: private medical appointments, people having sex, even drug deals. It is apparently their job to screen these recordings as part of the quality control process, or "grading", as Apple calls it.
The main reason you've probably never heard of this before is that Apple doesn't explicitly disclose the practice to the general public. After all, what kind of tech company would it be if it couldn't promise complete Siri privacy?
How Does Apple Monitor Siri Voice Recordings?
If you're an Apple user, you're probably curious about exactly what happens to your recordings.
First, a small share of recordings is distributed to contractors the company hires from all over the world. Their job is to grade Siri's responses: whether the voice activation was intentional or accidental, whether the request was something Siri could help with, and whether Siri answered the question correctly.
All we can get from Apple is that the data is used to “help Siri and dictation … understand you better and recognize what you say”.
In a recent statement, Apple said: "A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user's Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements." The company defended the practice further, saying that less than 1% of daily Siri activations are used for grading, and that the recordings involved are only a few seconds long, which, it argues, limits how much of a user's privacy is exposed.
An anonymous worker told the Guardian that “there have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
However, Apple sidesteps these concerns, insisting that Siri data "is not linked to other data that Apple may have from your use of other Apple services".
The contractor who revealed that Apple listens to Siri recordings said they came forward because they feared that information like this could be misused.