Apple Inc. has faced a string of controversies over user privacy in the recent past, and a new report has come as a shock to many: Apple has admitted that its virtual assistant, Siri, does listen in on some of our private recordings. Yes, you read that right. It should be noted that a majority of Apple fans stick with its products for the security the tech giant promises, so this new controversy may tarnish Apple’s image on user privacy, especially among iPhone users.
For the uninitiated, a recent report by The Guardian highlighted that Apple hires contractors to listen to accidental recordings captured by Siri as part of a process the company calls “grading”. The report adds that, as part of this quality-control work for the tech giant, Apple’s contractors hear confidential medical information, recordings of couples having sex, drug deals, and more.
According to the whistleblower, who reportedly works for one of the contractors Apple has hired, Siri can be activated accidentally after hearing the word “wake” or its wake-up phrase, “hey Siri”. The whistleblower further adds that Apple’s virtual assistant often mistakes the sound of a zip for its trigger, and Siri can be activated accidentally in many other ways as well. For example, if an Apple Watch detects that it has been raised and then hears speech, Siri is automatically activated on the smartwatch.
It should also be noted that Apple’s privacy explainer site does mention that Siri data is sent to “Apple servers” to improve the quality of its voice-enabled virtual assistant; however, it doesn’t explicitly state that humans process the information, nor does it mention any third-party contractors having access to it.
Before delving further into what else the whistleblower has disclosed and why, let’s look at how Apple’s grading system for Siri works. Contractors are tasked with grading Siri’s responses on Apple products in the following categories:
- Whether Siri’s activation was deliberate or accidental.
- Whether the user’s query was something Siri should be able to answer.
- Whether the response the user received was appropriate.
On the other hand, the whistleblower has highlighted several instances of accidental recordings captured by Siri that featured private discussions between doctors and patients, as well as business deals, sexual encounters, and alleged criminal dealings, to name a few. “These recordings are accompanied by user data showing location, contact details, and app data,” the whistleblower told The Guardian. He further reported that the contractors set targets for their employees, who are expected to complete them as fast as possible. Moreover, employees are encouraged to treat accidental recordings as “technical problems”; however, it has been claimed that there is no procedure in place for dealing with the sensitive information gathered.
Here is Apple’s response to this new controversy regarding user privacy:
“A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
The Guardian also quoted Apple as saying that only a small random subset of daily Siri activations, fewer than 1 percent, both accidental and intentional, is used for grading. Moreover, the Siri recordings used here are typically only a few seconds long.
All said and done, one question remains unanswered: in days when cybersecurity has become such a burning issue, shouldn’t tech giants such as Apple disclose these kinds of procedures on their own if there is nothing to hide? And to what extent will users be tested before they start to retaliate whenever there is a threat to their privacy?
Share your thoughts in the comment section below.