Technology

Apple’s ‘Siri’ is listening to some of your private recordings. Here’s why you should be worried

Apple Inc. has faced a number of privacy controversies in the recent past, and a recent report has come as a shock to many: Apple has just admitted that its virtual assistant, Siri, does listen to some of our private recordings. Yes, you read that right. It should be noted that a majority of Apple fans stick to its products for the promised security the tech giant boasts of; however, this new controversy might hamper Apple’s image to some extent when it comes to user privacy, especially among iPhone users.

For the unaware, a recent report by The Guardian highlights that Apple hires contractors to listen to accidental recordings made by Siri as part of a process the company calls “grading”. The report adds that, as part of their quality-control job for the tech giant, Apple contractors hear confidential medical information, recordings of couples having sex, drug deals, and more.


While many might assume that iPhone users would be the biggest target of Apple’s ‘grading’ process, since Siri is the inbuilt voice-enabled virtual assistant on most Apple products, The Guardian’s report highlights that users of the Apple Watch and the Apple HomePod smart speaker have been the biggest sources of accidental recordings. “The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on,” a whistleblower was quoted as saying by The Guardian.

According to the whistleblower, who reportedly works for one of the contractors Apple has hired, Siri can be activated accidentally when it hears the word “wake” or its wake-up phrase “hey Siri”. The whistleblower further adds that Apple’s virtual assistant often mistakes the sound of a zip for its trigger, and Siri can be set off accidentally in many other ways. For example, if an Apple Watch detects that it has been raised and then hears speech, Siri is automatically activated on the smartwatch.

It should also be noted that Apple’s privacy explainer site does mention that Siri data is sent to “Apple servers” to improve the quality of its voice-enabled virtual assistant service; however, it does not explicitly state that humans process this information, nor does it mention any third-party contractors having access to it.

Before delving further into what else the whistleblower has disclosed, let’s have a look at how Apple’s grading system for Siri works. Contractors are tasked with grading Siri’s responses on Apple products against the following categories:

  • Whether Siri’s activation was deliberate or accidental.
  • Whether the user’s query was something Siri should be able to answer.
  • Whether the response received by the user was appropriate.
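Purely for illustration, the three grading criteria above could be modeled as a simple record type. Everything here, including the names `SiriGrade` and `is_accidental`, is a hypothetical sketch; Apple’s internal grading tooling has not been made public.

```python
# Hypothetical sketch of the three-part grading rubric described above.
# All names and structure are assumptions made for illustration only.
from dataclasses import dataclass


@dataclass
class SiriGrade:
    deliberate_activation: bool  # was Siri's activation intentional?
    answerable_query: bool       # should Siri be able to answer the query?
    appropriate_response: bool   # was the response the user got suitable?

    def is_accidental(self) -> bool:
        """Flag recordings a grader would mark as accidental activations."""
        return not self.deliberate_activation


# Example: a zip sound mistaken for the wake phrase.
grade = SiriGrade(deliberate_activation=False,
                  answerable_query=False,
                  appropriate_response=False)
print(grade.is_accidental())  # True
```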

On the other hand, the whistleblower has highlighted several instances of accidental recordings that featured private discussions between doctors and patients, as well as business deals, sexual encounters and alleged criminal dealings, to name a few. “These recordings are accompanied by user data showing location, contact details, and app data,” the whistleblower was quoted as saying by The Guardian. He further reported that the contractors set targets for their employees, who are expected to complete them as fast as possible. Moreover, employees are encouraged to treat accidental recordings as “technical problems”; however, it has been claimed that there is no procedure in place to deal with the sensitive information gathered.

Here is Apple’s response on this new controversy regarding user privacy:

“A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

The Guardian also quoted Apple Inc. as saying that only a very small random subset, less than 1 percent of daily Siri activations (both accidental and intentional), is used for grading. Moreover, the Siri recordings used for this purpose are typically only a few seconds long.

All said and done, two questions remain unanswered: in days when cybersecurity has become such a burning issue, shouldn’t tech giants such as Apple disclose these kinds of procedures on their own if they have nothing to hide? And how far will users be pushed before they start to retaliate whenever their privacy is threatened?

Share your thoughts in the comment section below.
