
Apple’s ‘Siri’ is listening to some of your private recordings. Here’s why you should be worried

Apple Inc. has faced a string of controversies over user privacy in the recent past, and a new report has come as a shock to many: Apple has admitted that its virtual assistant, Siri, listens to some of our private recordings. Yes, you read that right. A majority of Apple fans stick to the company’s products for the security it promises, so this controversy could damage Apple’s reputation on user privacy, especially among iPhone owners.

For those unaware, a report by The Guardian recently highlighted that Apple has been hiring contractors to listen to accidental recordings made by Siri as part of a process the company calls “grading”. The report adds that, as part of this quality-control work for the tech giant, Apple contractors hear confidential medical information, recordings of couples having sex, drug deals, and more.


While many might assume that iPhone users would be the biggest source of material for Apple’s “grading” process, since Siri is a built-in voice assistant on most Apple products, The Guardian’s report highlights that users of the Apple Watch and the Apple HomePod smart speaker have been the biggest sources of such accidental recordings. “The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on,” a whistleblower was quoted as saying by The Guardian.

According to the whistleblower, who reportedly works for one of the contractors Apple has hired, Siri can be activated accidentally when it hears its wake-up phrase, “Hey Siri”, or something that merely sounds like it. The whistleblower adds that Apple’s virtual assistant often mistakes the sound of a zip for its trigger, and Siri can be set off accidentally in many other ways as well. For example, if an Apple Watch detects that it has been raised and then hears speech, Siri is automatically activated on the smartwatch.

It should also be noted that Apple’s privacy explainer site does mention that Siri data is sent to “Apple servers” in order to improve the quality of the voice assistant; however, it doesn’t explicitly state that humans process that information, nor does it mention that third-party contractors have access to it.

Before delving further into what else the whistleblower has disclosed, let’s look at how Apple’s grading system for Siri works. Contractors are tasked with grading Siri’s responses on Apple products against the following criteria:

  • Whether Siri’s activation was deliberate or accidental.
  • Whether the user’s query was something Siri should be able to answer.
  • Whether the response the user received was appropriate.

The whistleblower has also highlighted several instances of accidental recordings captured by Siri that featured private discussions between doctors and patients, as well as business deals, sexual encounters, and alleged criminal dealings, to name a few. “These recordings are accompanied by user data showing location, contact details, and app data,” the whistleblower was quoted as saying by The Guardian. He further reported that the contracting firms set targets for their employees, who are expected to complete them as quickly as possible. Moreover, employees are encouraged to treat accidental recordings as “technical problems”, yet there is reportedly no procedure in place for dealing with the sensitive information that gets gathered.

Here is Apple’s response to this new controversy regarding user privacy:

“A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

The Guardian also quoted Apple as saying that only a small random subset, fewer than 1 percent of daily Siri activations, both accidental and intentional, is used for grading. Moreover, the Siri recordings used for grading are typically only a few seconds long.

All said and done, one question remains unanswered: in an era when cybersecurity has become such a burning issue, shouldn’t tech giants such as Apple disclose these kinds of procedures on their own if there is nothing to hide? And how far will users be tested before they start pushing back whenever their privacy is threatened?

Share your thoughts in the comment section below.
