Apple users' most intimate and confidential moments are relayed by Siri to contractors
According to a report published by The Guardian, Apple's digital assistant Siri sends personal recordings to contractors. Similar to what Amazon does with its virtual assistant Alexa to improve its understanding of human language, these contractors grade Siri's performance. But some of the recordings captured by Siri and sent out for grading included couples having sex, and others contained personal and confidential conversations about medical information. Besides Apple and Amazon, Google does something similar with Google Assistant. But while Amazon and Google allow their users to opt out of some uses of the recordings they generate, Apple doesn't.


Apple says that less than 1% of daily Siri activations are passed along to the contractors, who try to determine whether the assistant was activated on purpose or by accident. Siri is also graded on whether it was able to respond to the user's request or query, and whether that response was appropriate. Apple says that each snippet of audio graded by these third-party firms runs for only a few seconds. It notes that "A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user's Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements."

A whistleblower says that most accidental Siri activations take place on the Apple Watch and the HomePod


The Guardian's report cites a whistleblower working for one of the contractors who says that in too many cases, users accidentally activate Siri, allowing the contractors to hear sensitive personal information or activities. Besides Siri mistaking certain words for the "Hey Siri" phrase that awakens the virtual helper, the whistleblower notes that sometimes the sound of a zipper will activate the assistant. He also says that most accidental activations occur on the Apple Watch and the HomePod smart speaker. When the smartwatch is raised and hears speech, it automatically activates Siri. With 35% of the global smartwatch market, the Apple Watch is on a lot of wrists and is generating plenty of Siri recordings for the contractors to go through.


The whistleblower reveals that those grading Siri have quotas to meet, so the goal is to go through the recordings as fast as possible. He also says that it would be possible to identify the people whose voices can be heard in the Siri samples. "There's not much vetting of who works there, and the amount of data that we're free to look through seems quite broad. It wouldn't be difficult to identify the person that you're listening to, especially with accidental triggers – addresses, names and so on," he warns.

Apple values its reputation for customer privacy, and while it can point to the claim that these recordings improve the Siri experience, for some of its customers that improvement might come at too high a cost.
