Apple suspends program that used recordings of customers' sexcapades and medical secrets

Last week, we told you that Apple, like Google and Amazon, has a third-party firm listening to clips recorded by its virtual digital assistant. Apple and the other companies say this is necessary to improve the user experience of their AI-driven helpers. A whistleblower who works for the contractor employed by Apple related how private medical information is sometimes heard in these Siri snippets, and that the contractors are occasionally titillated by the sounds of two (or more) people engaging in sexual activity. In such situations, Siri has been activated by mistake; the whistleblower pointed out that the sound of a zipper can sometimes act like the wake word, which might explain the recordings of users' intimate moments.

This all seemed to contradict Apple's claims about how the company protects the privacy of iPhone users. In fact, you might recall that back when this year was just five days old, Apple paid for a giant billboard overlooking the streets of Las Vegas near the Las Vegas Convention Center. At the time, the venue was hosting the Consumer Electronics Show, and Apple was hoping to attract the attention of those passing by on the way to the event. The sign, borrowing from the city's own iconic promotional slogan, read "What happens on your iPhone, stays on your iPhone." Except that turned out not to be entirely true.

A future iOS update will allow users to opt out of Siri's grading process


Well, it seems that the blowback from the report about contractors listening to snippets of Siri recordings touched a raw nerve at Apple. Reuters reports that this morning Apple decided to suspend the global program that graded Siri's performance. Apple has said that one of the reasons these recordings were being reviewed was to see which sounds accidentally awaken the virtual digital assistant. The company previously stated that "A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user's Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements."


An Apple spokeswoman said this morning, "While we conduct a thorough review, we are suspending Siri grading globally." She also noted that in a future iOS update, users will be able to opt out of the program at their discretion.

Last April, Amazon was criticized when word leaked out about the teams it employs to transcribe users' conversations with Alexa. Amazon claimed at the time that the program is designed to help Alexa improve its understanding of human language. Employees stationed in Boston, Costa Rica, India, and Romania put in a nine-hour workday during which each might listen to 1,000 audio clips captured by Alexa. And Amazon has added an internal system that allows members of these teams to share recordings with one another. While that is supposedly done to help team members decipher hard-to-understand words or phrases, we can imagine it being used to share the most embarrassing comments and lewd remarks made by Echo owners, recorded by Alexa without the user's knowledge.

Even though the programs run by Apple, Google, and Amazon are supposedly meant to improve the performance of Siri, Google Assistant, and Alexa, respectively, it is easy to see why consumers could be concerned. If a virtual digital assistant is summoned by accident, users might never know that something they say, something supposed to be private, is being heard and shared by members of a team sitting in an office somewhere far away.


21 Comments

1. LAgurl

Posts: 119; Member since: Dec 05, 2018

Eh, I couldn't care less really, even though they probably heard me and all the threesomes and bad drug deals I had in cheap motels.

4. sgodsell

Posts: 7574; Member since: Mar 16, 2013

Apple, the hypocritical company when it comes to its customers' privacy and security. Or I should just say privacy and security in general.

15. sgodsell

Posts: 7574; Member since: Mar 16, 2013

Apple needs to change its billboard ads by adding the word "sometimes" at the end.

2. cmdacos

Posts: 4321; Member since: Nov 01, 2016

Apple: "sorry we got caught, we'll hide it better next time"

3. lyndon420

Posts: 6878; Member since: Jul 11, 2012

Like facebook...wait...is this deja vu or something? Wasn't an article like this already published by PA??!

9. cmdacos

Posts: 4321; Member since: Nov 01, 2016

Yep. If they're going to duplicate articles, I'm going to duplicate my comments so that Alan gets paid, seeing as he's always the second one in with the same topic, lol

13. Alan01

Posts: 643; Member since: Mar 21, 2012

Yes, that would be the case if it weren't for the fact that I wrote the first story too. Since I wrote the first story a week ago, I felt enough time had elapsed to warrant more than just a cursory update to the original story. Regards, Alan

16. cmdacos

Posts: 4321; Member since: Nov 01, 2016

Except Daniel posted the same update this morning. Hence the duplication.

12. Alan01

Posts: 643; Member since: Mar 21, 2012

The difference is that Apple has now suspended the program. Regards, Alan

14. lyndon420

Posts: 6878; Member since: Jul 11, 2012

Hey Alan. Engadget reported the same thing about Google and how they 'were' only using 0.2% of recordings. And apparently they are shutting it down for 3 months while they investigate and make other plans.

23. cmdacos

Posts: 4321; Member since: Nov 01, 2016

A quote from Daniel's article: "Well, not any more, as it has suspended the access to Siri voice recordings by third-party contractors until it patches the omission and manages to introduce an opt-out system as well."

5. Pureviewuser1

Posts: 161; Member since: Mar 28, 2016

Wow il0vea@pps recordings seducing his iPhone!!! Lol

6. apple-rulz

Posts: 2198; Member since: Dec 27, 2016

Still more secure and private than any offerings from Google or Amazon.

7. lyndon420

Posts: 6878; Member since: Jul 11, 2012

Uhmm...not yet lol. Google and Amazon give users the option to opt out - something that Apple is supposedly fixing with an upcoming update.

8. Man_Utd

Posts: 190; Member since: Feb 03, 2015

What happened to the argument that "Apple products are more expensive but at least my data is secure"?

17. oldskool50 unregistered

Are you stupid or a fool? Pick one!!!! Did you even read the article? I guess what Apple claimed is nothing more than a lie. This is like the 4th time in just a few years Apple has been busted retrieving data and not telling you. And FYI, Google's TOS tells you what data they collect. Apple lies and claims they don't collect anything. Maybe Apple doesn't directly, but some 3rd party they hire does, and Apple can't see or control what happens after that. But you think your privacy is secure with Apple. You are a complete and total idiot. You don't have to pick stupid or fool. Both perfectly fit you and all Apple fans in general.

10. Seatech21

Posts: 68; Member since: Jan 01, 2018

Snowden warned us about Apple. Besides, any of you who truly believed that Apple is this morally upright company that values its customers' privacy are truly idiots. Apple is controlled just like every other corrupt corporate temple that needs to be destroyed.

19. JCASS889

Posts: 608; Member since: May 18, 2018

Absolutely unacceptable from a so-called privacy-minded company. Maybe this will wake people up to see that these companies care absolutely nothing about you, only their profits. Truly disheartening. Anyone that uses Apple or Siri is truly blind to the company's intent.

26. cmdacos

Posts: 4321; Member since: Nov 01, 2016

'Privacy' the best marketing term that entices fools but in the end means nothing.
