Apple suspends program that used recordings of customers' sexcapades and medical secrets

Last week, we told you that Apple, like Google and Amazon, has a third-party firm listen to clips recorded by its virtual digital assistant. Apple and the other companies say this is necessary to improve the user experience of their AI-driven helpers. A whistleblower who works for the contractor employed by Apple related how private medical information is sometimes heard on these Siri snippets, and how the contractors are occasionally titillated by the sounds of two (or more) people engaging in sexual activity. In those cases, Siri had been activated by mistake; the whistleblower noted that the sound of a zipper can act like the wake word, which might explain the recordings of users' intimate moments.
A future iOS update will allow users to opt out of Siri's grading process
Well, it seems that the blowback from the report about contractors listening to snippets of Siri recordings touched a raw nerve at Apple. Reuters reports that this morning, Apple decided to suspend the global program that grades Siri's performance. Apple has said that one reason these recordings were being reviewed was to learn which sounds accidentally wake the virtual digital assistant. The company previously stated that "A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements."

Siri, seen here when it debuted on the iPhone 4s in 2011; Apple has suspended the program it runs to grade the assistant's performance
An Apple spokeswoman said this morning, "While we conduct a thorough review, we are suspending Siri grading globally." She also noted that a future iOS update will let users opt out of the program at their discretion.
Last April, Amazon was criticized when word leaked out about the teams it employs to transcribe users' conversations with Alexa. Amazon claimed at the time that the program is designed to help Alexa improve its understanding of human language. Employees stationed in Boston, Costa Rica, India, and Romania each put in a nine-hour workday, during which a single worker might listen to as many as 1,000 audio clips taken from Alexa. Amazon has also added an internal system that allows members of these teams to share recordings with one another. While that is supposedly done to help team members decipher hard-to-understand words or phrases, we can imagine it being used to share the most embarrassing comments and lewd remarks made by Echo owners, recorded by Alexa without the user's knowledge.