Apple's child-abuse-scanning tech: open letter opposes the policy, Apple officially addresses concerns

A few days ago, Apple announced a new system, coming to iPhones, iPads, and Mac computers with this fall's OS updates, that will scan iCloud photos for child abuse material. The system immediately sparked controversy, since having software scan users' photos raises obvious privacy questions. Now, CNBC reports that Apple has addressed some of the concerns about the new system.

Cryptographers sign an open letter asking Apple to halt the new policy due to privacy concerns


The main concern raised by many cryptographers and privacy experts relates to exactly this: the possibility that the new scanning system could be exploited to surveil users' photos.

The cryptography community's concerns were laid out in an open letter, The Hill reports, signed by almost three dozen organizations and over 6,600 individuals. The letter states that even though the system is meant to combat child abuse, it could open a "backdoor" for other types of surveillance.

The letter argues that the proposal, and the backdoor it could introduce, would undermine fundamental privacy protections for all users of Apple products. One issue is that it bypasses end-to-end encryption, compromising users' privacy.

The open letter's signatories ask that Cupertino halt its new policy and issue a statement reaffirming its commitment to end-to-end encryption and user privacy. Apple, after all, has built a reputation as a company that protects privacy at almost any cost.

Apple's answer to the raised concerns


CNBC reports that Apple has addressed the concerns about the new child sexual abuse material (CSAM) scanning system. Cupertino underlined that the new technology is more private than the systems used by Google and Microsoft, which have also been removing illegal child abuse images from their servers. The company also stated that governments cannot force it to add non-CSAM images to the system.

The scanning technology itself will be delivered to users' iPhones through an iOS update.

Additionally, some raised concerns about whether the new system could be extended to more countries and used by governments to scan people's iPhones for other content, such as political material. The worry is that some countries' laws could permit such scanning, turning the technology into a tool for surveillance and control.

However, Apple has responded to these concerns, stating that it will not allow governments to force it to add non-CSAM images to the hashing system. The hash database contains fingerprints of known child abuse images, which the system compares against the hashes of users' photos to determine whether they are CSAM.
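Apple's actual matching technology (which it calls NeuralHash) is proprietary and paired with cryptographic matching on the server, so it can't be reproduced here. Purely as an illustration of the general idea of comparing image fingerprints against a database of known hashes, here is a minimal Python sketch using a simple "average hash"; the function names and hash values are hypothetical:

# Illustrative sketch only. Apple's real system uses its proprietary
# NeuralHash algorithm plus cryptographic matching; this toy "average
# hash" merely demonstrates the general idea of comparing image
# fingerprints against a database of known hashes.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    # Shrink to a size x size grayscale image, then set one bit per
    # pixel: 1 if the pixel is brighter than the mean, 0 otherwise.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

# Hypothetical database of fingerprints of known abusive images
# (in reality, hashes supplied by child safety organizations).
KNOWN_HASHES = {0x8F3C00FF00FF00FF, 0x0123456789ABCDEF}

def is_flagged(path: str) -> bool:
    return average_hash(path) in KNOWN_HASHES

The reason systems like this use a perceptual hash rather than a cryptographic one (such as SHA-256) is that a perceptual hash tolerates small edits like resizing or recompression, which would completely change an exact hash.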

"Apple will refuse any such demands", stated the company, adding that the CSAM detection feature will be built solety to detect known child abuse images stored in iCloud Photos. These known CSAM images have been identified by experts at child safety groups.

Apple additionally stated that it has faced demands in the past to build and deploy government-mandated changes to iPhones that would undermine users' privacy, that it has always refused such demands, and that it plans to keep refusing them. Apple underlined that the technology is limited to CSAM images in iCloud Photos.

Another worry of cryptographers stems from an earlier statement by Apple CEO Tim Cook on a different matter, in which he said the company follows the laws of every country where it conducts business. Under the new hashing system, that stance could become problematic in countries such as China.

In the US, companies are required to report CSAM to the National Center for Missing & Exploited Children. Companies that fail to report such material when they discover it face fines of up to $300,000.

Nevertheless, the controversy over the new system could threaten Apple's reputation for preserving user privacy at any cost.

Keep in mind that Apple also stated the new system will scan only photos uploaded to iCloud, not photos stored privately on iPhones. In defense of the system, Cupertino added that the technology is significantly stronger and more private than comparable systems by every privacy metric the company tracks.
