Apple's child-abuse-scanning tech: open letter opposes the policy, Apple addresses concerns officially
Cryptographers sign an open letter asking Apple to halt the new policy due to privacy concerns
The main concern raised by many cryptographers and privacy experts is exactly this: the possibility that the new scanning system could be exploited to surveil users' photos.
The cryptography community's concerns were laid out in an open letter, The Hill reports, signed by almost three dozen organizations and over 6,600 individuals. The letter states that although the system is meant to combat child abuse, it could open a "backdoor" for other types of surveillance.
According to the letter, the proposal and the backdoor it could introduce would undermine fundamental privacy protections for all users of Apple products, bypassing end-to-end encryption and thus compromising users' privacy.
The open letter's signatories ask Cupertino to halt the new policy and issue a statement reaffirming its commitment to end-to-end encryption and user privacy. Apple, after all, has long been known as the company that protects privacy at almost any cost.
Apple's answer to the raised concerns
CNBC reports that Apple has addressed the concerns over the new child sexual abuse material (CSAM) scanning system. Cupertino underlined that the technology is more private than what Google and Microsoft have implemented; both companies have also been eliminating illegal child abuse images from their servers. Apple also stated that governments cannot force it to add non-CSAM images to the system.
The scanning technology will be delivered to users' iPhones through an iOS update and performs its matching on the device itself.
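The on-device matching described above can be sketched in simplified form. This is an illustrative sketch only, not Apple's implementation: Apple's system uses a perceptual hash (NeuralHash) and a private set intersection protocol so that neither the device nor Apple learns about non-matches, whereas this stand-in uses SHA-256 and a plain set lookup. The function names, the sample blocklist, and the match threshold here are all hypothetical.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash such as NeuralHash. A real perceptual
    # hash tolerates resizing and re-encoding; SHA-256 does not, and is
    # used here only to keep the sketch self-contained.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical blocklist of known-CSAM hashes shipped to the device.
KNOWN_HASHES = {image_hash(b"known-bad-image-bytes")}

def account_flagged(photos: list[bytes], threshold: int = 2) -> bool:
    # Count photos whose hash matches the blocklist. Apple describes a
    # similar threshold scheme: an account is only flagged for human
    # review once the number of matches passes a threshold.
    matches = sum(1 for p in photos if image_hash(p) in KNOWN_HASHES)
    return matches >= threshold
```

The threshold step matters for the privacy argument: a single accidental match reveals nothing, because the account is only surfaced after multiple independent matches accumulate.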
Another worry among cryptographers stems from a previous statement by Apple CEO Tim Cook, made on a different matter, that the company follows the laws of every country where it does business. Applied to the new hashing system, that stance could prove problematic in countries such as China.
In the US, companies are required to report CSAM to the National Center for Missing & Exploited Children; failing to report such material once it is discovered can bring fines of up to $300,000.
Nevertheless, the controversy over the new system could threaten Apple's reputation for preserving user privacy at any cost.