Apple may be working on a hashing algorithm to scan users' iPhone photos for child abuse material

According to rumors and reports, Apple will announce a new system that scans the contents of your photo library in order to detect and help stop child abuse, including child pornography, reports 9to5Mac. The feature will reportedly be powered by a hashing algorithm that matches photo content against known child abuse material.

Apple might introduce a hash-based scanning system for illegal content on the iPhone


The scanning will reportedly happen on the device itself: the iPhone will download a set of fingerprints, so to speak, representing known illegal material, and will check photos against them. Any matches of illegal content on the iPhone would then be flagged for human review.
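The matching step described above can be sketched very roughly as a set-membership check. This is an illustrative toy using an exact cryptographic hash and made-up fingerprint data; the reported Apple system would use a perceptual hashing scheme whose details have not been announced.

```python
import hashlib

# Hypothetical fingerprint database the device would download.
# Real systems use perceptual hashes of known illegal images,
# not exact digests like these placeholder values.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"placeholder-known-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Reduce a photo to a short fingerprint (here, a SHA-256 digest)."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag_for_review(image_bytes: bytes) -> bool:
    """Flag a photo for human review only if its fingerprint matches."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```

The point of the design is that the device never needs the illegal images themselves, only their fingerprints, and non-matching photos reveal nothing.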

However, Apple has not announced such an initiative yet.

Cryptography and security expert Matthew Green sheds some light on why such a move could be problematic. Hashing algorithms of this kind can make mistakes and turn up false positives. And if governments are allowed to influence the fingerprint database, the system could be misused to target content other than illegal child material.
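The false-positive concern Green raises can be illustrated with a toy "average hash", a simple perceptual fingerprint (this is an illustrative sketch, not Apple's algorithm): because the hash deliberately discards detail to survive small edits, two genuinely different images can collapse to the same value.

```python
# Toy average hash: threshold each pixel's brightness against the
# image's mean brightness, packing the results into bits. Perceptual
# hashes discard detail so edited copies still match -- but that same
# coarseness means distinct images can collide (a false positive).

def average_hash(pixels: list[int]) -> int:
    """Fingerprint a flat list of brightness values (0-255)."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

img_a = [10, 200, 10, 200]   # one image's pixel values
img_b = [40, 250, 60, 220]   # a different image, same light/dark layout
# Both hash to the same fingerprint: a collision.
print(average_hash(img_a) == average_hash(img_b))
```

A real deployment would use far longer fingerprints and match thresholds, which lowers but cannot eliminate this risk, hence the reported human-review step.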

Keep in mind that photos uploaded to iCloud Photos for backup are not end-to-end encrypted: although they are encrypted on Apple's servers, the company holds the decryption keys. The same is true of most other photo backup services.
