Apple's anti-child abuse features get delayed: company will collect more input before release
Apple is pausing its CSAM (Child Sexual Abuse Material) detection features after the backlash they received. In a statement to 9to5Mac, the company said it needs more time to improve the features and that it will gather additional input from the groups and advocates that raised concerns.
One of the scanning features Apple planned to introduce was tied to iCloud Family accounts. The company wanted to shield children under the age of 13 from sending or receiving sexually explicit images in iMessage.
Privacy advocates raised a number of concerns after the features were announced, including possible misuse by authoritarian governments and a potential expansion of scanning into messaging or other parts of the device. Apple pushed back, stating that a threshold of matches would have to be met before it sent a report to the National Center for Missing & Exploited Children (NCMEC), and that iMessage and iCloud would keep end-to-end encryption.
The Electronic Frontier Foundation (EFF) echoed the risk of misuse, pointing out that a government could lie to Apple about a database of digital fingerprints containing only CSAM, and that neither users nor Apple would have any way to verify that claim.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.” - Apple, in a statement to 9to5Mac
This statement doesn’t come as a shock, as the company was heavily scrutinized over its iCloud and iMessage scanning measures. Apple didn’t provide a new release date for the CSAM features, which were supposed to arrive before the end of the year as part of iOS 15.
The feature would have worked by blurring a potentially explicit picture, flagging it as sensitive, and explaining why, then letting the child decide whether or not to view it. If the child chose to see it anyway, their parents would receive a notification.
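To make that flow concrete, here is a minimal sketch of the publicly described behavior. Apple has not published any of this code, and every name below (FamilyAccount, ChildDecision, handleSensitiveImage) is hypothetical; it only models the blur-warn-decide-notify sequence, with parental notifications limited to under-13 accounts.

```swift
// Illustrative sketch only: all types and names here are hypothetical,
// not Apple's implementation.
import Foundation

struct FamilyAccount {
    let childAge: Int
    let parentalNotificationsEnabled: Bool
}

enum ChildDecision {
    case declined      // the child backs out; nothing further happens
    case viewedAnyway  // the child chooses to see the image
}

/// Models the described flow: blur the image, show a sensitivity warning,
/// let the child decide, and notify parents only for under-13 accounts.
func handleSensitiveImage(decision: ChildDecision, account: FamilyAccount) -> [String] {
    var actions = ["Blur image", "Show sensitivity warning and explanation"]
    switch decision {
    case .declined:
        actions.append("Keep image hidden")
    case .viewedAnyway:
        actions.append("Reveal image")
        if account.childAge < 13 && account.parentalNotificationsEnabled {
            actions.append("Notify parents")
        }
    }
    return actions
}

// Example: a 12-year-old on a family account chooses to view the image.
let account = FamilyAccount(childAge: 12, parentalNotificationsEnabled: true)
print(handleSensitiveImage(decision: .viewedAnyway, account: account))
// ["Blur image", "Show sensitivity warning and explanation", "Reveal image", "Notify parents"]
```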
Separately, the Cupertino-based company announced that iCloud Photos would also scan for CSAM on-device. If Apple detected such material, it would send a report to NCMEC along with the user's account information.
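The on-device detection Apple described boils down to comparing image fingerprints against a database of known CSAM hashes and only acting once a threshold of matches is crossed. The sketch below is a deliberately simplified stand-in: Apple's real design involves NeuralHash, private set intersection, and threshold secret sharing, none of which is modeled here, and the CSAMMatcher type and its values are invented for illustration.

```swift
// Simplified, conceptual stand-in for threshold-based fingerprint matching.
// Not Apple's algorithm; names and values are made up.
import Foundation

typealias Fingerprint = String   // stand-in for a perceptual image hash

struct CSAMMatcher {
    let knownFingerprints: Set<Fingerprint>  // hashes of known CSAM supplied by child-safety groups
    let reportThreshold: Int                 // matches required before any report is filed

    /// Counts how many of a user's photo fingerprints appear in the known set
    /// and returns true only once the threshold is reached.
    func shouldFileReport(for photoFingerprints: [Fingerprint]) -> Bool {
        let matches = photoFingerprints.filter { knownFingerprints.contains($0) }.count
        return matches >= reportThreshold
    }
}

// Example with made-up values: two matches against a threshold of three.
let matcher = CSAMMatcher(knownFingerprints: ["a1", "b2", "c3"], reportThreshold: 3)
print(matcher.shouldFileReport(for: ["a1", "b2", "zz"]))  // false: below the threshold
```

The threshold is the detail Apple emphasized in its response to critics: a single match would not trigger a report.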