Apple announced a new initiative to combat the spread of child abuse images, and the technology company's approach is raising a wave of questions and concerns from the privacy and security communities.

The tech company plans to use new cryptography applications to scan users' iPhones for images deemed "Child Sexual Abuse Material." The system was trained on 200,000 sex abuse images collected by the National Center for Missing and Exploited Children. A team of reviewers would be alerted to potentially illicit images and would then contact law enforcement for verification. The feature is set to be released with the launch of iOS 15 next month.
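Apple has not published implementation code, but the basic idea, checking photos on the device against a database of fingerprints of known abuse images and flagging only the matches, can be sketched loosely. The example below is a simplified illustration under stated assumptions, not Apple's design: the company's system reportedly relies on a perceptual "NeuralHash" and cryptographic private set intersection, whereas this sketch substitutes a plain SHA-256 digest and an ordinary set lookup, and the type and function names are invented for the example.

```swift
import CryptoKit
import Foundation

// Simplified illustration only. Apple's system reportedly uses a perceptual
// "NeuralHash" and private set intersection rather than the plain SHA-256
// lookup shown here; the names below are hypothetical.
struct KnownImageDatabase {
    // Fingerprints of known abuse images, e.g. supplied by a clearinghouse.
    let knownHashes: Set<String>

    // Hash the photo on the device and check it against the known set.
    func matches(_ photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}

// On-device screening step: nothing is reported unless a match is found.
func shouldFlagForReview(_ photoData: Data, against database: KnownImageDatabase) -> Bool {
    return database.matches(photoData)
}
```

In this simplified picture the device reveals nothing about non-matching photos, which is the privacy property Apple's announcement emphasizes.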

"This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM. And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account," Apple said in a blog post.

The photo scanning feature has raised red flags among observers, many of whom point to Apple's past commitments to privacy as the source of their confusion.

"To say that we are disappointed by Apple’s plans is an understatement. Apple has historically been a champion of end-to-end encryption," the Electronic Frontier Foundation said in a blog post. "Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security."

Center for Democracy & Technology Security & Surveillance Project Co-Director Greg Nojeim called on Apple to abandon the changes and "restore its users’ faith in the security and integrity of their data on Apple devices and services.”

Princeton University Assistant Professor of Computer Science and Public Affairs Jonathan Mayer said Apple has posted plenty of details on how the cryptography will work with the device scanning, but "there’s little on the computer vision — and no evaluation of predictive performance. That’s essential for evaluating privacy impact."

IAPP Netherlands Country Leader Jeroen Terstegge, CIPP/E, CIPP/US, said on LinkedIn that Apple's decision is another example of device owners losing control over the functionality of their own devices.

"Another problem with Apple's plan is that it essentially boils down to private law enforcement," Terstegge wrote. "The government can't do anything unless it's allowed, and companies can do anything unless it's forbidden. But what we should not want is that companies use that freedom at will or at the request of a government to actively engage in (criminal) investigations. And by doing so circumvent the democratically established guarantees for the right to a fair trial and privacy that we have built in searches by the police and judicial authorities."

On Twitter, Johns Hopkins University Department of Computer Science Associate Professor Matthew Green discussed how the service could be used in China, a sentiment echoed by Asana Data Protection Officer and Privacy Counsel Whitney Merrill, CIPP/US. Edward Snowden said if Apple can scan for illicit child images today, "they can scan for anything tomorrow."

While questions and privacy concerns have been front of mind, some have spoken out in favor of Apple's decision, even if the praise isn't exactly glowing.

“I believe that the Apple PSI system provides an excellent balance between privacy and utility, and will be extremely helpful in identifying CSAM content while maintaining a high level of user privacy and keeping false positives to a minimum,” Bar-Ilan University Cryptographer Benny Pinkas told Wired.

University of Surrey Computer Science Professor Alan Woodward told the Financial Times that Apple's system is "less invasive" since the screening is done on the device and "only if there is a match is notification sent back to those searching. This decentralised approach is about the best approach you could adopt if you do go down this route."
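That "only if there is a match" property, combined with Apple's statement that it only learns about an account holding a collection of known CSAM, implies an account-level threshold before anything is reported. The sketch below illustrates that idea only under stated assumptions; Apple's actual design reportedly uses threshold secret sharing and encrypted "safety vouchers," and the counter, threshold value, and names here are hypothetical stand-ins.

```swift
// Loose sketch of a match threshold: the device tallies matches locally and
// signals for human review only once an account accumulates a collection of
// matches. The threshold value and names are hypothetical.
struct MatchThresholdPolicy {
    let threshold: Int
    var matchCount = 0

    // Record one on-device match; return true only when the account has
    // crossed the threshold and a review notification should be sent.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= threshold
    }
}

var policy = MatchThresholdPolicy(threshold: 10)  // placeholder threshold
let notifyReviewers = policy.recordMatch()        // false until enough matches accumulate
```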

ICSI Researcher Nicolas Weaver acknowledged the "slippery slope" Apple's plan presented, but added the tech company would rather risk oppressive government use than the continued proliferation of illicit images, "and I don't blame them one bit."

No one is arguing Apple is not pursuing a worthwhile goal. Far from it. However, the debate about whether its device scanning initiative is a step too far will likely continue in the days and weeks ahead.

Photo by Rodion Kutsaev on Unsplash