Apple details CSAM scanning halt
5 Sept. 2023
Apple outlined why it opted against moving forward with its iCloud scanning tool to combat child sexual abuse material, Wired reports. The December 2022 decision was based on the potential for "new threat vectors for data thieves to find and exploit," according to Apple Director of User Privacy Erik Neuenschwander. He added that CSAM scanning could also lead to "bulk surveillance" and create "a desire to search other encrypted messaging systems across content types."