Apple details CSAM scanning halt
5 Sept. 2023

Apple outlined why it opted against moving forward with its iCloud scanning tool to combat child sexual abuse material, Wired reports. The December 2022 decision was based on the potential to create "new threat vectors for data thieves to find and exploit," according to Apple Director of User Privacy Erik Neuenschwander. He added CSAM scanning could also lead to "bulk surveillance" and create "a desire to search other encrypted messaging systems across content types."