Apple details CSAM scanning halt
5 Sept. 2023

Apple outlined why it opted against moving forward with its iCloud scanning tool to combat child sexual abuse material, Wired reports. The December 2022 decision was based on the potential for "new threat vectors for data thieves to find and exploit," according to Apple Director of User Privacy Erik Neuenschwander. He added CSAM scanning could also lead to "bulk surveillance" and create "a desire to search other encrypted messaging systems across content types."