Apple has explained why it opted against moving forward with its iCloud photo-scanning tool to combat child sexual abuse material (CSAM), Wired reports. The December 2022 decision to abandon the tool was based on the potential to create "new threat vectors for data thieves to find and exploit," according to Apple Director of User Privacy Erik Neuenschwander. He added that CSAM scanning could also lead to "bulk surveillance" and create "a desire to search other encrypted messaging systems across content types."