Technology companies are not using available tools to stop the recirculation of child sexual abuse material online and are taking inconsistent approaches to detecting it, The New York Times reports. “Tech companies are far more likely to review photos and videos and other files on their platforms for facial recognition, malware detection and copyright enforcement. But some businesses say looking for abuse content is different because it can raise significant privacy concerns,” the report states. Laws against child sexual abuse material make it difficult for companies to detect such content, so some are developing artificial intelligence systems to help identify it. (Registration may be required to access this story.)