A team at Facebook is working to combat the nonconsensual sharing of intimate photos and videos, NBC News reports. The 25-member team works full time to quickly remove reported content and, by using artificial intelligence to detect such material when it is uploaded, to prevent it from being shared in the first place. “In hearing how terrible the experiences of having your image shared was, the product team was really motivated in trying to figure out what we could do that was better than just responding to reports,” Facebook Head of Product Policy Research Radha Plumb said. While the company has also updated its reporting tools and process to better support victims, some victims and advocates say the company needs to do more.