Analysis shows bias with AI-powered crime prediction software

An investigation by The Markup and Gizmodo found potential bias in crime prediction software developed by PredPol. The software disproportionately targeted neighborhoods home to Black, Latino and low-income residents rather than those with larger white or middle-to-upper-income populations. "Communities with troubled relationships with police — this is not what they need," American Civil Liberties Union Senior Policy Analyst Jay Stanley said. "They need resources to fill basic social needs."
