ICO warns organizations about use of emotion analysis tech
26 Oct. 2022
The U.K. Information Commissioner’s Office warned it will investigate organizations that use emotion analysis technologies irresponsibly, urging risk assessments before such systems are implemented. The ICO said emotion analysis technologies, which collect, store and process personal data, are “far more risky than traditional biometric technologies” and bring a risk of systemic bias, inaccuracy and discrimination. “Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever. While there are opportunities present, the risks are currently greater,” Deputy Commissioner Stephen Bonner, CIPP/E, CIPM, said.