Exploring the privacy, ethical issues with emotion-detection tech

Artificial intelligence and machine learning technologies are rapidly developing across virtually all sectors of the global economy. One nascent field is empathic technology, which, for better or worse, includes emotion detection. Emotion-detection technology has been estimated to be worth as much as $56 billion by 2024. However, judging a person's emotional state is subjective and raises a host of privacy, fairness, and ethical questions. Ben Bland has worked in the empathic technology space in recent years and now chairs the Institute of Electrical and Electronics Engineers P7014 Working Group, which is developing a global standard for the ethics of empathic technology. We recently caught up with Bland to discuss the pros and cons of the technology and his work with the IEEE.


