Exploring the privacy, ethical issues with emotion-detection tech

Artificial intelligence and machine learning technologies are developing rapidly across virtually all sectors of the global economy. One nascent field is empathic technology, which, for better or worse, includes emotion detection. By one estimate, emotion-detection technology could be worth $56 billion by 2024. Yet judging a person's emotional state is inherently subjective and raises a host of privacy, fairness, and ethical questions. Ben Bland has worked in the empathic technology space in recent years and now chairs the Institute of Electrical and Electronics Engineers P7014 Working Group, which is developing a global standard for the ethics of empathic technology. We recently caught up with Bland to discuss the pros and cons of the technology and his work with the IEEE.