Exploring the privacy, ethical issues with emotion-detection tech

Artificial intelligence and machine learning technologies are developing rapidly across virtually all sectors of the global economy. One nascent field is empathic technology, which, for better or worse, includes emotion detection. Emotion-detection technology has been estimated to be worth as much as $56 billion by 2024. Yet judging a person's emotional state is inherently subjective and raises a host of privacy, fairness and ethical questions. Ben Bland has worked in the empathic technology space in recent years and now chairs the Institute of Electrical and Electronics Engineers P7014 Working Group, which is developing a global standard for the ethics of empathic technology. We recently caught up with him to discuss the pros and cons of the technology and his work with the IEEE.