NIST seeks to quantify user trust in AI
22 June 2021
Wired reports that the National Institute of Standards and Technology (NIST) is looking to quantify user trust in artificial intelligence. NIST is accepting public comments until July 30, saying it wants to identify areas of mistrust in AI and promote informed decisions about its use. A user trust score would account for factors such as the age, gender, cultural beliefs and AI experience of an individual using an AI system, while a trustworthiness score would explore technical concepts.