NIST seeks to quantify user trust in AI
22 June 2021
Wired reports the National Institute of Standards and Technology is looking to quantify user trust in artificial intelligence. NIST is accepting public comments until July 30, saying it wants to identify areas of mistrust in AI and promote informed decisions about its use. A user trust score would account for factors such as an individual's age, gender, cultural beliefs and experience with AI, while a trustworthiness score would explore technical characteristics of the AI system itself.