
Daily Dashboard | NIST study shows inaccuracy, bias in facial-recognition scans

Related reading: Study: Irish organizations lagging in GDPR compliance


The National Institute of Standards and Technology announced the release of its "Face Recognition Vendor Test Part 3: Demographic Effects," which revealed many facial-recognition systems produce inaccurate scans due to racial bias. In its review of 189 recognition algorithms from 99 developers, NIST found that false-positive scans on African American and Asian subjects were anywhere from 10 to 100 times more likely, depending on the system. “While it is usually incorrect to make statements across algorithms, we found empirical evidence for the existence of demographic differentials in the majority of the face recognition algorithms we studied,” said Patrick Grother, a NIST computer scientist and the report’s primary author.
Full Story
