A researcher at the MIT Media Lab found that the accuracy of facial recognition technology depended on the subject’s skin color, The New York Times reports. Joy Buolamwini conducted an experiment to see whether facial recognition technology could accurately identify a subject’s gender. When tested on 385 photos of lighter-skinned males, the technology was accurate 99 percent of the time, but when tested on 271 photos of darker-skinned females, the error rate rose to 35 percent. Buolamwini noted, "You can't have ethical AI that's not inclusive. And whoever is creating the technology is setting the standards." (Registration may be required to access this story.)