At a U.S. House of Representatives Committee on Oversight and Reform hearing, government and law enforcement representatives testified on the ways they're using facial-recognition technologies to aid in criminal investigations. The federal hearing occurred as cities in the U.S. consider banning or placing a moratorium on the technology until its shortcomings are corrected and its uses policed. San Francisco recently banned the technology outright, Somerville, Massachusetts, has placed a moratorium on it, and Michigan's state Senate has proposed a bill to ban it as well.

Lawmakers expressed concern, and at times dismay, that the Federal Bureau of Investigation has not implemented recommendations made by the Government Accountability Office in 2016 for the FBI's facial-recognition technology protocols. They noted alarming testimony from academics and privacy advocates at the first hearing in this series that testing of the technology indicates algorithmic deficiencies resulting in false-positive matches of suspects, disproportionately so for minorities and especially women of color.

Committee Chairman Rep. Elijah Cummings, D-Md., opened by recalling privacy advocates' May 22 testimony citing serious concerns that the technology has been and continues to be deployed by government agencies without any real safeguards or oversight, including at U.S. airports. The committee is considering pushing for legislation that would protect consumers by governing how the government acquires and uses the technology.

There's thus far broad bipartisan consensus that there's a problem here, perhaps best indicated by Rep. Jim Jordan, R-Ohio, who cited "a lot of agreement, a lot of common ground" among his peers. Jordan said "there are all kinds of mistakes made" when the technology is implemented and that there are "First Amendment and Fourth Amendment concerns."

Jordan repeatedly stressed the three years that have passed since the GAO made its recommendations to the FBI. 

For her part, the FBI's Kimberly Del Greco, deputy assistant director of Criminal Justice Information Services, said the FBI has two units using facial-recognition technology, and the two work in tandem. There's the Next Generation Identification Interstate Photo System, a database of criminal mugshots associated with fingerprints.

Then there's the FBI's Face Services Unit, which leverages partnerships with federal and state actors who have agreed to share images, such as driver's license photos. Del Greco testified the FBI can only run a search query when there's an "open assessment" or "active investigation," and matches "are not to be considered positive identification" but rather are returned in a "ranked list" of candidates. Del Greco also said the program places legal, training and security requirements on law enforcement personnel cleared to run searches and undergoes tri-annual audits.

Del Greco also testified that from fiscal year 2017 to April 2019, the NGI Interstate Photo System ran 152,500 searches. 

Rep. Carolyn Maloney, D-N.Y., wanted to know whether "there was any proof this system has been helpful to law enforcement in any way" and if any of the searches had led to convictions.

"Do you have that information?" she asked Del Greco. 

"I do not," Del Greco said. Maloney asked whether there was data on whether any misidentifications "led to the conviction of any innocent people."

Del Greco said, "Not to my knowledge, ma'am." 

"Maybe we should change your system, then," Maloney said. "Because we need accountability on whether this system is working or not or if it's just abusing people."

At the first hearing and again Tuesday, Jordan was vocally critical of the fact that the FBI did not publish a privacy impact assessment when it deployed the technology in 2011, didn't file a proper system-of-records notice, didn't conduct a study on the accuracy of its photo system, and didn't test the accuracy of the state systems with which it interfaced, facts the GAO's Gretta Goodwin confirmed.

Jordan continued, "Those five things they didn't do, but Ms. Del Greco said, 'We have strict standards ... you can count on us.' ... That's what she told us. ... But when they stood up this system there were five key things they were supposed to follow that they didn't, and my understanding is that they still haven't corrected all those, is that accurate?"

"That is correct," Goodwin said. 

"But we're supposed to believe, 'Don't worry, everything's just fine.' ... And we haven't even gotten to First Amendment concerns, Fourth Amendment concerns," Jordan continued. "We're just talking about standing up the system. ... How are we supposed to have confidence in strict rules you're going to follow when you didn't follow the rules when you set the thing up in the first place?"

Del Greco said the FBI hadn't followed the GAO's recommendations in part because it disagrees with the agency's assessment, specifically its assertion that "there are problems with the accuracy of the system."

Rep. Mark Meadows, R-N.C., had some advice for Del Greco. 

"If GAO isn't happy, I'm not happy," he said. "And so here's what I would recommend: Of the five outstanding [recommendations, my advice] is that you work with GAO to close those out. ... Are you willing to do that?" 

"Absolutely, sir," said Del Greco, adding there has been progress. The FBI has now completed 14 of 21 audits the GAO asked it to make.

Meadows asked for a status report within 60 days. 

Jordan also took issue with the fact that agreements with 20 participating states to share department of motor vehicles data for the purposes of facial-recognition matching were not signed off on by an elected official. Goodwin testified the data-sharing agreements have given the FBI access to a total of 640 million photos of faces, prompting a scoff from Jordan that there are "only 330 million people in the country."

Also on hand at the hearing was Charles Romine, director of the Information Technology Laboratory at the National Institute of Standards and Technology. NIST partners with federal agencies, including the FBI, to develop standards on technologies such as biometrics, among many others. Romine testified that since the FBI implemented facial-recognition technology, there have been significant improvements in the technology itself, namely a reduction in misidentifications.

NIST launched a "Face Biometric Grand Challenge" to the public and private sectors to break new ground and solve "performance" problems in biometrics, including algorithmic misidentification and human error, and Romine said 70 participants have since submitted facial-recognition algorithms, the best of which now achieve accuracy rates of 99.7%.

Del Greco said the FBI's current facial-recognition algorithm has an 85% accuracy rate, but added it does not provide one-to-one matches; rather, the database returns a ranked list of two or more candidate matches as "investigative leads," which are then analyzed by trained human examiners. She added the FBI is adopting a higher-accuracy algorithm evaluated in NIST's Face Recognition Vendor Test.
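As a rough illustration of the one-to-many model Del Greco described, the sketch below ranks gallery subjects by similarity to a probe image and returns a bounded candidate list for human review rather than a single "match." It is written in Python with entirely hypothetical names, embeddings and candidate caps; it is not the FBI's system or any real vendor's API.

```python
# Minimal, hypothetical sketch of a one-to-many face search that returns a
# ranked candidate list ("investigative leads") rather than a single positive
# identification. All names, the toy gallery and the candidate cap are
# illustrative assumptions, not the FBI's system or any real vendor API.
from dataclasses import dataclass

import numpy as np


@dataclass
class Candidate:
    subject_id: str
    similarity: float  # cosine similarity between probe and gallery embeddings


def search_gallery(probe: np.ndarray,
                   gallery: dict[str, np.ndarray],
                   max_candidates: int = 50) -> list[Candidate]:
    """Rank every gallery subject by similarity to the probe embedding."""
    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    ranked = sorted(
        (Candidate(sid, cosine(probe, emb)) for sid, emb in gallery.items()),
        key=lambda c: c.similarity,
        reverse=True,
    )
    # Cap the list and hand it off for human review -- no automatic "match".
    return ranked[:max_candidates]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gallery = {f"subject_{i}": rng.normal(size=128) for i in range(1000)}
    probe = rng.normal(size=128)
    for lead in search_gallery(probe, gallery, max_candidates=5):
        print(f"{lead.subject_id}: similarity {lead.similarity:.3f}")
```

The point of the pattern, as described in the testimony, is that the algorithm's output is an ordered set of leads for a trained examiner to evaluate, not a positive identification.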

In the meantime, Rep. Jimmy Gomez, D-Calif., wanted to know from both NIST and the FBI what they're doing to thwart risks of gender and racial bias in facial-recognition algorithms, especially given research by both the ACLU and Georgetown's Center on Privacy and Technology indicating black women tend to be misidentified more than any other group, as well as concerns about the transgender community.

NIST's Romine said he "has not done an analysis of accuracy rates for transgender community," that he's "not sure how we would obtain the relevant data we need to do that," but that he has been made aware of the transgender community's concerns about "problematic use here."

Gomez wanted to know how the FBI trains humans to be aware of such biases that may exist. 

"Is the FBI trained personnel on the potential inaccuracies and biases of facial recognition algorithms?" 

"Bias? No, sir," Del Greco said. "Our system doesn't look at skin tones and features, it's a mathematical computation that comes back, and they are to look at the mathematical features of the face."  

Cummings said the committee planned to bring law enforcement before it again in the near future. Jordan closed by thanking Del Greco and telling her: "I hope you understand from both sides of the aisle there’s a real concern … I hope you understand how serious I think everyone is on this committee with this issue."

Photo by Matthew Henry on Unsplash