FBI facial recognition in spotlight at House Oversight hearing

Representatives from several government agencies, along with members of industry and civil society, appeared Wednesday before the U.S. House Oversight Committee to explore privacy issues around the use of facial recognition technology, including the potential need for drafting legislation to limit some of its use. Much of the focus during the two-hour hearing centered on a Government Accountability Office report from May 2016 that was critical of the Federal Bureau of Investigation's use of the technology. 

Committee Chairman Jason Chaffetz, R-Utah, queried whether federal law enforcement should enroll every American citizen in a searchable database or whether that database should only include criminal actors. According to a report from the Center on Privacy & Technology at Georgetown Law, when including all of the databases to which law enforcement has access, nearly half of the U.S. population is in a facial recognition database. 

Notably, lawmakers from both sides of the political aisle expressed concerns about their own personal privacy while weighing the implications of law enforcement use of facial recognition. 

One major issue with the current state of facial recognition technology is its accuracy, particularly with young people, minorities, and women, several witnesses stated. 

"I have a lot of concerns about this," said Ranking Member Elijah Cummings, D-MD. "If you're black, you're more likely to be subject to this technology, and it's more likely to be wrong," Cummings continued. "That's one hell of a combination. Just let that sink in." 

FBI Deputy Assistant Director Kimberly Del Greco, who faced pointed questions from lawmakers throughout the hearing, said the agency's Next Generation Identification System provides images that are only used as investigative leads and not for a positive identification. She also said the agency is constantly facing relentless adversaries and using such technology helps them keep Americans safe. 

Electronic Frontier Foundation Attorney Jennifer Lynch, however, pointed out that these systems are often trained primarily on images of older white men, and that, as a result, false-positive rates for minorities and women tend to be high. She added that human review as a backup often fails as well. "If a person is not properly trained, they might also misidentify," she said.

The most heated moment, however, came near the beginning of the hearing when Chaffetz pressed the FBI as to why they had not performed a privacy impact assessment before developing facial recognition technology and deploying it in the wild. According to the GAO, the Department of Justice originally approved a PIA of the Next Generation Identification-Interstate Photo System in June 2008, but in the time before it issued its next PIA in 2015 for its FACE Services programs, several changes had been made to the NGI-IPS system. 

Del Greco deferred to the Department of Justice and said the agency's privacy attorney was being advised throughout the process. "We don't believe you," Chaffetz said, "and you're supposed to make it public." 

Chaffetz went further and alleged that the FBI "went out of its way" to exempt its facial recognition database from the Privacy Act. "That's a big part of the concern," he added. Chaffetz also asked if the FBI was using social media data for its systems. Del Greco said it was not. 

GAO Homeland Security and Justice Director Diana Maurer said the agency made six recommendations to the FBI in 2016 and that the FBI "had not fully adhered to privacy laws and policies and had not taken sufficient action to help ensure accuracy of its face recognition technology." As of March 2017, the DoJ and FBI disagreed with three of the GAO's recommendations and "had taken some actions to address the remainder, but had not fully implemented them."

One specific point of contention among committee members involved FBI access to state driver's license photos. Rep. Paul Mitchell, R-MI, asked why his face should be potentially scanned by law enforcement just because he renewed his driver's license: "To me, that's appalling." 

Rep. Gerald Connolly, D-Va., went further, saying to the FBI's Del Greco, "I think you're on very shaky legal ground" by having access to state driver's license photos. Georgetown Law's Alvaro Bedoya pointed out that the Driver's Privacy Protection Act became law nearly a decade before law enforcement began using facial recognition technology. That prompted Connolly to suggest consideration of a bill that would place limits on law enforcement access to DMV photos.

Bedoya also said law enforcement is considering future use of facial recognition in body cameras. The International Biometrics + Identity Association's James Hutchinson pointed out that such technology, capable of searching huge databases in real time, is still far off, but did not deny that it could eventually come to fruition.

"We need to target real criminals," Bedoya explained, adding they need to be targeted under specific conditions, to make sure the data is accurate, audited for abuse, and reported out for transparency. He said it makes sense for law enforcement to use the tech to keep people safe, but it should be narrow in scope and with oversight. Americans, he added, can have both safety and privacy. 

What's the next step? At least one lawmaker at the hearing suggested the need for new legislation to address privacy concerns raised during the testimony. Regardless, there appears to be some bipartisan support for addressing how law enforcement uses facial recognition going forward.

And this hearing didn't even consider how the private sector uses it.
