The Privacy Advisor | Experts Say Facial Rec Code Should Protect the Vulnerable: Adolescents and Minorities

In trying to establish a code of conduct on the commercial uses of facial recognition technology, there's been much discussion among stakeholders participating in the National Telecommunications and Information Administration (NTIA) process about the potential harms that could arise if the technology isn't regulated. How will law enforcement use raw data collected via facial recognition systems? Will photos posted to social media sites be used to match supposedly anonymous participants at political rallies or for racial profiling?

At the NTIA's July 24 meeting, the eighth in the process, stakeholders called in a couple of experts to talk about real-life risks in detail to better understand who is most vulnerable to the potential harms.

One group that's particularly at risk, according to UCLA Assistant Prof. Adriana Galván, is adolescents. That's because adolescence, which Galván defines as the "biologically initiated phenomenon that starts at puberty" and ends when it is "socially terminated with adult roles like financial independence," brings sharp increases in learning and reasoning skills. It is also a genuinely risky period: mortality rates jump 200 percent during adolescence.

That might be attributable to the fact that "reward regions are hyper-excitable in adolescents," according to Galván's research. In a test comparing the brainwave responses of adults and adolescents given cash, the adolescents showed "exaggerated activation." Could that simply reflect adolescents' relative inexperience with money? No: a second test produced the same results when the two groups were given sugar water.

"This means it's a more excitable reward system for people in this age group," Galván testified. "People in this age group are more responsive to incentives."

How does this relate, exactly, to facial-recognition technology?

"Adolescents are more susceptible to reward-based marketing," Galván said, "which should be reflected in policies." If facial recognition were used to identify that people were, indeed, adolescents, marketers might be able to unduly exploit their susceptibility to reward-based marketing.

Next up was Prof. Jerome Williams of Rutgers Business School, who discussed the ways the commercial marketplace caters to different racial groups—already a pervasive problem resulting in certain groups suffering disadvantages, often in subtle or obscure ways.

In one case Williams researched, one particular racial group represented only five percent of the total customers that entered the store but accounted for 95 percent of the customers stopped for suspicion of shoplifting.

"What that suggests is the security cameras only focus on that one group," Williams said.

While research indicates the group most likely to shoplift is white women in their 40s and 50s, not African-Americans, he noted a case at a Barneys department store in which a young black male claimed he was wrongly jailed after buying a $350 belt. The incident evoked a catchphrase capturing the feeling within the black community that the customers being stopped were guilty of nothing more than "shopping while black."

This could have amplified implications given the potential uses of facial-recognition technology, Williams said.

"Essentially, you ID people and pay more attention to them not only in the store but also when they leave the store," he said. "Despite some people who feel this doesn't occur, this is definitely happening in the marketplace. It's been documented. Imagine using facial-recognition technology to only focus in on certain groups and perpetuate what we've already seen."

Though it wasn't heavily discussed at the meeting, the International Biometrics Industry Association (IBIA) circulated a revised statement, following the hotly contested assertion in the Privacy Best Practices document it shared at last month's meeting that "there is no anonymity if we choose to live in society." The new statement holds that "opting out of anonymity does not constitute a loss of any right to privacy" and that "the declared purpose of these NTIA meetings is to discuss consumer privacy and facial recognition without any reference to consumer anonymity." As such, the memo states, "the IBIA will focus exclusively on the core issue of privacy and will update its best-practice recommendations as appropriate."

It's a significant departure from the IBIA's earlier position, one likely to appease many of the privacy advocates who took issue with it, notably Chris Calabrese of the ACLU.

The next meeting is slated for sometime in September, when the privacy and technology folks in the room aren't attending one of the myriad previously scheduled conferences the month brings.
