
The Privacy Advisor | Can a Facial Recognition Code of Conduct Be Built for a Technology So Nascent? And Will Anyone Abide by It?



For the most part, participants in the National Telecommunications and Information Administration’s (NTIA) first stab at creating a stakeholder-driven, voluntary set of standards on mobile app transparency will tell you it was a fruitful process.

After all, it yielded an actual short-form notice for mobile app developers to adopt, prompted Intuit to develop open source code for it and saw Lookout release a customizable short-form privacy notice.

But the NTIA’s latest effort on facial recognition technology faces a different hurdle: Can a voluntary code of conduct be developed on technology so nascent and nuanced that its current and future applications aren’t fully understood?

That was the crux of the debate at the NTIA’s April 29 meeting on facial recognition technology, at which stakeholders from tech, industry, advocacy and government heard from technical experts, examined “what-if” scenarios and debated whether the time is right to start developing binding language.


Stakeholders voiced particular concern over how facial recognition rules might give special consideration to the protection of teenagers and children, the way COPPA requires web companies to obtain parental consent before kids engage; whether using facial recognition for authentication and for identification are the same thing or different things; and how in the heck effective notice and opt-outs can be implemented.

Participants considered “use-cases” submitted by various stakeholders, including the potential for facial recognition data to harm participants at controversial rallies and the impact that could have on First Amendment rights; law enforcement or employer use of such data; and the potential for retailers to use the technology to photograph and then track consumers—or to share those images with government, marketers or those with more nefarious intentions.

“For any shopper, there needs to be notice, a sign, that this is taking your picture and this is what it’s doing,” said Joni Lupovitz of Common Sense Media, adding there should be “conspicuous notice so they have an option of avoiding it and can exercise some choice over having their picture taken.”

Jeff Chester of the Center for Digital Democracy, who has often been critical of this process and the previous effort as well, said the NTIA hasn’t done its part to bring in the technical expertise essential to stakeholders’ understanding of how facial recognition technology works.

“In what way, do you think, before you proceed to a code—do you think we should try to engage in a much more focused and informed discussion?” Chester asked the group. “Or do you want to enter into this area blindly, which makes me very nervous, because I know we all have concerns about civil liberties.”

Chester called on the tech company and industry association reps at the meeting to bring in their people, so to speak, to explain how facial recognition technology is integrated into current and future products.

Industry reps in the room scoffed at the idea. Companies won’t be interested in revealing to-be-launched products, some noted, adding there have been myriad—and often highly revered—technology, policy and law experts who’ve testified throughout the four meetings of the series thus far.

But Susan Grant of the Consumer Federation of America agreed that something was missing.

“I wasn’t thrilled with the last process, but at least we had active participants from the mobile app community,” she said. “We need the users of facial recognition actively participating.”


Because of that perceived absence, Grant wondered if a code of conduct is the right aim.

“I wonder if we shouldn’t be thinking more in the way of principles and less in the way of a code of conduct,” she said. “We have no indication that the real players here are going to help fashion and adopt it anyway.”

It was then that industry association reps from the Computer and Communications Industry Association (CCIA), the Direct Marketing Association, the Internet Association and the Interactive Advertising Bureau turned on their mics and, one by one, reminded the room they were present. If they haven’t been especially vocal in the debate yet, it’s because they need something more concrete to weigh in on, some said. There are a lot of hypotheticals, for now.

“We’ve been in this room for every meeting in this process, and the CCIA is absolutely interested in engaging and writing a code,” said the CCIA representative. “As soon as we get to drafting, we will absolutely be participating.”

Carl Szabo of NetChoice agreed with industry reps that it was time to wrap up the “what-ifs” and start creating something.

“I do think we’re in a good position to start looking at some guiding principles,” he said.

The group decided to move the process forward by going back to their respective parties to draft some suggestions for guiding principles. They’re due to the NTIA’s John Verdi by May 14, and the next meeting will be held May 20.

Documents, agendas and resources from the four meetings held thus far in this series are available on the NTIA’s website.

