
Daily Dashboard | NTIA Holds First Meeting on a Facial-Recognition Technology Code of Conduct

The Department of Commerce’s National Telecommunications and Information Administration (NTIA) yesterday held the first of a series of meetings aimed at creating a voluntary code of conduct for the development and implementation of facial recognition technology. The meeting, which brought together stakeholders from advocacy and industry, was primarily a chance for the group, as well as the 100 or so watching the live webcast, to hear from experts on how the technology works, how and why it is currently being applied and what it might be capable of accomplishing in the future.

The bottom-line concern: Some believe facial recognition technology could mean the end of anonymity by merging the online and offline worlds, opening the door, absent regulation, to unlawful tracking, stalking, racial profiling and a host of other nefarious uses.

In the discussion, led by the NTIA’s John Verdi, stakeholders aired concerns over notice, opt-out controls and law enforcement and government use of the technology, among other issues. Called for in the White House’s “Privacy Blueprint” last year, this process aims to apply the Consumer Privacy Bill of Rights to business contexts. If successful, the eight-meeting series will result in a voluntary code of conduct. The NTIA recently concluded a multi-stakeholder process for the development of mobile app short-form privacy notices, a process with which the DOC was “thrilled” and with which stakeholders seemed largely pleased.

The first meeting of this series focused on the technology’s capabilities now and what they are likely to be in the future, the commercial applications to date and what kind of code might be needed to govern its uses.

Fundamentals of the Technology

Panelists Jan Macháček, chief scientist at Paycasso, and Jonathon Phillips, an electronic engineer at the National Institute of Standards and Technology, told stakeholders that facial recognition technology is increasingly being used in hospitals to identify patients, by law enforcement to identify threats, by retailers to identify customers and by casinos to identify problem gamblers. The biometric technology works by identifying the attributes of a face and creating a unique ID via algorithm. Compared against another photo, or, increasingly, against an entire database of photos, contemporary technology can quickly determine mathematically whether one face matches another. Compared with fingerprint biometrics, it is considered slightly more accurate for positive matches, given completely favorable lighting conditions and a compliant subject not wearing a hat and glasses, and it is favored because it requires no physical contact.
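
As a rough illustration of that matching step, the sketch below compares a probe face “template” (an embedding vector) against a small database of templates and reports the closest entry above a similarity threshold. The embedding model itself is assumed to exist elsewhere; the vectors and the 0.8 threshold are purely illustrative, not any vendor’s actual system.

```python
# Illustrative sketch only: each face photo is assumed to have already been
# reduced to a numeric "template" (an embedding vector) by some face model.
# Matching then becomes a distance comparison between vectors.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face templates; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe: np.ndarray, database: dict, threshold: float = 0.8):
    """Score the probe against every template in the database; return the
    closest entry if it clears the threshold, otherwise (None, score)."""
    scored = [(name, cosine_similarity(probe, tmpl)) for name, tmpl in database.items()]
    name, score = max(scored, key=lambda pair: pair[1])
    return (name, score) if score >= threshold else (None, score)

# Hypothetical database of enrolled templates.
db = {"person_a": np.array([0.9, 0.1, 0.3]), "person_b": np.array([0.2, 0.8, 0.5])}
print(best_match(np.array([0.88, 0.12, 0.31]), db))  # -> ('person_a', ~1.0)
```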

In addition, photos are widely available, especially with the proliferation of Big Data and the databases that store it. Passports aside, social networking sites have made photos publicly available to the masses. And once the face in a photo is identified, it’s good data for some time. In general, a photo has a shelf-life of about 10 years, Macháček said; assuming the person is healthy, their facial attributes don’t change rapidly.

“It’s a very, very powerful tool we have,” Macháček said.

How Is Facial Recognition Being Used Commercially Now?

While altruistic uses of the technology were noted, the focus of the meeting was on the ways in which the technology might be used for more nefarious purposes and the kind of safeguards that would be necessary as a result.

FaceFirst is a facial recognition software company specializing in large-scale mobile and surveillance systems. Its software matches faces against mug-shot photo databases and sends customers, including retailers, law enforcement and airports, a list of potential identifications; if it can’t make a positive match, it allows users to seek secondary authentication to lower the margin of error.

“Using secondary authentication goes a long way to prevent mismatches,” said FaceFirst President Joe Rosenkrantz, speaking on a panel about the commercial uses.
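
A hedged sketch of that kind of workflow, assuming a matcher that returns similarity scores between 0 and 1: candidates are ranked, and anything below a high-confidence threshold is routed to a secondary authentication step rather than treated as a positive identification. The names, scores and thresholds are illustrative assumptions, not FaceFirst’s actual system.

```python
# Hypothetical workflow, not FaceFirst's actual system: the matcher is assumed
# to return similarity scores in [0, 1] for entries in a mug-shot watch list.
from dataclasses import dataclass

@dataclass
class Candidate:
    record_id: str
    score: float  # similarity score from the face matcher

def rank_candidates(scores: dict, top_n: int = 5) -> list:
    """Return the top-N potential identifications, highest score first."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [Candidate(rid, s) for rid, s in ranked[:top_n]]

def decide(candidates: list, positive_threshold: float = 0.95) -> str:
    """Treat only a high-confidence score as a positive match; everything else
    is escalated to secondary authentication to lower the margin of error."""
    if candidates and candidates[0].score >= positive_threshold:
        return f"positive match: {candidates[0].record_id}"
    return "no positive match: require secondary authentication before acting"

print(decide(rank_candidates({"mugshot_17": 0.91, "mugshot_42": 0.62})))
```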

Calling into the meeting, the World Privacy Forum’s Pam Dixon wondered whether people whose faces are captured via passive surveillance are being auto-enrolled into a back-end database unknowingly.

Rosenkrantz said that while any database does have the ability to archive for the purpose of future searches, FaceFirst is specifically not doing that.

“I think the industry is going in that direction,” he said.

Consumer Action’s Michelle De Mooy asked about law enforcement use. What about in cases like the Boston Marathon bombing? How does sharing work? Can police take retailer data?

Rosenkrantz said that, to date, there have not been many cases in which facial recognition has been used to prosecute. But Morgan Reed, executive director of the Association for Competitive Technology, wanted to talk about how codes of conduct can be developed to allow for opt-outs and to avoid false positives. Using the example of his aging father, he wondered what recourse his father would have if he were falsely identified by a retailer as a shoplifter and subsequently banned from that location.

“Should we be considering how people can extricate themselves when wrongly ID’d?” Reed asked.

Rosenkrantz said his system facilitates best practices by design and doesn’t store sensitive data.

“It performs the matching using anonymized data in the form of templates and discards the data immediately for anyone who doesn’t match,” he said. “I think industry is on this page.”
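
One way to read that design is as a data-minimization pattern: match on derived templates rather than raw images, and let anything that doesn’t match fall out of scope immediately. The sketch below assumes hypothetical extract_template and similarity functions supplied by the caller and an illustrative threshold; it is not FaceFirst’s implementation.

```python
# Sketch of the data-minimization pattern described in the quote, with
# hypothetical extract_template() and similarity() functions supplied by the
# caller; the 0.9 threshold is illustrative.
def screen_face(image_bytes, watchlist, extract_template, similarity, threshold=0.9):
    """Return the ID of a watch-list match, or None. Non-matches are never stored."""
    template = extract_template(image_bytes)  # anonymized template, not a photo
    for entry_id, enrolled in watchlist.items():
        if similarity(template, enrolled) >= threshold:
            return entry_id  # only a positive match is reported onward
    # No match: the probe template simply goes out of scope here rather than
    # being archived, so people not on the watch list leave no record behind.
    return None
```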

He added that there are now guidelines and laws on surveillance, and “I think facial recognition fits naturally in that framework.”

Howard Fienberg, director of government affairs for the Marketing Research Association, sat next to Rosenkrantz on the panel and noted that, at one time, society was concerned about the privacy-invasiveness of photographs themselves, but we adapt.

“I happen to think some facial recognition uses could be creepy, but that doesn’t mean the technology in and of itself is,” he said.

What’s Needed in a Code?

The last panel of the day looked at technical privacy safeguards. Panelist Alison Howard, a senior attorney at Microsoft, noted that Microsoft’s Kinect uses a glowing light to indicate to users that images are being captured while they play a game and that opting out is always an option.

Her co-panelist, Joseph Atick of Identity Counsel International, said notice should be given to subjects when the live image is being captured.

“We should be looking for iconic signs that would be relevant that will tell me one of the images I have entrusted to Facebook or LinkedIn was added to a database,” he said. “When do I get notice that someone has vacuumed that image in? Or is the consumer aware that once you put that image on LinkedIn, you don’t have measures to protect it?”

Joseph Lorenzo Hall, chief technologist for the Center for Democracy & Technology, said perhaps the answer lies with making rules for data brokers themselves.

Such a solution “might put limitations on the buying and selling of consumer data that would augment the (facial recognition) systems,” he said.

Another solution, experts said, is half-toning, in which a photo is, in effect, scrambled. It involves deliberately “crippling” images, diminishing the risk that some webcrawler may one day add them to an undisclosed database.
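
A minimal sketch of that idea, using the Pillow imaging library: the photo is downscaled and dithered to a coarse 1-bit halftone before publishing, which keeps it recognizable to people while discarding much of the fine detail an automated matcher would rely on. The scale factor and file names are illustrative.

```python
# Sketch of the half-toning idea using the Pillow library; the scale factor
# and file paths are illustrative choices.
from PIL import Image

def halftone_for_publishing(src_path: str, dst_path: str, scale: int = 4) -> None:
    """Downscale and dither a photo to a coarse 1-bit rendering before posting."""
    img = Image.open(src_path).convert("L")  # grayscale
    small = img.resize((img.width // scale, img.height // scale))
    dithered = small.convert("1")  # 1-bit with Floyd-Steinberg dithering
    dithered.resize((img.width, img.height)).save(dst_path)

# Example (hypothetical file names):
# halftone_for_publishing("portrait.jpg", "portrait_halftone.png")
```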

But Atick said technical solutions aren’t enough.

“Technical measures alone will not be able to get these issues resolved and protect our privacy in a way we would like to see it,” he said. “Technical measures alone will only raise the barrier to someone intending to bypass them.”

“It is about the data,” Atick said. “This has to be about consumer control. If you’re using my faceprint without my authorization, you should be liable for misappropriating something that belonged to me.”

The next meeting, which will be live streamed, is slated for February 25 at 1 p.m. EST and will see discussions on the potential scope of a code and establish a procedural work plan.
