While the U.S. Federal Trade Commission has held workshops in the past on everything from facial recognition technology to big data use, yesterday’s PrivacyCon event was clearly the commission’s most significant undertaking ever in the privacy and information security space. A no-nonsense event that started at 9 a.m. and finished well past 5:30 p.m. at Washington’s Constitution Center, PrivacyCon featured 19 different research papers, accompanied by five separate panel discussions of the papers’ findings, before some 600 attendees.

The result was a textured picture of consumer expectations and interactions with the digital marketplace, alongside a look at vulnerabilities in the Internet of Things and mobile industries. With no opportunity for questions, it was an intense download of information for front-row FTC commissioners and attendees alike.

“Strong research informs strong policy,” said FTC Chairwoman Edith Ramirez in her opening remarks, and the cadence of PrivacyCon did sometimes feel like a drumbeat of privacy and security “problems,” followed by grasps at potential policy solutions.

As Ramirez noted, tech researchers played vital roles in enforcement efforts against the likes of Snapchat, Fandango, and Oracle, and it was clear the commissioners were eager to better understand where they should best focus their efforts in the future.

[quote]We must craft policies that are based on innovative thinking and breakthroughs we make through research. -FTC Chairwoman Edith Ramirez[/quote]

“We’re just now scratching the surface of what is to come as a result of technological advancement,” Ramirez said. “We must craft policies that are based on innovative thinking and breakthroughs we make through research.”

Some of the papers presented contained data that many privacy professionals wouldn’t find particularly surprising. A University of Pennsylvania project led by Joseph Turow, for example, found “most Americans don’t have the basic knowledge to make informed cost-benefit choices” when navigating the web. The paper, “The Tradeoff Fallacy,” concludes that American consumers, rather than being comfortable trading bits of personal information for discounts or services, are simply resigned to the fact that “they have no other choice if they want to live in this world,” Turow said.

In fact, this idea of a tradeoff, he added, “is a fig leaf marketers use to justify what they’re doing.”

Of course, the idea that consumers don’t fully understand how their data is collected and used is something the privacy community has been grappling with since the advent of targeted advertising and data collection.

Ashwini Rao and colleagues at Carnegie Mellon University presented similar findings in “Expecting the Unexpected: Understanding Mismatched Privacy Expectations Online,” which focused on issues like the common consumer assumption that the mere existence of a privacy policy means a site can’t sell data to third parties. The paper suggests privacy policies could be simplified and targeted at just those issues that are commonly misunderstood.

Along the same lines, UC Berkeley’s Chris Jay Hoofnagle presented “Alan Westin’s Privacy Homo Economicus,” which further challenges the idea that consumers make rational cost-benefit decisions with their personal data. “The key point,” Hoofnagle said, “is that Alan Westin’s theory was based on rational choice theory, and his main thesis was that public policy should serve the privacy pragmatists, people who weigh choices in the marketplace and make informed choices.”

However, the paper’s findings show “most [consumers] are simply uninformed.”

To pile on, Andelka Phillips of the University of Oxford and Jen Charbonneau of the University of Tasmania presented “Giving Away More Than Your Genome Sequence,” a look at genetic testing services that found “consumers often display inattentional blindness online,” even when dealing with one of the most sensitive classes of data: genetic information.

So, what to do? Carnegie Mellon’s Norman Sadeh, CIPT, presented a still-unpublished paper describing an intriguing “personalized privacy assistant,” whereby a user could set general privacy preferences and have the assistant apply settings automatically when an app is downloaded or a website visited. One of the barriers, however, is that it would require industry to open up application programming interfaces that are, for the most part, currently closed.
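To make the concept concrete, here is a minimal, hypothetical sketch of how such a rule-based assistant might behave, assuming a simple permission-and-category model; the names below (Preference, PrivacyAssistant, apply_to_app) are invented for illustration and do not describe Sadeh’s actual system:

[code]
# Hypothetical sketch of a "personalized privacy assistant" -- not Sadeh's
# actual system. The user sets general rules once; the assistant applies
# them automatically when a new app is installed.
from dataclasses import dataclass, field

@dataclass
class Preference:
    """One general rule, e.g. 'deny location access to games'."""
    permission: str    # e.g. "location", "contacts"
    app_category: str  # e.g. "games", or "*" for any category
    allow: bool

@dataclass
class PrivacyAssistant:
    preferences: list[Preference] = field(default_factory=list)

    def decide(self, permission: str, app_category: str) -> bool:
        """Return True (allow) or False (deny) for one requested permission."""
        for pref in self.preferences:
            if pref.permission == permission and pref.app_category in (app_category, "*"):
                return pref.allow
        return False  # default-deny when no rule matches

    def apply_to_app(self, app_category: str, requested: list[str]) -> dict[str, bool]:
        """Run automatically at install time: settle every requested
        permission from the user's general rules, with no per-app prompts."""
        return {perm: self.decide(perm, app_category) for perm in requested}

# One general preference then covers every future install in that category.
assistant = PrivacyAssistant([Preference("location", "games", allow=False),
                              Preference("contacts", "*", allow=True)])
print(assistant.apply_to_app("games", ["location", "contacts", "camera"]))
# -> {'location': False, 'contacts': True, 'camera': False}
[/code]

The default-deny fallback in this sketch is one design choice among several; the appeal of the approach is that the user answers a handful of general questions once rather than clicking through settings for every app, which is exactly why it would depend on operating systems exposing the settings APIs the paragraph above mentions.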

In the shorter term, “We could start seeing privacy policies as seals,” Hoofnagle offered. Like “organic” labels on fruit, “we could start saying that 'privacy' means certain things.” Already, he noted, if a policy says anything about security, the FTC has interpreted that to mean you have to actually provide some kind of baseline of adequate protection. 

Similarly, University of Oregon’s Heather Shoenberger and University of Florida’s Jasmine McNealy suggest in their abstract for “Offline v. Online: Re-Examining the Reasonable Consumer Standard in the Digital Context” that some use of iconography and heuristics may move the needle in aligning consumer expectations with reality.

Like the AdChoices icon? Turow’s paper found no one seems to see it.

Genie Barton, director of the Advertising Self-Regulatory Council’s accountability program, with a fresh pair of enforcement actions in hand regarding a missing AdChoices icon, was willing to listen. “The Council of Better Business Bureaus’ whole mission is to increase marketplace trust by improving relationships between consumers and ethical marketers,” she said in a hallway interview. “I wish the research community and the industry community were less antagonistic toward one another.”

Barton noted that consumer education is a problem for lots of industries. “I looked at the statistics around recycling knowledge,” she said, “and after years of seeing the recycling symbol on a trashcan in a public place, a majority of people now know what the symbol means. But under 30 percent of them know what to recycle. And that’s a lot simpler than digital privacy.”

[quote]What about the contents of my Chipotle burrito? I don’t necessarily know where all the contents were sourced from, but it doesn’t really matter until I get sick. -Alan McQuinn, Information Technology and Innovation Foundation[/quote]

On a different tack, IAPP VP of Research and Education Omer Tene played devil’s advocate, suggesting consumers, rather than being resigned, as Turow suggests, are “actually thrilled, even delirious, about these technologies. They can hail the Uber and rate the driver and get the new Android or iPhone and, yippee!, a selfie on their Snapchat … There seems to be something more complex at play here. We see it in other contexts: I care about health, but I eat a cheeseburger. I care about the environment, but I drive an SUV. Winters are rough in New England.”

As a corollary, the Information Technology and Innovation Foundation’s Alan McQuinn was among a few speakers who wanted to make sure harm was considered in the context of the papers. “What about the contents of my Chipotle burrito?” he wondered aloud. “I don’t necessarily know where all the contents were sourced from, but it doesn’t really matter until I get sick. We’re talking about what’s in the burrito, not the food poisoning.”

This led Jasmine McNealy to respond that if Chipotle mischaracterized the contents of its burritos, for example by leading people to believe the contents were organic when they were not, most people would take issue with that, whether or not there was actual harm from eating the non-organic ingredients.

In the end, it was clear that the Federal Trade Commission has its hands full as it navigates its options for regulating the use of personal data and how that data is protected. New FTC Chief Technologist Lorrie Cranor, CIPT, herself a researcher at Carnegie Mellon, closed the conference with short remarks that noted her intention to bring policymakers and researchers more closely together, to continue the FTC’s mission of making good policy by using good data and information.

“Knowing what consumers want isn’t enough,” Cranor said. “What do we do with that information and how do we use it to start to create actual transparency?” 

The privacy profession will be heavily invested in the answers to those questions.