
RSA Dispatch: Talking FIPPs and Geeks with Google, Microsoft and McAfee

These are uncertain times. User trust is at an all-time low; the models upon which governing data-use principles were built are outdated, and it’s time for a shift in how policy people and engineers work together in order to address these problems.

Those were just some of the takeaways from a highly attended and wide-ranging RSA session Wednesday on “Hot Topics in Privacy,” moderated by IAPP CEO Trevor Hughes, CIPP, and featuring a panel of chief privacy officers from Google, Microsoft and McAfee.

But first they addressed the 800-pound gorilla in the room: the NSA.

Google Senior Privacy Counsel Keith Enright, CIPP/US, CIPP/G, started the session by making one thing clear: Google didn’t play the role in the “Summer of Snowden” that some thought it had, given the headlines at the height of the NSA surveillance revelations last year.

“One thing that’s important to understand … is that we did not grant direct access to the government to internal Google systems. We have no evidence that such access actually occurred,” Enright said.

Despite such allegations, knowing what governments are doing with the data of their citizenry is certainly important, Enright said, adding that transparency is key if accountability is a desired outcome. He added that without accountability, there can’t be change. Ultimately, Google’s success depends on user trust, Enright said, and the Snowden revelations turned user trust on its head.

Microsoft CPO Brendon Lynch, CIPP/US, said government surveillance is a complicated issue, because security professionals see it as a security issue while privacy pros see it as a privacy issue.

“And both are right,” Lynch said.

McAfee Vice President and CPO Michelle Dennedy, CIPP/US, CIPM, stressed it’s essential for the public to recognize that when we talk about government access to data, we’re talking about governments—plural, as in not just the United States.

Enright highlighted the knee-jerk reaction by some to restrict the global storage of data. German Chancellor Angela Merkel, for example, has recently called for a separate European Internet. That’s not the right move, the three panelists agreed.

“There’s been an emotive reaction to move to some sort of data localization,” Enright said. “But it’s important people understand localizing data does not prevent government access.”

Pictured: Trevor Hughes, Michelle Dennedy, Keith Enright and Brendon Lynch.

Indeed, it would be the wrong approach, Dennedy said. “Balkanization of data is the easiest way for the average citizen to lose control of their data.”

Lynch agreed, saying he could “understand the temptation of those discussions. But there’s no silver bullet for this thing.”

Hughes noted that there is now a great deal of public-policy discourse in both the U.S. and Europe. In the U.S., the FTC has been active, particularly of late, in settling privacy cases. The White House has stepped in, introducing a Consumer Privacy Bill of Rights, and it will soon hold a number of workshops on Big Data. It has also initiated a number of multi-stakeholder workshops on developing self-regulatory codes of conduct for technologies that raise privacy concerns.

Notice and Choice

Current models of public policy are based on the Fair Information Practice Principles (FIPPs)—pillars of which are notice and choice. But such concepts are strained in these times. Nobody reads privacy policies, and, as such, even informed users tend to simply yield to the default that’s offered them. So how do we move forward if the pillars on which we base fair practices no longer apply? Hughes asked the panel.

“Current applications of FIPPs are strained because of a reliance on notice and consent, but there are other principles such as accountability that perhaps need to be invoked more,” Lynch said. “The vast majority of people just want to be safe; they want the use of data to be providing value to them in an appropriate context. As we look forward to a world where computing is more ubiquitous and data collection is more ubiquitous, the Internet of Things and ‘wearables’ won’t necessarily have a user interface to have a notice-and-choice moment.”

“A notice is an aspirational quality, but a notice is also a thing. It’s a verb. Engineers love verbs. How do we deliver that physically? That’s the language we need to speak with our engineers. And if you’re an engineer coming to your legal people, it’s okay to say, ‘I don’t understand what the flip you’re talking about. Tell me in zeroes and ones and not with (vague concepts like) reasonable.’ It’s really hard to code ‘reasonable.’”

McAfee VP and CPO Michelle Dennedy, CIPP/US, CIPM

Logistically, notice and choice may not be viable, he said. “Just the sheer volume is going to make decisions about the collection of every piece of data impossible.”

Lynch said the solution may lie in shifting some of the burden to the providers of the latest technologies.

“Responsible use, organizations being accountable and better expectations of what privacy norms would be are going to be key,” Lynch said.

Dennedy said the FIPPs are pretty good ideas, but we’ve over-relied on legal notice. When deciding how to treat data generated by the Internet of Things, context will be key, she said.

“What happens when you get in your car? Is the data going to Google or your optical provider to say your prescription is off today? We need to start with the basics,” she said, citing Harvard Law professor Lawrence Lessig’s assertion that, no matter the philosophy or intention, “Code is law.”

“This is a vast opportunity to design something that follows all 11 FIPPs,” she said.

But how do we go about providing that context and giving users choice?

Enright said there are three elements to think about. First, it’s now clear that a privacy policy isn’t the way to get users to understand what’s happening with their data at a fundamental level. Second, the technology itself is grinding up against old legal traditions and analogue laws. And finally, users are changing. “Possibly as quickly as the technology is changing. All over the world, users are getting more sophisticated,” he said.

While the legislative and policy community works to solve these problems, consumers are moving a whole lot faster, aiming to use technology in new and creative ways. All of this is to say that there’s no clear answer, he said. But it will be important to continue listening to users so that new legislation isn’t crafted in a vacuum.

“Keeping our fingers on the pulse of users and what they’re asking for is going to require a very, very thoughtful process,” he said.

Lynch agreed that you can’t make assumptions about privacy norms and what people want. Some people want different data experiences than others do. Some want fine-grained controls; others don’t really care and just want to know they’re safe. There’s a spectrum.
