
The Privacy Advisor | Digging deeper into ‘Privacy on the Ground’

At the IAPP Global Privacy Summit in 2016, Deirdre Mulligan and Kenneth Bamberger were presented with Privacy Leadership Awards and gave a keynote address based on the work that led to “Privacy on the Ground,” a book documenting the way privacy is actually done inside organizations around the world. 

But, says Ari Waldman, director of the Innovation Center for Law and Technology at New York Law School, “there is more to privacy than Bamberger and Mulligan’s work. They pioneered this research agenda, and I’m building on it, hypothesizing that there needs to be a buy-in from the engineers on the ground who are creating technology products.”

Such is the basis for “Designing Without Privacy,” Waldman’s paper-in-progress, which conference participants voted the IAPP award winner at the recent Privacy Law Scholars Conference in Berkeley, California. The research — to be published in the Houston Law Review, presented at the upcoming Privacy. Security. Risk. conference, and, he hopes, eventually expanded into a book-length work — moves the center of the research to a different spot in the organization.

While Bamberger and Mulligan focused on the chief privacy officer and the privacy office, Waldman focused on the work of the technologists, computer scientists and engineers working to develop all of these innovative products that present these new and interesting privacy dilemmas.

What did he find? “Even at companies that are doing great work in integrating privacy into the core of their companies,” Waldman said on the phone from New York, “it’s not always getting down to the design meetings for various reasons.” Often, he said, the fundamentals of privacy by design were in place — polices, procedures, checklists to follow — “and, in some companies, they followed those checklists, and in some companies, no one really cared.”

The “biggest difference,” Waldman said, “is how powerfully the companies endorsed the implementation of the privacy protocols.”

This jibes with research done by the IAPP recently. In “Assessing and Mitigating Privacy Risk Starts at the Top,” a collaboration with Bloomberg Law, we found that leadership buy-in and corporate training are seen by privacy professionals as the two most important factors in mitigating privacy risk, beyond infosecurity measures and IT capabilities.

Waldman said the barriers to true privacy by design are often relationship based and didn’t have much to do with technology or the sophistication of the privacy policy. “Trust is important,” Waldman said, noting many privacy leaders are lawyers. “You have to build trust with your clients, and even when you’re in house, your clients are your company’s employees.”

In interviews with engineers and product designers, “people would say, ‘I get that that’s important, but that’s not my job. My job is to code this. My team has a lot to do. There are 58,000 things to do in the next few weeks. Other people do privacy.’”

Privacy professionals, many of them coming from a compliance or legal background, are great at following the rules. Technologists and engineers are often taught to break them, to forge new ground. Waldman said one of the areas he’s looking to explore more deeply is how we educate technologists as a society and “how that influences the way they talk to the other parts of the company.”

Further, he said, “there’s a territoriality to it. They say, ‘I don’t want lawyers in what I do, just like they don’t want me involved in what they do.’” That self-definition of what “I do” and what “they do” can lead to a breakdown in privacy by design.

“When you talk to technologists about things like differential privacy,” Waldman said, “they’re like, ‘Yeah, I get that.’ It’s reducing privacy to an equation; that’s their bailiwick. But the problem with that is that it oversimplifies the privacy concerns. It doesn’t effectively include ethics and discrimination issues that result from design. You can’t have an adequately designed product where the only thing that comes into the design process is equations.”

So, what to do? “It’s not all going to be bad,” Waldman protested, after a long talk on what’s wrong. “Some companies and schools are really doing a good job in this area.” He points to the Carnegie Mellon graduate program in privacy engineering, where ethics and privacy are front and center. “It’s about building trust and integrating interdisciplinary work. … And we need for lawyers to do that, too.”

He also points to Georgetown’s coding-for-lawyers class and to a technology-for-lawyers working group at his own school, where you can learn things like how behavioral advertising works.

But “interdisciplinary” isn’t just for schools. Companies and organizations, in general, need to do a better job of making sure employees in different portions of the organization actually communicate with one another. Leaders need to make it clear that banning lawyers from design meetings because they slow things down just isn’t allowed. And privacy teams can’t create policy without the input of the people who’ll actually be following it.

Technologists don’t come to big Silicon Valley firms and cool startups to follow policy. They come to solve hard problems, to push the envelope, to have resources and time to get into the weeds. Unfortunately, this means many of them can go “weeks and months,” Waldman said, “without seeing anyone in the company other than fellow computer scientists.”

That bubble can lead to ethical and discriminatory blind spots. Eliminating them starts — you guessed it — at the top.
