
Big data: To expectations and beyond


""

Among the 10 privacy research projects awarded as much as $50,000 CAD by Canada’s Office of the Privacy Commissioner this week is one that many privacy pros have likely already glimpsed in part: The Information Accountability Foundation’s Big Data Ethics Initiative.

Already two and a half years in the making, the effort has been building toward a framework that organizations can use to make ethical decisions about uses of data that go beyond what those organizations might have explicit consent for. How do organizations balance the nearly limitless possibilities that big data analytics promise while still respecting the privacy of the individuals to whom the data belongs?

Pieces of the project have already reached the marketplace in various forms. At the recent Global Privacy Summit, the IAF’s Marty Abrams led a pre-conference workshop on big data project vetting “to assure fair and innovative data use.” At last fall’s Data Protection Commissioners’ Conference in Amsterdam, the IAF released a paper on “Enforcing Big Data Assessment Processes.”

Now, the IAF will bring together 20 Canadian companies to take the IAF’s model framework for big data ethical assessment and use it as a tool to ensure compliance with Canada’s PIPEDA, creating a “means to determine whether big data undertakings are legal, fair, and just and to demonstrate how that determination was reached.”

Because Canada’s law is built on consent and accountability, IAF head Abrams said in an interview with the IAPP, the country is particularly well suited as a test bed.

If the framework can work in Canada, then it’s likely to slot well into helping organizations decide whether a project fits into the legitimate interest legal basis for processing that’s been created by the new General Data Protection Regulation in the EU, “so there’s great interest by EU regulators in this Canadian project,” he said.

“The process that we’re talking about for legitimate interest and implied consent begins to link to whether a process is unfair,” Abrams said. “That unfairness test is very similar to what you would do in Canada and what you would do in Europe to show that you’ve balanced the interests of society and the individual correctly.”

Other jurisdictions in Latin America, Asia, and elsewhere are likely to have a keen interest as well.

But what about the concept itself? Isn’t the very idea of going beyond a person’s expectations and consent a violation of privacy in the first place?

Even in the EU, Abrams noted, “law anticipates places where it is incredibly important to use data even when consent is not in play”: to save a person’s life, for example, but also simply to execute a contract and deliver promised goods.

“Part of why the Canadian project is so important,” Abrams said, “is that it brings in the views of advocates. We’re going to have a multi-stakeholder process to vet what comes out of this.”

Yes, there is an interest in privacy. But there is also an interest in better healthcare, better education, sharing the benefits of technology, and economic growth. “There’s always been this anticipation of balancing interest,” Abrams said. “You need to recognize the full range of interests of all the stakeholders. Those folks who bear risk have to get a benefit, and you have to be able to demonstrate that. Consent is always the preferred methodology, but in a world where we have interests beyond just privacy, it’s important to have a balancing process to look at the full range of stakeholders.”

He pointed to the unfair practices concept in the United States, under which potential harm to the individual is weighed against benefits to that individual or to society as a whole. Even if a data use is actually described in a privacy notice, it might still go beyond expectations, simply because no one can be expected to read the whole notice and understand it.
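
To make the shape of that balancing test concrete, here is a purely illustrative sketch in Python. It is not the IAF’s framework (which the article does not specify in technical detail); every class, field, and rule below is a hypothetical simplification of the logic Abrams describes: identified risks must be mitigated, and those who bear the risk must receive a demonstrable benefit.

```python
from dataclasses import dataclass

@dataclass
class DataUseAssessment:
    """Hypothetical record of one big data balancing review."""
    project: str
    risks_to_individuals: list[str]   # identified potential harms
    mitigations: dict[str, str]       # risk -> how it is addressed
    benefits: list[str]               # to the individual or to society

    def risks_mitigated(self) -> bool:
        # Every identified risk needs a documented mitigation.
        return all(r in self.mitigations for r in self.risks_to_individuals)

    def passes_balancing(self) -> bool:
        # Proceed only if risks are addressed AND those bearing the risk
        # get some benefit; the onus is on the organization to show both.
        return self.risks_mitigated() and bool(self.benefits)


assessment = DataUseAssessment(
    project="readmission-risk model",
    risks_to_individuals=["re-identification"],
    mitigations={"re-identification": "aggregate outputs, access controls"},
    benefits=["earlier clinical intervention"],
)
print(assessment.passes_balancing())  # True
```

A real assessment would of course not reduce to a boolean: as the article makes clear, it would be documented for regulators and vetted through a multi-stakeholder process.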

“The onus is put on the organization to make sure the risks are mitigated,” Abrams said, “and the benefits are enhanced.”

Should organizations looking to profit be the ones making these decisions, though? “I have a real concern about corporate paternalism,” Abrams countered. “I am concerned about the sense that businesses are the judges. But the businesses are making those judgments with the understanding that there are regulators who are overseeing the process, and increasingly those regulators are empowered. … The role of the regulator is so important.”

If the project goes as planned, not only will organizations have a framework for decision-making, but regulators will also have a framework for holding them accountable.

“There are times when we’re bringing data together to create insights with incredible value,” Abrams said, “and I’d much rather see that done in a governing process that understands the risks and benefits than to rely on a system that is based on notices that might be clear and honest but are just so long and so complex that they are beyond the understanding of the individual.”
