Are you on board with data governance yet? Like “accountability” before it, “data governance” is a term catching on with regulators and maturing privacy programs to describe something that goes beyond compliance with current laws regulating the collection and use of personal data.
Perhaps the most apparent departure from the way typical privacy programs operate is the suggestion, in the IAF’s effective data protection governance (EDPG) model, that personally identifiable information be but one piece of a larger look at how an organization collects and uses all types of data.
“The big change in the model we’re thinking about is that it’s not just about personal information, it’s about all data,” said Peter Cullen, CIPP/US, executive strategist at the IAF. “Increasingly, even innocuous data can be used for predictive insights about individuals. So, in effect, this model is about the data and the use of that data.”
Why the shift in focus? Cullen points to the data generated by a mobile device, by way of example. Many of the data points that could be collected about the device and its use would normally be seen as harmless, and certainly not sensitive: when it’s in use, how fast it’s moving, its three-dimensional position on the earth. Much of the data simply makes the device perform the functions the user asks of it. Taken together and analyzed with today’s data-processing tools, however, those points could yield a robust picture of the user’s health, which might be very sensitive information indeed.
Were a privacy impact assessment to be done on any one of those data collection activities individually, it’s very possible no flags would go up.
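The point can be made concrete with a toy sketch. Everything here is hypothetical (the fields, the thresholds, the inference rule are invented for illustration, not drawn from any real assessment): each data point looks harmless in isolation, but the combination supports a health-related inference that no single-field review would flag.

```python
# Hypothetical illustration: individually innocuous device signals
# combined into a potentially sensitive inference.

records = [
    # (hour of day, speed in km/h) sampled from a fictional device
    (6, 9.5), (6, 10.1), (7, 9.8),   # early-morning movement at running pace
    (12, 4.0), (18, 3.5),            # midday and evening walking pace
]

# Any single record reveals little; aggregated, a pattern emerges.
morning_runs = [speed for hour, speed in records if hour < 8 and 7 < speed < 15]

if len(morning_runs) >= 3:
    # A derived, arguably health-related insight from "harmless" inputs.
    inference = "likely a regular morning runner"
else:
    inference = "no pattern detected"

print(inference)
```

A PIA scoped to any one field (time of use, or speed, or location) would likely see nothing sensitive; the sensitivity only appears at the point of combination and analysis, which is exactly the gap the CDIA is meant to address.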
That’s why the IAF is advancing a framework to provide more effective data protection governance, part of which is something potentially more meaningful for organizations collecting and using data: a comprehensive data impact assessment, or CDIA.
“It’s a more nuanced way to think about the data,” said Cullen. “But it puts the individual at the center. It’s more aligned with the way customers want organizations to consider their data, and more in tune with the way regulators are thinking.”
The CDIA would enable an organization to think about a piece of data’s possibilities, focusing on the nature of the data itself. The great promise of big data is that it’s impossible to know what insights might become apparent once the data is collected and analyzed. That’s also, of course, where additional risks can rear their heads.
“We want knowledge to be created,” said Cullen, “but also a structure within an organization to help people notice when it might have an impact on the individual and then engage with the individual when this impact should be accompanied by meaningful choice and consent.” It’s forcing an organization to think more deeply about when “privacy” is in play and extending the risk-benefit analysis to the broader set of rights and interests of individuals and other stakeholders in the data ecosystem.
“We have to build that infrastructure,” argued Cullen, “because we’re moving into algorithmic-driven artificial intelligence, and these things are going to happen almost autonomously. We need to have this structured way of thinking to make sure that even when we have this automated way of processing information we can make sure the individual’s rights and interests are being protected.”
Already we’ve seen many organizations move privacy impact assessments earlier in the product-development lifecycle, from just prior to release all the way back to the drawing board. In a big data environment, the CDIA comes before the drawing board, when what might be seemingly innocuous data is collected, or combined with other data that also seems innocuous. It also looks forward to deal with the possibilities of subsequent data analysis.
“We’ve got to incorporate this right from the point of ingestion,” said Cullen, “especially when data is coming from outside the organization. It must be ingested with policies surrounding accuracy, the provenance of the data, and what obligations come with it.”
“I might a year from now generate a new process that generates new insights about individuals,” said Cullen, “which is a new business model, even though the data itself might have been collected a year ago under the assumption there was no sensitive data. Contrast that with a PIA that’s emphasizing data collection and purpose specification. We have to think in more of a fluid manner. … We’re recognizing that there isn’t a single piece of data that only has one use for one purpose. When we bring that together with the data analytics we can do now, how do we create a model to make sure that the new use is looked at and the new analytics are considered and we are mitigating the risks to the company and the individual?”
Or flip that around: How do we ensure that the value of data to an organization isn’t going to waste? When you look at “Responsibly leveraging data in the marketplace,” a paper Cullen helped pen for PwC as their privacy innovation strategist, you see that in some places the real risk might come from sitting on your hands.
“I’m struck by the amount of effort that organizations are putting in to prepare for the GDPR,” Cullen said, “and they’re really looking at it as a big compliance obligation.” Almost in parallel, in a different silo, those same organizations are demanding that their people find new ways to leverage the value of data.
“If you do all this right,” mused Cullen, “you move privacy from a cost of compliance to an enabler of this data-centric strategy. It’s a much more positive way to think about the discipline of privacy. It’s a way that privacy should be thought about: What’s my organizational strategy around data, and how should I think about the range of risk and opportunity and what governance capabilities my organization might need to fully manage those risks and opportunities?”
Maybe a start down that path is incorporating the concept of a CDIA and data-use governance into organizational thinking.