Finding the balance between innovation and risk management


Big data, perhaps an overused moniker, has in its most concrete forms challenged traditional privacy concepts, particularly parts of the Fair Information Practice Principles (FIPPs). Well-established business norms around notice and choice, for example, are becoming difficult to implement in big data contexts. Drawing bright lines between what is high risk and what is low risk is often an imperative for senior management. But with the sheer amount of data that can be collected easily and cheaply, and the drive to innovate and bolster the bottom line, how can privacy pros distinguish an acceptable innovation from a potential privacy risk?

“Senior management would prefer a brighter line, no doubt,” said Acxiom Ex Officio Chief Privacy Officer Jennifer Barrett Glasgow. “But the world of data has gotten so nuanced and context- and situation-specific, I don’t think excessive bright lines can be drawn. That’s why we’re seeing more and more that risk analysis is applied and that’s where judgment comes in.”

Glasgow, who was joined by FTC Commissioner Maureen Ohlhausen and Wilson Sonsini Partner Lydia Parnes at Privacy. Security. Risk. in San Jose, Calif., said that making the correct judgment about what is and is not an acceptable data use makes companies that want to comply very nervous. “Where we’re headed is to a more situational analysis that would take into account all stakeholders – something different from a privacy impact assessment – and takes in more considerations than privacy – this would include reputational and embarrassment factors,” she said.

"In some ways, we’re better at pointing out the scary things with emerging technology and innovation, but we’re not as good at pointing out the benefits." -FTC Commissioner Maureen Ohlhausen

For companies in the big data space, government regulators should not be the only worry. “We need to think about the fact that getting in trouble with the FTC isn’t the only governing factor here,” Ohlhausen pointed out. A brand can be tarnished, and consumer trust can be lost. “We've seen in the data security space a big impact on companies, even if they weren’t held liable.”

Finding an effective balancing test that weighs the need to innovate against the potential alienation of an organization’s user base requires nuance and context. “It comes down to a risk decision,” said Glasgow. “If it’s not clear if something we’re doing is right or wrong, we apply the creepy factor. If it begins to raise red flags, then you ought to think about using some mitigation around that data.”

Glasgow offered an example from Acxiom’s work with the health industry. On one level, it would be inappropriate to advertise health-related products to users based on their health records, but what if an individual uses an insulin pump and a company has designed a newer, more effective pump? “If I have diabetes, I might want to know about this new pump, but we want to be sensitive about it,” she said. “At Acxiom, if we work with health data, we will only let companies in the health space use that data – not, say, a hotel or airline.”
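
Purely as an illustration of how such a use restriction might be operationalized (the panel did not describe Acxiom’s actual system), one minimal sketch is a lookup that maps each data category to the client industries permitted to use it. Every name below is hypothetical:

    # Hypothetical sketch: restrict sensitive data categories to clients
    # in permitted industries, per the policy Glasgow describes above.
    PERMITTED_USES = {
        "health": {"health"},  # health data: health-space clients only
        "travel_preferences": {"travel", "hospitality", "retail"},
    }

    def may_use(data_category: str, client_industry: str) -> bool:
        """Return True if a client in client_industry may use data_category."""
        return client_industry in PERMITTED_USES.get(data_category, set())

    assert may_use("health", "health")           # insulin-pump maker: allowed
    assert not may_use("health", "hospitality")  # hotel chain: blocked

Unknown categories default to an empty set, so any use not explicitly permitted is denied, matching the conservative posture the panelists recommend.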

She added, “There are ways of doing these analyses: If it doesn’t feel right, what can I do to mitigate that?”

“One of the challenges in this space,” Ohlhausen noted, “as a regulator or a business, is that it’s such an evolving space. The social norms have to catch up with the technology, products and tools. In some ways, we’re better at pointing out the scary things with emerging technology and innovation, but we’re not as good at pointing out the benefits.”

“I’m all for taking a deep breath,” she added. “If you can put your finger on or identify a potential harm, I’d be very cautious.”

For Ohlhausen, businesses should be most concerned with sensitive data; beneficial uses of less sensitive data are less concerning. She said companies should not have to be overly prescriptive in describing potential future uses of data: “If you have to specify potential future use of data at the point of collection in your privacy notice, that gets difficult and will likely create an uninformative privacy policy.”

She also said she’s in favor of applying the FIPPs to sensitive data.

“As we embrace big data and employ more data scientists,” said Glasgow, “they need to be well trained in data science and be aware of the sensitivities we’re talking about. They should not have to be walking and talking FCRA experts. But I think we should not set the expectation that we’re striving for perfection. Is 80 or 90 percent enough?”

She added, “Because of this, I hope businesses do not fall into not innovating, and I hope that regulators do not impose unrealistic expectations.”
