Andreas Weigend knows Big Data. As former chief scientist at Amazon and now a consultant on social and mobile technologies to global firms like Best Buy and Nokia, he works daily with companies to help them navigate what he calls the Social Data Revolution.
He’s even addressed the United Nations General Assembly and the FTC on the subject.
But what is Big Data in Weigend’s eyes, and how does that concept integrate with prevailing ideas on privacy and data use?
“Big Data is a mindset,” he said in an interview with The Privacy Advisor. “It’s really how you think about interacting with data; it’s the questions you’re asking and the response time of getting answers and refining the questions.” And that’s where the privacy professional is crucial. Sure, said Weigend, you need the actual data, as well as tools and skills with which to work with that data, but without the overarching mindset, it all falls apart.
That’s why he’s eager to address the crowd at the IAPP’s Data Protection Intensive in April as a keynote speaker. “You’ll have people in the audience,” he said, “who are in the business not of creating data sets or skill sets, but actually changing the mindset…Big Data is characterized by fast feedback not by analysis paralysis.”
Similarly, just as one might see “Safety First” signs at every turn when visiting an oil refinery, so, too, must privacy be ingrained in any organization engaging in Big Data. That way, the organization can take advantage of what Big Data has to offer without blowing the whole place to smithereens.
“Personal data is the new oil,” Weigend said. “Oil is something that needs to be refined to be useful. Our economies depend on it. Wars have been carried out over oil.”
Of course, you might object, oil is valuable because of its scarcity; data is wildly abundant! Weigend has an answer for that. Just as oil is fairly useless coming out of the ground and needs refining, so it’s the refined data, the true Big Data, that’s rare in today’s economy. “Raw data is pretty useless for most people,” he said.
Further, just because there are privacy implications to analyzing data doesn’t mean it can’t be done correctly. “Just because an oil refinery can blow up,” said Weigend, “doesn’t mean you stop using oil…We need to have a good way of dealing with these uncertainties and high-risk, low-probability events.”
His suggestion? Basically, utter transparency.
Imagine a two-by-two matrix with four cells: Expected benefits, unexpected benefits, expected drawbacks and unexpected drawbacks. “To move the privacy discussion forward,” Weigend argued, “one really needs to distinguish those four cells.”
If, in any interaction, the user of a service has a clear idea of the value in each of those cells, the user can make an appropriate decision. In that way, the privacy interaction becomes less oppositional and more cooperative: each side is clear on the potential benefits and drawbacks, and everyone has the chance to see a benefit in the transaction.
The unexpected downsides attract most of the attention, of course: the refinery blowing up; your data being leaked. But those unexpected downsides often depend on your situation, Weigend said, and on how much risk you’re willing to take on.
“The user fills in the weights,” he said. “If you have millions and millions of dollars in your bank account, you’re worried about the unexpected risk. But if you’re a poor person in India worried about having something to eat today, you’re not worried about some downside, some second-order risk. You’re worried about getting rupees today.”
He continued, “The privacy pro is the person who sees the consequences of the potential actions and can visualize and make tangible these probabilities, the benefits and the costs. The privacy pro can explain them so well that eventually individuals can make the right decisions for themselves.”
Editor’s Note: Hear more from Andreas Weigend at IAPP Europe’s Data Protection Intensive in London, April 23-25.