Facebook made news—for a positive reason, this time—by announcing a new set of privacy controls to help users understand with whom they are communicating. Last month, at the IAPP Global Privacy Summit, a Facebook privacy executive foreshadowed this rollout by exclaiming, “If people are surprised, that’s not good for me.”
What did she mean, exactly?
Well, in a Big Data and Internet of Things world, where providing users with notice and choice often proves extremely difficult, if not impossible, surprise minimization is becoming a powerful tool businesses can use to help engender trust with consumers while avoiding the ire of regulators. In other words, this is about recognizing context, knowing what customers expect from your services and being accountable to them. A successful company should be asking: “Would our users be surprised by what we’re doing with this information? Is what we’re doing going to creep them out?”
And take note: The regulators are certainly paying attention to surprise minimization.
Look at the California attorney general’s Privacy On the Go recommendations, for example. This groundbreaking guide for mobile app developers, platform providers and device manufacturers states, “Recognizing that the legally required general privacy policy is not always the most effective way to get consumers’ attention, Privacy On the Go recommends a ‘surprise minimization’ approach. This approach means supplementing the general privacy policy with enhanced measures to alert users and give them control over data practices that are not related to an app’s basic functionality or that involve sensitive information.”
And in Europe, regulators expect organizations to use surprise minimization as well.
At last year’s 35th Annual Conference of Data Protection and Privacy Commissioners, Dutch Data Protection Authority Chairman and former Article 29 Working Party Chairman Jacob Kohnstamm emphasized that people in an increasingly ‘appified’ society need control over their data. As one account of his remarks put it, “In fact, he emphasized particularly that ‘surprise minimization’ is a major focus for global DPAs going forward.”
In yet another example, just-in-time disclosures—a sort of warn-and-notify-in-real-time approach—are an essential component of the Federal Trade Commission’s report on mobile privacy disclosures. “Provide just-in-time disclosures and obtain affirmative express consent before collecting and sharing sensitive information,” it states.
Technology is changing so fast that users often don’t realize what’s being done with their information. In fact, just this week, the FTC reached a final order with Goldenshores for deceiving customers. Who knew that a flashlight app would share your precise location with third parties?
Surprise!
Beyond mobile privacy concerns, as connected cars, appliances and wearables become more ingrained in our daily lives and consumer-facing interfaces less so, surprise minimization will play an even more powerful role. If I buy a connected refrigerator, I might expect that, yes, it can tell me I’m out of milk or cheese or, god forbid, beer when I’m at the grocery store, but would I expect the software maker to share that data with a health or life insurance firm? No. A solid business will notify me and give me a choice when the data being shared, or the party it’s being shared with, falls outside reasonable expectations.
In another example, if I’m using Google Maps, sure, I understand my location is needed for it to work. I want it to use my location! But if it’s uploading my contact list, that surprises me. Instead, it should notify me and provide a choice to opt out of that function.
Really, surprise minimization is a way for businesses to regulate themselves. And this is important because it’s readily apparent that consumers often face a gap between their expectations and the services they actually receive.
Last November, a widely shared social media experiment showed the disconnect between what social media users think they’re sharing and what they’re actually sharing. Tweeting seemingly innocuous things like “what I’m having for lunch” or “my dog’s name” and then being approached by someone, in person and in that actual public space, who knew about such data opened a lot of eyes. One reader commented, “I would definitely say it creeps me out that a total stranger could get that kind of information about me.”
Yet people make it easy for that very thing to happen all the time.
On top of that, Twitter recently announced its new photo-tagging feature. People on Twitter were pretty excited, but did most realize the default was set to public tagging? As a user, I had an easy time going in and changing this, but I think it’s fair to wonder whether some users might be surprised when they find themselves tagged in a photo by someone they don’t even know or follow.
Making sure that the trust gap is as small as possible has deep value for a business. Just look at one recent study, which found that 80 percent of users are willing to give up personal information to a trusted company.
One of the best ways to keep their trust? Don’t surprise them!
Clearly, Facebook has realized that. Whether Facebook maintains and cultivates user trust may well depend on how well it predicts when it’s appropriate to display that privacy dinosaur.