Privacy Perspectives | Appealing to the Consumer Through Privacy Policies

With increased media coverage of privacy policies, consumers have become more aware of privacy issues involving data collection. Ironically, some revamped privacy policies detailing exactly what data is collected, and how it is used, have spurred more outrage than praise.

Even if a company does not run afoul of Section 5 of the FTC Act, the language and format of its privacy policies may still turn consumers away from its products and services.           

For example, Microsoft's Windows 10 end user license agreement (EULA) stated it would "automatically check your version of the software and download updates or configure changes, including those that prevent you from accessing the Service, [or] playing counterfeit games." Rather than reading the actual terms of the policy, reputable websites began publishing their own articles with titles such as, “Windows 10 Can Find and Disable Pirated Games.”

The result? Hundreds of commenters on each article exclaimed that this was “another reason not to upgrade,” leading to a panic that may have cost Microsoft potential customers.

But here’s the thing: Microsoft cannot actually find and disable unauthorized hardware and software on your PC. In fact, the quoted statement came from Microsoft’s Service Agreement, not the Windows 10 EULA. This Service Agreement applies to services such as Skype, Xbox Live and Windows Live games. Given Microsoft’s recent crackdown on pirated online gaming, this statement likely means pirated games will not be playable online through Xbox Live or Windows Live.

The lack of due diligence by a single publisher caused a flood of backlash against Microsoft for an alleged privacy infringement it did not commit. Although this issue concerns piracy more than privacy, it is an important example of how misinterpreted policy language can spread rapidly. Had this been a smaller company without such a global presence, the negative initial consumer reaction could have meant life or death for the new product.

In another example, AVG, which offers free antivirus software, made a good-faith attempt to be clear and transparent with its users about how and why it uses their data.

In the first sentence of its privacy policy, AVG stated that it uses “data to improve products and services . . . and to make money from our free offerings with non-personal data.” The mere idea that a company could make money off the collection of consumer data left many with an uneasy feeling.

In fact, the media spent more energy playing up the audacity of selling data for money than analyzing whether AVG’s new privacy policy meets reasonable expectations. And as in the Microsoft example, mass media and publishers fueled consumers’ fears through headlines such as, “AVG says it can sell your browsing data,” and “AVG uses non-identifying data to make money.”

We will likely have to wait until OPSWAT’s January 2016 report to find out how severely AVG’s market share has been affected, if at all. We do know, however, that AVG fell short of its third-quarter Wall Street estimates while its stock dropped 26 percent between August and November.

Why It Matters

When creating or updating privacy policies, companies usually focus on complying with Section 5 of the FTC Act through the mantra: Do what you say, say what you do. Most companies, however, stop at just that. Some companies have taken it upon themselves to ensure their privacy policies are fully transparent and free of legal jargon, but they fail to take it one step further: they fail to appeal to the consumer.

Writing policies in clear layman’s terms may seem helpful at first, but, as seen in the AVG case, such an endeavor can create more problems than solutions. AVG could have put less emphasis on its "money-making" purpose; that emphasis caused consumers and reporters to focus more on the ethics of selling data for money than on the actual terms of AVG’s updated privacy policy.

This is not to say that companies should make their policies any less clear or omit potentially concerning clauses. Rather, companies should simply shift the focus of their privacy policies to aspects that benefit the user.

Because the average user is becoming more aware of privacy issues, privacy policies are now being used as marketing tools. When companies offer similar products and services, consumers look to their privacy policies to help make the final decision.

To take advantage of this trend, companies need to tailor their privacy policies to consumers’ interests and concerns.

It is important to keep in mind that even though privacy policies are emerging as a new marketing tool to get an edge over competitors, consumers will still spend little to no time reading them. The goal, then, is to pitch the advantages of your policy as soon as possible, in the fewest words possible.

Here are three methods to help achieve that goal:

First, start the policy with the good. Tell consumers what is protected before telling them what is collected. Tell consumers what rights they have before telling them what rights third parties have. Think about why your policy is better for consumers than your competitors’ policies, then start with that reason. For example, instead of writing, “we sell your non-identified data to third parties,” try, “we protect your data by excluding your name and email when that data is shared.”

Second, use examples. Enumerated examples give your consumers peace of mind. Used in conjunction with the first method, they let consumers know right off the bat exactly what data is protected and how.

Third, make benefits conspicuous. No one likes reading walls of text. Make ample use of bullets, numbered lists and typographical emphases to distinguish the parts of the policy you want consumers to notice first.

This last method applies equally to layered policies and just-in-time disclosures. Layered policies help ease the burden of information presented to consumers; just-in-time disclosures allow consumers to give affirmative express consent when needed.

Consumers are more likely to read media articles about privacy policies than the actual policies themselves. While policy advocates and journalists help promote better practices through increased scrutiny, companies should make it as easy as possible for media outlets to highlight the positive aspects of their policies.

Bottom line: A privacy policy will become an effective marketing tool if it starts with the good, uses examples and makes benefits conspicuous.


2 Comments


  • Dale Smith, Jr. • Jan 4, 2016
    Sean:  I totally agree that privacy notices resonate best when they are designed and delivered with the consumer's point of view in mind.  Jed Bracy touched on this topic in a previous iAPP Daily Dashboard post: https://iapp.org/news/a/on-building-consumer-friendly-privacy-notices-for-the-iot/
    
    While AVG and others who have pioneered simple consumer-friendly disclosure have met with criticism, it is difficult to imagine a privacy world two years from now (GDPR time) wherein obfuscation of PII stewardship continues to be rewarded.  IMO, the trend to "Say What You Do, Do What You Say" is inevitable.
    
    Dale Smith, CIPT
    Futurist, PrivacyCheq
    
    The best practices you suggest can be particularly effective in building and delivering privacy notices for IoT products,  where a full-blown boiler-plate privacy policy is simply impractical.
  • Jay Libove • Jan 6, 2016
    Hi Sean, while mostly I agree, I have to take issue with the recommended alternate text you proposed for AVG's example of the underlying protection ("we sell your NON-IDENTIFIED data" [emphasis mine]) they use in support of "we make money". We must always read all privacy language that we write with the eyes and sentiments of the 'worst-smartest' person who will read our language in the real world; that is to say, someone who can understand and can coin a phrase and who can light the firestorm. (And then be picked up by the irresponsible media).
    
    That person would read the proposed alternate text "we protect your data by excluding your name and email when that data is shared" and - correctly - point out (in very catchy, repeatable prose) all of the clear examples of re-identification of inadequately anonymized data.
    
    There's no easy answer here. Sure, AVG might have come up with a better balance in terms of managing readers' reactions, but I think they were right to be frank about it rather than risk appearing Orwellian "We do (something, not enough) to protect you while (on the other hand, really, the thing you could accuse us of trying to distract you from is that) WE MAKE MONEY OFF OF YOU! WUHAHAHAHAH!".
    
    [To be clear, I don't think AVG is evil, and possibly is better than some of its competition in terms of its real privacy practices; I'm just putting myself in the 'opponents'' shoes]
    
    I feel that AVG threw itself on the privacy grenade for the rest of us. It chose to be the first one to really clearly (okay, maybe not quite clearly enough, but we're all still learning here) say "Look, we do a lot for 'free', but it's not really free - TANSTAAFL, and this is how you pay for it, and we hope you agree with the balance".
    
    We're all - those who make any use whatsoever beyond the most core privacy principles (collect only what is absolutely necessary, use it only as absolutely necessary to deliver only what the customer directly, knowingly asked for) which is all of us - going to have to handle that grenade. The GDPR brings the grenade closer.
    
    We can learn from AVG's fail? experience? noble effort?, but I urge great caution in 'learning' a wrong lesson - to say things in a way which the reader could feel is coercive or hiding the truth.