

A blog post on Freedom to Tinker caught my eye on Tuesday. In it, Arvind Narayanan talks about using encryption as protest. As he points out, in computer science, privacy-enhancing technologies (PETs) using crypto are generally used to protect anonymity or confidentiality against an adversary. But Narayanan also connects us with the thinking of Helen Nissenbaum on the political and ethical theory of obfuscation, which is “a strategy for individuals, groups or communities to hide; to protect themselves; to protest or enact civil disobedience, especially in the context of monitoring, aggregated analysis and profiling.”

Going further, Narayanan looks at “the hypothesis that users of encryption tools also have protest and civil disobedience in mind, instead of (or in addition to) self-defense or anonymity.”

Both good actors and bad have long used PETs to accomplish their goals—whether it’s a journalist protecting a source in a repressive regime or a child pornographer accessing and distributing illegal images. Yet, in another side effect of the Snowden disclosures and the reaction of people around the world to the overpowering surveillance capabilities of the state, PETs and encryption may now have a new and significant role. Ubiquitous surveillance, particularly by the NSA, has made some consumers gravitate to PETs, but it’s also affected how many non-U.S. users view U.S. business—enough to motivate the big tech companies to start rolling out and integrating PETs and encryption.

We’re seeing encryption and privacy make their way into the mainstream. Google is encrypting traffic between its servers, building a PGP extension for Gmail called End-to-End and even playing the name-and-shame game by identifying domains that don’t encrypt data. DuckDuckGo is teaming up with Apple to offer built-in private searches in the Safari browser, and more companies, including Yahoo, Facebook and Twitter, now use HTTPS by default.

A few months back, I decided that I should know how to use some basic encryption tools. I downloaded the Tor browser, read up on encrypted e-mail and put the Wickr app on my phone. Yet I haven’t been actively using them: the Tor browser can be pretty slow, I still haven’t gotten around to setting up encrypted e-mail, and Wickr sits unused on my phone because none of my friends use it.

(More on this to come: I plan on using and documenting my experiences as a novice PET user to see just how easy or difficult it is to use them.)

And that’s the catch about which Narayanan writes. “There is an inescapable trade-off between convenience and security—tools that put security first require user training and informed decisions, whereas insecure ones pitch themselves as one-click solutions.” And as he points out, “those that cater to the lowest common denominator will win.”

We know from a new survey by EMC that people love the ease of use of their devices but are not willing to give up more privacy to get it. In fact, only 21 percent said they were willing to make that trade. But how many are willing to trade ease of use for more security and anonymity?

Narayanan goes on to consider what can be learned from a market that naturally sinks to the lowest common denominator. “First, security and privacy researchers should study how users actually use PETs instead of assuming that all users have the same set of values and preferences.”

That’s sound advice. Privacy and security pros, ask yourselves this: Why do users want your services to be encrypted? Will the privacy and security enhancements be easy to use and therefore embraced, or will giving users the privacy they say they want inevitably drive them to a different product?

photo credit: VCU CNS via photopin cc

