In June of 2013, I wrote a piece wondering if the Snowden disclosures would give a massive boost to the privacy-enhancing technology industry. Well, 16 months later, it's safe to say it has. Not only are start-ups springing up left and right—think Ello, Blackphone, Wickr—but tech giants like Apple and Google are moving to encrypt user data as well.
In a panel discussion at the Privacy Academy a couple weeks back, both the Blackphone and Wickr CEOs stated that, without question, the flow of venture capital toward privacy-enhancing technologies has increased by leaps and bounds.
But some people are less than keen on these crypto lockdowns.
During the last week, law enforcement representatives, from the top on down, have been calling on technology companies to avoid strong encryption. Ominously choosing the Global Alliance Against Child Sexual Abuse Online conference to make his announcement, U.S. Attorney General Eric Holder said he hoped tech companies “would be willing to work with us to ensure that law enforcement retains the ability, with court-authorization, to lawfully obtain information in the course of an investigation, such as catching kidnappers and sexual predators.”
Others have gone further in their language. Last week, U.S. Federal Bureau of Investigation Director James Comey argued the encrypted technology recently rolled out by Apple on its iPhones allows customers to “place themselves beyond the law,” while Washington Metropolitan Police Chief Cathy Lanier told Bloomberg that encrypting mobile data “is a very bad idea.”
Does the average person really think that locking their personal data on their mobile devices puts them beyond the law? I certainly don’t think so.
In an op-ed, since corrected, former Assistant Director of the FBI Criminal Investigative Division Ronald T. Hosko wrote that Apple and Google's new encryption rules make law enforcement's job more difficult. He even went so far as to say that if those encryption rules had been in place previously, the agency would not have been able to rescue a kidnapping victim in Wake Forest, NC, forcing The Washington Post to later write, "This is not the case. The piece has been corrected."
Without getting too immersed in the cases for encryption—which I am for—there are several sources worth checking out, if you’d like to delve deeper. The Electronic Frontier Foundation has put together a blog post detailing “Nine Epic Failures of Regulating Cryptography”; Mashable published a nice piece rounding up how encryption won’t stymie investigations, including relevant tweets from various crypto experts and pundits; and in an open letter published by ZDNet, computer scientist David Gewirtz wrote an in-depth explanation to Comey of why U.S. citizens expect privacy.
To boil it down, law enforcement still has a number of ways to investigate and catch pedophiles, kidnappers, rapists and terrorists. And I hope they continue to do so. But it shouldn’t be at the expense of everyone’s privacy and security. Just because I use a privacy-enhancing technology, it doesn’t mean I’m committing a crime or even trying to hide something. There’s simply an inherent, even human, feeling of freedom in knowing that your every move isn’t being tracked.
To put it another way, should law enforcement have a master key to everyone’s locked house or car? Of course not. How is someone’s personal mobile device any different?
Or, to put it yet another way, am I a criminal for using the Tor browser, or Wickr, or for wanting my phone to be locked down? Take, for example, reports that a telecom has allegedly called the Tor browser “illegal” and threatened to cut off Internet service to Tor users.
The old “nothing-to-hide” argument has time and again been disproved. Just ask Justice Antonin Scalia.
And what about law enforcement’s knowledge of basic mobile digital technology? In the past, I’ve reported on how lawmakers and the U.S. Supreme Court have had difficulty understanding digital technology. Why would some in the law enforcement community be any different? It seems, for example, that some could stand to learn the basics of how encryption works. I’m not calling out the entire community—I certainly have a lot to learn myself—but understanding that an encrypted iPhone will not prevent authorities from finding a kidnapper seems pretty basic.
Plus, Gewirtz, in his open letter to Comey, raises, I think, the crux of the issue here: privacy expectations.
It’s not just U.S. citizens who expect privacy; naturally, most people the world ’round do. For companies trying to sell their products based on privacy, this feeds into the bottom line. Want to do business in Germany? How about Brazil? Or China? Ensuring that backdoors and other security vulnerabilities aren’t built in is a necessity. Apple, for example, can now sell the iPhone 6 in China, but only after an exhaustive analysis by the Chinese government of whether the devices had backdoors for U.S. intelligence agencies. And now we see Google, Microsoft and Yahoo, among others, explicitly compete on privacy claims.
Whether you’re a start-up like Prism Skylabs selling your in-store video analytics platform using privacy as a competitive differentiator or you’re tech giant Google finally enabling your three-year-old encryption services by default, businesses and people want these products. Yes, much of it is because of the Snowden disclosures.
As The New York Times reported, we are living in the post-Snowden age, and this gives privacy professionals a huge role.
And though he doesn’t explicitly say anything about law enforcement and encryption, I think this sense of data ethics should extend to appropriate cryptography. Yes, there is a delicate and mercurial balance between government power and individual rights—this has long been a debate for political scientists—but respecting people’s basic privacy expectations and strengthening the overall security and privacy of the digital world should not be considered enabling criminality.
Rather, it is both basic and necessary.