
Privacy Perspectives | Data Privacy Day: Reflections on five years in the privacy profession


This Data Privacy Day — Jan. 28 — marks my fifth year in the privacy profession. While five years in a profession hardly makes me seasoned, it’s an eon in technology terms. I have learned much from many patient, kind, and intelligent colleagues, and that started on day one.

I remember much about my first day in the profession. I attended a bootcamp on the then-new requirements under the Children's Online Privacy Protection Act. In the margins of a handout, I noted that participants argued the 2013 amendments to the rules around personally identifiable information and verified parental consent would be the death of kids' apps and mobile innovation. Five years later, this prediction has not borne out, a running theme when it comes to prognosticating the future of privacy and technology. Instead, there seems to have been some palpable relief when the Federal Trade Commission recently clarified that COPPA applies to connected toys, and COPPA compliance has become a badge of honor in an internet of insecure things.

Yet, as the cliche goes, the more things change, the more they stay the same. Looking back over my first five years in privacy (and the admittedly limited horizon of experience that provides me), it is fascinating to see how privacy debates have shifted, even as the substantive issues remain the same. For sentiment's sake, I thought I might spend this Data Privacy Day highlighting the five trends that seem to have stuck with me most.

1. From CPBR to GDPR

So much ink has been spilled on the EU General Data Protection Regulation. But as someone working largely as a U.S.-based privacy advocate, even I have been struck by the degree to which conversations about data have been shaped by the EU's looming comprehensive privacy regime. Five years ago, however, the Obama administration's Consumer Privacy Bill of Rights was in its infancy, launching endless conversations about how to give users more control over their information and how to properly respect the context in which data is collected. Though the CPBR was warmly received, a serious legislative proposal never materialized, derailed first by political paralysis and then by a shift in privacy debates toward corporate facilitation of U.S. surveillance rather than industry's own practices.

In the process, U.S. leadership on commercial privacy has been lost. The Federal Trade Commission would earnestly and sincerely dispute this, but just look at the trajectory of the IAPP's Westin Fellows, from combing FTC cases to analyzing any scrap of data protection guidance out of Brussels. Privacy advocates cautioned that the U.S. would be left behind global data protection trends, and now our outlier status is complete. Instead of creating a privacy baseline, the U.S. Congress even went so far as to repeal rules that would have limited how internet service providers collect and use browsing history. The prevailing rationale wasn't that such information is insensitive or undeserving of protection but rather that everyone from edge providers to data brokers already has treasure troves of data, so why not ISPs, too? Meanwhile, it is worth noting that the GDPR incorporates elements of those dastardly COPPA rules and data breach notification laws first dreamed up on our side of the Atlantic.

2. The shadow of surveillance grows

The sixth annual Privacy Law Scholars Conference was brought to a halt by bombshell revelations about U.S. surveillance practices from Edward Snowden. The passion and energy of that moment made privacy something tangible not just to every American but to every citizen under the watchful eyes of government around the globe. Within a month, metadata entered the lexicon of every privacy professional. The U.S. intelligence community suddenly faced an empowered Privacy and Civil Liberties Oversight Board and even saw its powers curtailed for the first time in decades via passage of the USA FREEDOM Act, in which Congress made bulk collection under the PATRIOT Act illegal. And, of course, the U.S.-EU Safe Harbor agreement was effectively brought down on the back of Edward Snowden's thumb drive. More importantly, major technology companies and platforms were compelled to explain to their customers, and the wider public, that they were not providing governments with direct access to user information. Transparency reporting has become more common, and companies have gone on the offensive against government gag orders and for the ability to disclose surveillance orders.

These are all tremendous positives, but at the same time, shifts in the political landscape suggest the moment for extensive reform has faded. And the United States' worst habits are spreading. Other countries' surveillance laws are as permissive as those of the U.S., with potentially less oversight or transparency. Rather than improving protections for human rights, global surveillance activities only continue to grow. Even major democracies like Germany, France, and the U.K. have enacted laws expanding their surveillance apparatuses with gaps in oversight or procedural protections. At home, the recent reauthorization of Section 702 of the FISA Amendments Act was demoralizing to privacy advocates, who won't get another shot at significant reform of that authority for six more years. And while it shouldn't be relegated to a footnote, the Crypto Wars also continue without end, as strong encryption remains under attack by elements of the U.S. Department of Justice and by authorities in other governments, as well.

3. From big data to automated decisions

Buzzwords and trends track any profession, but in privacy, the evolution of big data into the triple threat of algorithms, artificial intelligence, and automated decision-making has been especially jarring for me. One of my first privacy projects was to pull together a full-day conference on how to bridge the gap between privacy principles and big data; whether it was in the hands of data brokers, big businesses, or governments, there was a tangible fear that big data would serve as a catalyst for new forms of digital discrimination, profiling, tracking, and exclusion. On the other hand, the Obama administration's seminal 2014 big data report declared that big data would transform the way we live and work; the word "automated," by contrast, pops up only eight times in its seventy-nine pages.

Both sides of this coin were never really resolved, but rather than continue to argue over how many "Vs" made up big data — volume, velocity, and variety would be augmented by another dozen or so terms over the years — policy discussions have rapidly moved toward questions about objectivity and bias in automated systems. Whereas de-identification and anonymization technologies were offered as potential solutions to the privacy ills of big data, it is unclear what any of that truly gets us in the context of automated decisions driven by opaque machine learning. At the same time, these technologies have been deployed throughout our societies with less than ideal regard for their efficacy, and this challenge has captured the attention of private initiatives and, increasingly, public officials. The GDPR may even portend a right to meaningful information and explanation of some of how these processes work, but further interrogating these systems will be a primary challenge for privacy professionals moving forward.

4. Transparency is dead, long live transparency!

It would take twenty-five days of non-stop reading each year to get through all of the privacy policies we encounter. That was the conclusion of Lorrie Cranor and Aleecia McDonald six years ago, and the situation has not improved. The FTC's staff report on mobile privacy disclosures, issued in February 2013, suggested a future of privacy symbols and icons, standardized short-form notices, and layered policies. Certainly, some of these measures have been tried, but few can argue that they have become widespread, accepted by regulators, or understood by the public. The internet of things was poised to fundamentally alter traditional conceptions of notice, yet leading IoT self-regulatory frameworks double down on transparency via privacy policy. It's arguable, in light of both the GDPR and well-intentioned legislative efforts to use further disclosure as a nudge to improve corporate privacy practices, that privacy policies have only gotten longer.

I read privacy policies every day; I rarely feel like they answer all of my questions. Companies obfuscate their data-use practices and hide behind privacy notices and controls that avoid some of the macro-level challenges. Privacy advocates have sought further transparency, and lawmakers have responded in kind. It pains me each Data Privacy Day to continue to suggest that users ought to start to understand data collection and use by reading a privacy policy. That was poor advice five years ago, and it is worse today.

5. But what about the death of privacy?

Despite some of my pessimism, the future of the privacy profession looks bright. The IAPP will point to rising membership as a sign that the profession is maturing, but I would point to the caliber of individuals expressing interest in not just big-picture privacy challenges but even something as opaque and confounding as advertising technologies. I lucked into working on these issues. That kind of luck is much harder to come by now.

As a lawyer, I am amazed at the growth in technology and privacy programs in U.S. law schools. Some of this is certainly in response to the job market, but some of it is likely because students find this area interesting, and with good reason: Technology touches every part of our lives and our data goes hand-in-hand with that. We shouldn't look just at the next generation of lawyers interested in privacy work either. I have always considered the Privacy Law Scholars Conference to be the hotbed of up-and-coming thinking on data privacy, and its roster now includes sociologists, economists, and computer scientists, suggesting that privacy has gone far beyond the realm of law. On that note, Carnegie Mellon University continues to churn out remarkable individuals via its privacy engineering program, and the pipeline of young professionals tackling the intersection of privacy and policy is impressive.

Kids don't care about privacy, I was told my first day as a privacy professional. Five years later, that sentiment strikes me as wishful thinking at best, an intentional distortion of the truth at worst. While I dare not try to predict what the next five years will bring, I am tempted to spend this year's Data Privacy Day gritting my teeth, reading a COPPA-compliant privacy policy, and thinking about how far we've come.

photo credit: Diego3336 nVidia G71 GPU via photopin (license)

1 Comment


  • Dale Smith • Jan 26, 2018
    Well said, Joe. Remembering that sentence 1 of Recital 1 of the GDPR reads "The protection of natural persons in relation to the processing of personal data is a fundamental right," maybe 25 May 2018 signals a new bootcamp opportunity for all of us in the United States.