
Privacy Perspectives | From Cambridge Analytica to GDPR: Enter digital supply chain management


With senators and members of Congress scolding its leader, the press competing for salacious details about data scandals, and regulators launching investigations on three continents, Facebook released its latest earnings report a couple of weeks ago, revealing soaring advertising revenues and an indomitable user base. As CBS MoneyWatch put it, “People keep flocking to Facebook and the social network keeps raking in money by selling them ads.”

Surprised? You shouldn’t be. Privacy experts said all along that, election manipulation aside, little if anything about the Facebook-Cambridge Analytica story was news.

As is often the case in privacy debacles, public perception is key. For several years, Facebook has repeated the mantra that it doesn’t sell users’ data to advertisers or other third parties. While that may be true, it’s also true that Facebook regularly gave users’ data away — for free — to many thousands of app developers, websites and services through its API and social login, and tolerated lawful and unlawful (assuming terms of use are “law”) data collection by every Tom, Dick and Harry. Researchers, data brokers and every national security agency on the planet were active data players on the social network — and to be clear, on other social networks — using data for good, for bad and for everything in between. Users, for the most part, were indifferent, or at least never cared enough to shift their attention elsewhere. Of course, when data collection by Russian trolls hit the front page of The New York Times, there was a stir. But as the earnings reports show, things quickly sprang back to normal.

Tellingly, the epic privacy battles of our age rage over technical minutiae of little or no import to the average internet user. Is encrypted data anonymized or merely pseudonymized (the latter)? Do you need a legal basis to share data with a processor (no)? Do you have to pop a cookie banner before dropping the cookie itself (yes)? Does European law apply to European citizens residing in the U.S. when they visit Europe on their grandmother’s birthday (probably)? These questions and many more will no doubt stimulate the imagination of an entire universe of privacy experts for the decade ahead. To regular people, alas, they will continue to seem like an exercise in Talmudic deliberation.

Indeed, even Max Schrems’ ongoing challenge to cross-border data transfer mechanisms — the stormiest of international privacy debates, occupying European and American policymakers, regulators and courts for years — revolves around an issue that, in the grand scheme of things, is as arcane as the nesting habits of the Yellow Warbler; namely, which transfer mechanism is most appropriate: standard contractual clauses, adequacy decisions or binding corporate rules. If you’re willing to risk it, ask your neighbor how they feel about it, and you’ll get a vacant look. The debate also sometimes resembles Kabuki theater: The main culprit in the Schrems case — the U.S. government — actually has more detailed privacy protections in place than most of its European counterparts. Despite openly conceding as much, European critics continue to claim the moral high ground, even as their own intelligence agencies collaborate with the NSA, advocates rail against government surveillance, regulators threaten to suppress data flows and industry warns against dampening innovation.


For ordinary people — the billions of Apple, Google, Microsoft, Amazon and Facebook users — the much-heralded EU General Data Protection Regulation, too, is unlikely to cause a big stir. New consent banners will appear, users will click through, and the show will go on. Under European law, users have had access and rectification rights for two decades. Austrian law students excepted, their exercise of these rights has been sparse, court cases nonexistent and regulatory oversight lax. But the GDPR can still leave a dent, especially in light of the Cambridge Analytica revelations. It can do so by cementing sound privacy practices in organizations, reinforcing corporate data governance, and putting in place mechanisms for what Nicole Wong has termed “digital supply chain management.”

For many years, organizations in retail and consumer goods have struggled to ensure employee rights and human freedoms in their manufacturing supply chains. Decades ago, after being publicly shamed for labor practices that badly tarnished its brand, image and — eventually — bottom line, Nike responded by focusing executive attention on weeding out child labor, appalling work conditions and other abuses from sweatshops in remote parts of Indonesia and Bangladesh. In the years that followed, international organizations, governments, businesses and nongovernmental organizations worked together to create frameworks and procedures, such as the United Nations Guiding Principles on Business and Human Rights, requiring companies to implement “human rights due diligence” up and down their value chains.

Now, while revealing merely the tip of the data-driven economy’s iceberg, the Cambridge Analytica story brought to the fore the ubiquity of data sharing among digital actors that bear little if any accountability to consumers, regulators or even the platforms they share data with. Researchers who furnish data to trolls, data brokers who sort individuals into categories such as “Bible Lifestyle” or “Diabetes Interest,” and health care providers that maintain lax security practices are just some of the malefactors lurking in the midst of legitimate businesses’ digital supply lines.


Together with the GDPR’s elaborate requirements for transparency, due diligence, risk analyses, documentation and security, this rude awakening should create a perfect storm. It can incentivize organizations to understand, prioritize and deploy digital supply chain management to ensure the sources of their data are wholesome, their vendors resilient and their customers accountable for their actions. It can channel budgets to chief privacy officers, pave their path to senior management and boards, and empower them to weigh in on product engineering and design choices at an early stage.

Consequently, the GDPR rollout won’t play out on the public stage. May 25 will not be a day of reckoning for internet users. The internet won’t break, because no one has the will to break it. Just as consumers continue to have very little insight into the source of the components of their new Nikes or iPhones, individuals will remain indifferent to the data flows in the bowels of the machine. Where the GDPR will be felt is in corporate offices. There, the new law will have a cascading effect. The biggest challenge for organizations won’t be demonstrating accountability to consumers or regulators, but rather to their business partners, the customers and vendors from whom they get data and with whom they share it.

On the heels of the Cambridge Analytica debacle, the GDPR is ushering in a new era of digital supply chain management.

