Corporate Accountability Is Important, But Consumers Still Need Meaningful Control

As readers of this blog are already well aware, a White House Review Group last week released its long-awaited report on Big Data. The IAPP’s Angelique Carson, CIPP/US, has published an excellent summary of the report’s findings. From my point of view, the report does a very good job of summarizing the challenges to privacy that Big Data presents and makes some very good recommendations.

However, at times the report seems to discount the role of the individual. Certainly, improved corporate accountability is important, but we should not presume that consumers don’t want meaningful control over the collection of their personal information.

Though much of the report is high-level and descriptive, there’s a lot in there for consumers to like. The Review Group calls for the advancement of consumer privacy legislation—recognizing that our existing framework hasn’t adequately protected personal privacy to date—and calls for long-overdue reform of the Electronic Communications Privacy Act. Calls for intelligence reform, however, are notably absent.

The report also puts an important spotlight on the potential for Big Data to be used to discriminatory and unfair ends. Big Data has the potential to make our lives better, but it also has the potential to reinforce existing power imbalances between individuals and companies. Big Data should not simply mean greater information disparities that companies can leverage to dynamically change prices to extract the highest amounts consumers are willing to spend—or to disadvantage or exclude minority and underserved populations. It will be absolutely essential that consumer and civil rights protections are embedded into privacy protection frameworks to ensure socially just outcomes.

On the other hand, we—and other consumer groups—have expressed concern about the Review Group’s emphasis on regulating data usage instead of data collection and retention, and about its minimization of the role and rights of the individual.

To be fair, the Review Group does not go as far as the President’s Council of Advisors on Science and Technology (PCAST) does in its accompanying report, which argues that collection limitations and controls should be eliminated entirely from the administration’s Commercial Privacy Bill of Rights.

However, the Review Group does seem to endorse an outsized role for corporate accountability and use limitation in privacy protection going forward—perhaps at the expense of transparency, data minimization and personal control.

Certainly, it is fair to say the so-called notice-and-choice framework has not served consumers particularly well.

It’s worth considering, however, that what we call notice and choice today is really neither. Consumers are presented with long, inscrutable privacy policies and are presumed to have assented to them. Arguably, the solution to poor transparency and control mechanisms should not be to make them worse or eliminate them entirely. Rather, we need to look for ways to give consumers better information with which they can make easier privacy choices.

And both reports do, in fact, explore ways to let consumers make persistent, global choices about how their personal information is treated—such as the long-delayed Do-Not-Track process. Companies would then have an obligation to honor these standardized privacy instructions. These are promising ideas and absolutely worth exploring. However, those controls must extend to data collection and retention—not just some as-yet-undetermined bad uses of our information.

The PCAST report especially embraces the logical fallacy that because people may be comfortable sharing data with some companies for some purposes, they no longer have a privacy interest in how any of their data gets collected and shared, or by whom. I may have a security camera in my bedroom that stores footage in the cloud—does that mean I have no privacy interest in who can observe and access that data? My phone’s microphone is constantly on in case I say “OK Google” to ask it to do something—does that mean I should necessarily expect that everyone has the ability to record and analyze everything I do and say?

The PCAST report includes a very evocative example of what a world looks like where we have no control over data collection:

Taylor Rodriguez prepares for a short business trip. She packed a bag the night before and put it outside the front door of her home for pickup. No worries that it will be stolen: The camera on the streetlight is watching it; and in any case, it has a tiny RFID tag. Any would-be thief would be tracked and arrested within minutes. Nor is there any need to give explicit instructions to the delivery company, because the cloud knows Taylor’s itinerary and plans; the bag is picked up overnight and will be in Taylor’s destination hotel by the time of her arrival.

Taylor finishes breakfast and steps out the front door. Knowing the schedule, the cloud has provided a self-driving car, waiting at the curb. At the airport, Taylor walks directly to the gate—no need to go through security. Nor were there any formalities at the gate: A twenty-minute “open door” interval is provided for passengers to stroll onto the plane and take their seats (each individually highlighted in his or her wearable optical device). There are no boarding passes and no organized lines. Why bother, when Taylor’s identity (as for everyone else who enters the airport) has been tracked and is known absolutely? When the known information emanations (phone, RFID tags in clothes, facial recognition, gait, emotional state) are known to the cloud, vetted, and essentially unforgeable? When, in the unlikely event that Taylor has become deranged and dangerous, many detectable signs would already have been tracked, detected, and acted on?

Instead, everything Taylor carries has been screened far more effectively than any rushed airport search today. Friendly cameras in every LED lighting fixture in Taylor’s house have watched her dress and pack, as they do every day. Normally these data would be used only by Taylor’s personal digital assistant, perhaps to offer reminders or fashion advice. As a condition of using the airport transit system, however, Taylor has authorized the use of the data for ensuring airport security and public safety.

The PCAST report envisions that citizens in the future will inevitably trade their personal privacy for convenience and security—in this case, letting a future TSA watch us getting dressed the morning of a short business trip.

On the other hand, they might not!

Not everyone trusts their government to be so completely benign. The Review Group report recommends amending some of our outdated government access laws, but they haven’t been fixed yet, and government abuse of commercial databases will always be a concern. Even if you’re not worried about being arrested for what you do in your own home, you still might not feel comfortable with TSA agents joking about your body, even if you can’t articulate a concrete harm from such surveillance.

And we haven’t agreed on what the unfair or discriminatory uses of data might be.

Due to the power imbalances augmented by Big Data, will consumers ever be sure they’re not being discriminated against? Will they have a cause of action to challenge bad data uses, or will we be deemed to have agreed to binding arbitration clauses when we eat a bowl of Cheerios? The Review Group report assumes a completely trustworthy environment, but dubious consumers who have been discriminated against before might prefer to keep some of their activities to themselves.

And if data collection limitations are invalid, so too are data transfer and data publication prohibitions; these have nothing to do with data usage but are just new collections by new people. If all data collection, by everyone, everywhere, is permissible, wiretap laws are suddenly invalid. Revenge porn and peeping tom laws will be impossible. Does data security even matter anymore, or do we expect criminal syndicates to adhere to reasonable use limitations?

Perhaps citizens will eventually embrace this Panopticon, but I suspect that the PCAST report fundamentally mistakes human nature: People want personal spaces where they can feel comfortable that they won’t be observed—or at least that they can control who can observe them.

Fortunately, we’re starting to see commercial trends embrace that reality. Teens are using social networks that expose less personal data. And in response, social networks themselves are adapting to offer users better controls—including over data collection itself.

All of which is not to say that corporate accountability and use limitations don’t need to be strengthened.

On the contrary, I strongly believe that data collection is going to increase dramatically—I just believe there need to be limits and controls—and the greatly expanded data sets that result will need robust internal protections.

Consumers may want to limit data collection in some—or many—circumstances, but privacy professionals will still have a lot of work to do.

Ultimately, the challenges presented by Big Data are going to require both. Stronger corporate privacy management will have to work in tandem with more effective individual control to meet the significant privacy challenges in the era of Big Data. And demonstrating more responsible privacy management will help create an environment of trust to encourage consumers to share their data. Just, maybe, not all of it.

photo credit: paolotrabattoni.it via photopin cc

Written By

Justin Brookman
