Privacy Perspectives | Corporate Accountability Is Important, But Consumers Still Need Meaningful Control

As readers of this blog are already well aware, a White House Review Group last week released its long-awaited report on Big Data. The IAPP’s Angelique Carson, CIPP/US, has published an excellent summary of the report’s findings. From my point of view, the report does a very good job of summarizing the challenges that Big Data presents to privacy and makes some strong recommendations.

However, at times the report seems to discount the role of the individual. Certainly, improved corporate accountability is important, but we should not presume that consumers don’t want meaningful control over the collection of their personal information.

Though much of the report is high-level and descriptive, there’s a lot in there for consumers to like. The Review Group calls for the advancement of consumer privacy legislation—recognizing that our existing framework hasn’t adequately protected personal privacy to date—and for long-overdue reform of the Electronic Communications Privacy Act. Calls for intelligence reform, however, are notably absent.

The report also puts an important spotlight on the potential for Big Data to be used to discriminatory and unfair ends. Big Data has the potential to make our lives better, but it also has the potential to reinforce existing power imbalances between individuals and companies. Big Data should not simply mean greater information disparities that companies can leverage to dynamically change prices to extract the highest amounts consumers are willing to spend—or to disadvantage or exclude minority and underserved populations. It will be absolutely essential that consumer and civil rights protections are embedded into privacy protection frameworks to ensure socially just outcomes.

On the other hand, we—and other consumer groups—have expressed concern about the Review Group’s emphasis on regulating data usage rather than data collection and retention, and about its minimization of the role and rights of the individual.

To be fair, the Review Group does not go as far as the President’s Council of Advisors on Science and Technology (PCAST) does in its accompanying report, which argues that collection limitations and controls should be eliminated entirely from the administration’s Commercial Privacy Bill of Rights.

However, the Review Group does seem to endorse an outsized role for corporate accountability and use limitation in privacy protection going forward—perhaps at the expense of transparency, data minimization and personal control.

Certainly, it is fair to say the so-called notice-and-choice framework has not served consumers particularly well.

It’s worth considering, however, that what we call notice and choice today is really neither. Consumers are presented with long, inscrutable privacy policies and are presumed to have assented to them. Arguably, the solution to poor transparency and control mechanisms should not be to make them worse or eliminate them entirely. Rather, we need to look for ways to give consumers better information with which they can make easier privacy choices.

And both reports do, in fact, explore ways to let consumers make persistent, global choices about how their personal information is treated—such as the long-delayed Do-Not-Track process. Companies would then have an obligation to honor these standardized privacy instructions. These are promising ideas and absolutely worth exploring. However, those controls must extend to data collection and retention—not just some as-yet-determined bad uses of our information.

The PCAST report in particular embraces the logical fallacy that because people may be comfortable sharing data with some companies for some purposes, they no longer have a privacy interest in how any of their data gets collected and shared, or by whom. I may have a security camera in my bedroom that stores footage in the cloud—does that mean I have no privacy interest in who can observe and access that data? My phone’s microphone is constantly on in case I say “OK Google” to ask it to do something—does that mean I should necessarily expect that everyone has the ability to record and analyze everything I do and say?

The PCAST report includes a very evocative example of what a world looks like where we have no control over data collection:

Taylor Rodriguez prepares for a short business trip. She packed a bag the night before and put it outside the front door of her home for pickup. No worries that it will be stolen: The camera on the streetlight is watching it; and in any case, it has a tiny RFID tag. Any would-be thief would be tracked and arrested within minutes. Nor is there any need to give explicit instructions to the delivery company, because the cloud knows Taylor’s itinerary and plans; the bag is picked up overnight and will be in Taylor’s destination hotel by the time of her arrival.

Taylor finishes breakfast and steps out the front door. Knowing the schedule, the cloud has provided a self-driving car, waiting at the curb. At the airport, Taylor walks directly to the gate—no need to go through security. Nor are there any formalities at the gate: A twenty-minute “open door” interval is provided for passengers to stroll onto the plane and take their seats (which each sees individually highlighted in his or her wearable optical device). There are no boarding passes and no organized lines. Why bother, when Taylor’s identity (as for everyone else who enters the airport) has been tracked and is known absolutely? When the known information emanations (phone, RFID tags in clothes, facial recognition, gait, emotional state) are known to the cloud, vetted, and essentially unforgeable? When, in the unlikely event that Taylor has become deranged and dangerous, many detectable signs would already have been tracked, detected, and acted on?

Instead, everything Taylor carries has been screened far more effectively than any rushed airport search today. Friendly cameras in every LED lighting fixture in Taylor’s house have watched her dress and pack, as they do every day. Normally these data would be used only by Taylor’s personal digital assistant, perhaps to offer reminders or fashion advice. As a condition of using the airport transit system, however, Taylor has authorized the use of the data for ensuring airport security and public safety.

The PCAST report envisions that citizens in the future will inevitably trade their personal privacy for convenience and security—in this case, letting a future TSA watch us get dressed on the morning of a short business trip.

On the other hand, they might not!

Not everyone trusts their government to be so completely benign. The Review Group report recommends amending some of our outdated government access laws, but they haven’t been fixed yet, and government abuse of commercial databases will always be a concern. Even if you’re not worried about being arrested for what you do in your own home, you still might not feel comfortable with TSA agents joking about your body, even if you can’t articulate a concrete harm from such surveillance.

And we haven’t agreed on what the unfair or discriminatory uses of data might be.

Due to the power imbalances augmented by Big Data, will consumers ever be sure they’re not being discriminated against? Will they be able to bring a legal action to challenge bad data uses, or will we be deemed to have agreed to binding arbitration clauses when we eat a bowl of Cheerios? The Review Group report assumes a completely trustworthy environment, but dubious consumers who have been discriminated against before might prefer to keep some of their activities to themselves.

And if data collection limitations are invalid, so too are data transfer and data publication prohibitions; these have nothing to do with data usage but are just new collections by new people. If all data collection, by everyone, everywhere, is permissible, wiretap laws are suddenly invalid. Revenge porn and peeping tom laws will be impossible. Does data security even matter anymore, or do we expect criminal syndicates to adhere to reasonable use limitations?

Perhaps citizens will eventually embrace this Panopticon, but I suspect that the PCAST report fundamentally mistakes human nature: People want personal spaces where they can feel comfortable that they won’t be observed—or at least that they can control who can observe them.

Fortunately, we’re starting to see commercial trends embrace that reality. Teens are using social networks that expose less personal data. And in response, social networks themselves are adapting to offer users better controls—including over data collection itself.

All of which is not to say that corporate accountability and use limitations don’t need to be strengthened.

On the contrary, I strongly believe that data collection is going to increase dramatically—I just believe there need to be limits and controls—and the greatly expanded data sets that result will need robust internal protections.

Consumers may want to limit data collection in some—or many—circumstances, but privacy professionals will still have a lot of work to do.

Ultimately, the challenges presented by Big Data are going to require both. Stronger corporate privacy management will have to work in tandem with more effective individual control to meet the significant privacy challenges in the era of Big Data. And demonstrating more responsible privacy management will help create an environment of trust to encourage consumers to share their data. Just, maybe, not all of it.

photo credit: paolotrabattoni.it via photopin cc
