The United States Senate Commerce Committee, as part of a series of public hearings it is holding on privacy, heard the call for a new national data protection and privacy law from industry a few weeks ago. It heard it again more recently from privacy advocates.
The conversation now is about the shape that law should take.
To inform that goal, the committee’s hearing with consumer advocates examined lessons learned from the EU General Data Protection Regulation and the new California Consumer Privacy Act of 2018. These laws constitute a sincere effort on the part of policymakers to empower consumers and update law for the digital age. We support that effort and welcome the conversation, hard thinking, and debate that these new laws have engendered around the world. It is a good and just cause.
Respectfully, however, we believe that by building on those laws, iterating on those ideas, taking what's good and redesigning the rest, Congress can do better by consumers and the digital economy. New law should ensure that the consumer has a right to fair treatment and legal protection from unreasonable data practices. Consumers should know and control who in the ecosystem gets access to their data, how much data those actors hold, and how they use and distribute it. Law and self-regulation should not leave consumers to their own devices in a complex marketplace for data. What the GDPR and CCPA have gotten right is the need to place the consumer at the center of the digital ecosystem. Now we need legislators to give the Federal Trade Commission the power and resources to enforce consumer rights in the digital age.
But both the GDPR and CCPA have in their construct either left some problematic ambiguity or built some rules on mistaken assumptions that Congress should consider and correct in the construction of new U.S. federal legislation.
The free-rider challenge
In the GDPR and CCPA, there is a central question for the providers of advertising-supported services as to what they can or should do if a consumer chooses to reject behavioral advertising. Both the GDPR and CCPA provide that a consumer who opposes interest-based advertising should suffer no penalty as a result of that choice. The outstanding questions are what constitutes a penalty and whether the consumer should be allowed to use the service without payment.
The CCPA posits that a service provider cannot deny service on the basis of a consent choice, but that it can make up the lost monetization through some other form of compensation. Interpretation of the GDPR and the drafting of Europe's ePrivacy Regulation have not yet made the European definition of a penalty clear.
The CCPA's concept is good in the sense that it recognizes that providing services is not free and that service providers have a right to require some form of compensation if they cannot monetize through advertising. But by stating that the service provider can charge only an amount equal to the value of the lost data, the CCPA creates a form of rate regulation that will be hard to understand, quantify and police. Further, unless a service is a necessary utility, the service provider should not be forced to provide it to anyone. While access to the internet may be considered a utility, a necessity or a human right, it does not follow that access to every service made available over the internet falls into the same category.
First versus third parties
First-party collectors of people’s information are the companies and websites that consumers deal with directly. Third-party collectors are those that acquire information from someone other than the consumer.
Neither first nor third parties are inherently better at protecting consumer privacy due to where they sit in relation to the consumer. There are good and bad first-party stewards of data; there are good and bad third-party stewards of data. What matters most in the commercial stewardship of personal data is not whether the consumer knows the commercial actor but rather what data the actor collects, the context in which it was collected, the sensitivity of the data, how it is used and secured, and to whom it is disclosed and for what purposes.
The California law establishes an opt-out mechanism that lets consumers deny companies the ability to sell their data. But as Facebook CEO Mark Zuckerberg likes to point out, Facebook doesn't sell user data; it sells advertisers access to user audiences. The California law is not clear about whether it covers that circumstance and, if it does not, why not.
Favoring first parties gives large, walled gardens unnecessary and unreasonable advantages over an open ecosystem. That is bad for consumers, competition and privacy.
Risk of treating pseudonymous information the same as traditional PII
Pseudonymous identifiers allow an observer to recognize a device or consumer without identifying who that person actually is. Pseudonymity does not prevent all misuse or abuse; a content creator intent on manipulating or harming a specific audience can still target it. But it does reduce risk, and law and regulation should recognize and value it as such.
To understand and respect consumers, market actors need a mechanism for recognizing them. Enabling pseudonymous identification of a unique user across websites and devices is likely to lead to better services and marketing for that consumer than a series of walled-off and siloed buckets of data within first-party websites or platforms with traditional personally identifiable information such as the names, addresses and phone numbers of those consumers.
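To make the distinction concrete (this is an illustrative sketch, not a description of any particular vendor's method), a pseudonymous identifier can be built as a keyed hash of a direct identifier: the resulting token is stable enough to recognize a returning consumer across touchpoints, but on its face reveals nothing about who that person is.

```python
import hashlib
import hmac


def pseudonymous_id(email: str, secret_key: bytes) -> str:
    """Derive a stable pseudonymous token from a direct identifier.

    The same input always maps to the same token, so a returning
    consumer can be recognized, but the token cannot be reversed to
    recover the underlying email without the secret key.
    """
    normalized = email.strip().lower().encode("utf-8")
    digest = hmac.new(secret_key, normalized, hashlib.sha256).hexdigest()
    return digest[:32]  # truncated for brevity; still effectively unique


# Hypothetical key for illustration; in practice it would be
# protected and rotated, and never shipped in source code.
key = b"example-secret-key"

token_a = pseudonymous_id("Alice@Example.com", key)
token_b = pseudonymous_id("  alice@example.com", key)

assert token_a == token_b       # stable across touchpoints
assert "alice" not in token_a   # the token itself exposes no PII
```

The design point the article is making falls out of the sketch: whoever holds only the token can measure and frequency-cap advertising to a recognized consumer, while the mapping back to a named person stays with whoever holds the key.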
If the law discourages or prohibits advertisers from recognizing a consumer through a pseudonymous identifier across unaffiliated or commonly owned sites, then only the largest walled gardens will know those consumers. They will know them intimately, and more so than advertising requires. That data advantage will feed itself, creating ever-growing silos of PII that, if breached or manipulated, could do great harm. Further, with data siloed from one walled garden to another and across unaffiliated sites, advertisers will repeatedly hit the same consumer with the same ad because they cannot manage messaging and campaigns across properties. And the advertiser, as a buyer of media on which to reach consumers, will be locked in to two or three providers, making for an inefficient, opaque, uncompetitive, less innovative and less effective marketing environment, ultimately one less responsive to consumer needs than it could be.
The CCPA has a very expansive definition of personal information that covers pseudonymous identifiers as well as traditional PII. The incentive that creates is for firms to skip the work of creating pseudonymous identifiers and instead collect truly personally identifiable information and distribute it broadly. It is well understood that pseudonymous information can sometimes be traced back to a known person, but that does not mean it has no value from a privacy perspective; using it to identify a specific person can be made illegal, and it should be.
Allowing for flexibility and innovation
Lastly, the GDPR and CCPA may not create enough flexibility to allow exceptions or experimentation with new technologies that serve the public interest by enabling more efficient and effective use of data, making products and services more affordable, or creating new areas of work.
Artificial intelligence, for instance, allows machines to make automated decisions using data. A requirement to explain every machine-made decision may make this technology difficult to use. It is unclear whether the GDPR will limit the development of AI in Europe; implementation will tell. We do not believe that was the intent of the law, and it is reasonable to require an explanation for AI decisions that carry particularly serious financial repercussions or could result in some form of tangible harm.
Internet-of-things devices, many of which will have no screen or other mechanism to convey notice and gather consent, could face similar challenges. For experiments with new technologies that may require waiving the right to notice and consent or the right to deletion of information held, there should be some mechanism for petitioning to use those technologies on public-interest or legitimate-interest grounds, as the GDPR may allow; we have not yet seen specific decisions to that effect.
Moving forward
At MediaMath, we support the intent and spirit of the new laws that policymakers are pursuing. Though we have listed critiques of aspects of the GDPR and CCPA, we respect the work behind those laws and will work cooperatively with policymakers in those jurisdictions on implementation. It may very well be that our concerns can be addressed within that process. Further, we believe that privacy is a fundamental right. Our request to legislators and regulators is to ensure that consumer interests are well served in a healthy, data-driven marketplace, and that those interests are considered alongside mechanisms that give consumers greater control, agency and protection in the stewardship of their digital identity throughout the ecosystem.