When he vetoed the Vermont Legislature’s recent privacy bill, Gov. Phil Scott, R-Vt., declared that it "created an unnecessary and avoidable level of risk," in large part because it had a "narrow" private right of action. Because he feared that it would make Vermont a "national outlier," Scott returned the bill to the legislature, which had an opportunity to override the veto, but failed to get the two-thirds majority to do so.
In vetoing the bill, Scott was right in one significant respect — Vermont’s privacy bill would have made the Green Mountain State a national outlier. But beyond that, he is deeply wrong, in ways that are bad both for privacy and for trust in business.
The Vermont bill the governor vetoed was vital in several ways. Unlike some recent half-hearted privacy laws, it was committed to data minimization. It would have restricted malicious designs — particularly ones that targeted children. Its most controversial provision was its private right of action, which would have allowed people to sue companies that violated the law, but only after those companies had been notified of the violation and had declined for three months to cure it.
It is perfectly understandable that companies don't want to be sued, but the governor's veto contributes to a dangerous trend in privacy, data protection, and consumer protection law: the idea that the mere existence of lawsuits when there has been a privacy violation is bad for business. Such a claim suggests that companies ought to be above the law, free to use sensitive human information for whatever purposes they desire, entirely without consequence or check. That might be good for business, but it would be terrible for privacy.
Over the past quarter century, companies of all sizes have aggressively sought human information to drive their business practices. From search engines to social media, and from advertising technology companies to old-fashioned "brick-and-mortar" retailers, these companies have used our data for their own purposes with scant regard for the consequences. The legal regime that has governed this aggressive data capture and exploitation has been the familiar "notice and choice" framework, in which "notice" can be little more than a fiction that consumers read vast quantities of dense and vague privacy policies and the "choice" can be no more than the choice of whether to participate in modern life or not. This regime has led to widespread exploitation of consumer data, from Cambridge Analytica to Snowden to dark patterns to data breaches. While consumers and citizens have been exposed and manipulated, companies have profited with practical impunity.
The response to this crisis of human data has been a call for regulation, to bring the excesses of the human information trade within the rule of law. Virtually everyone agrees that notice and choice is both a farce and a failure, with companies, citizens, advocates and civil society all calling for regulation of the human information economy.
But lawmakers' response to this call has been underwhelming, to say the least. While the European Union's ambitious General Data Protection Regulation has certainly set the standard for privacy regulation, the response in the U.S. has been, overall, a failure. The California Consumer Privacy Act represents a good faith effort to bring the human information economy under control, as do Colorado's law and regulations and the Illinois Biometric Information Privacy Act.
However, the remainder of state "comprehensive" privacy laws have done little more than entrench the failed "notice and choice" framework they were supposed to fix. What's worse, these laws not only ratify notice and choice, but they offer "rights" with no way to sue when those rights are violated. Laws like these (particularly the Connecticut law that Gov. Scott praised in his veto statement) are worse than doing nothing — they create the illusion that the state legislature has solved the problem, even when the "solution" does virtually nothing to address the true scope of our exposure. Such laws make a mockery of the rule of law, and the legislators who pass them ought to be ashamed.
In short, the median state privacy law is a failure that entrenches and ratifies the most dangerous business data practices. This is precisely why the Vermont bill was a "national outlier" — because it actually tried to fix the real problem.
The Vermont bill sincerely tried to go beyond the Connecticut "solution" — doing very little and declaring the problem solved — by mandating data minimization and protections against the use of design to manipulate children. Critically, the bill had a narrow but meaningful private right of action. It recognized consumer rights and gave people the right to complain if those rights were being violated. Only if a company received a complaint and then failed to cure the violation within three months could people sue to enforce their rights.
The bill wasn't perfect. For example, it could have imposed a duty of loyalty to stop companies from using people's data to betray them. It could have prohibited abusive trade practices that leverage people’s own limitations against them. The private cause of action was (as the governor noted) quite weak. It was a tempered, political compromise, but it represented a significant improvement on the weak Connecticut model that is worse than doing nothing.
Instead of protecting the people of Vermont, the governor has signaled that he doesn't want to impose safeguards that would interfere with industry's exploitative business models. The next steps are sadly predictable. Lobbyists will encourage lawmakers to rally around a weaker version of a privacy law that will do nothing to meaningfully protect people while giving them a talking point that they "did something" on privacy.
We are at a critical point for state privacy legislation, and our well-being hangs in the balance.
On one side are the wet napkin privacy laws with no substantive rules or meaningful accountability mechanisms. On the other is the more meaningful spate of rules proposed in states like Vermont and Massachusetts and, crucially, passed in Maryland.
These are just the best of the traditional data protection models. Better laws are possible. We deserve more than just window dressing privacy rules. But we will only get them if state lawmakers realize that the privacy rules favored by industry are worse than nothing. Governor Scott's veto has stopped Vermont from being a "national outlier," and stopped privacy protections from getting meaningfully better.
Neil Richards is the Koch Distinguished Professor in Law at Washington University in St. Louis.
Woodrow Hartzog is a Professor of Law at Boston University School of Law.