It’s all grown up; my, how time flies. The Privacy Act turned 40 this year. To celebrate, Georgetown Law’s freshly launched Center on Privacy and Technology last week held a birthday party of sorts. Well, a birthday party where the guest is partly celebrated but mostly roasted by all of its friends and foes and then told how it can improve.
But who doesn’t love a good roast?
Hosted by the center’s executive director, Alvaro Bedoya, the event featured an all-star roster of panelists: some who were on the front lines when President Gerald Ford signed the Privacy Act into law in 1974, and others who have, in the years since, become intimately familiar with both the act’s benefits and its flaws. They discussed how the law came to be and why it was so desperately needed 40 years ago, as well as how its provisions now fail in many ways to protect Americans, thanks to technology, the ascension of big data and some pretty sneaky workarounds created to circumvent the law’s prohibitions.
The all-said-and-done consensus? Changes are needed.
James Rule of UC Berkeley’s School of Law talked about the political climate preceding the act; he recalled privacy protection as a social movement, which grew out of public unrest.
“Things were nasty,” he said, specifically recalling the mid-1950s, when the government was actively hunting those suspected of being associated with the Communist Party and collected lists of individuals, aiming to track their movements. “Just when things couldn’t get any nastier, there was the Nixon administration,” Rule said, alluding to the Watergate scandal. “All of these things involved personal data and led to the rising tide of the mistrust of authorities by all individuals in American life and a tremendous backlash. Privacy then intruded itself onto the public consciousness, even for folks who were not in the least privacy activists or specialists.”
The Privacy Act was intended to respond to the creation of computerized databases. It requires government agencies to show individuals the data stored on them and to follow the Fair Information Practice Principles in collecting and handling that data; it also places restrictions on data sharing among agencies and allows individuals to sue the government for violating these provisions.
But there are exemptions to the rule. For example, agencies need not acquire an individual’s consent for disclosures that qualify as a “routine use,” nor for disclosures made to Congress, to law enforcement or in response to a court order.
EPIC’s Khaliah Barnes said she regularly sees abuses of “routine use” claims. For example, an agency will sometimes claim it’s sharing data with law enforcement under a routine use, but for that claim to hold, law enforcement must first make a formal written request to the agency, and that does not always happen.
Bob Gellman, a privacy consultant who in the '70s served on a government subcommittee charged with overseeing the Privacy Act, said the law is antiquated because the entire concept of having a system of records is obsolete.
Here’s what he means: The law relies on the term “system of records” (SOR) to mandate that agencies announce their databases in the Federal Register for all to read. But an SOR is defined as a database that retrieves information by a personal identifier, such as a name or Social Security number. To get around having to register an SOR, agencies have been known to simply create databases that classify individuals by another identifier, panelists said.
“It’s a meaningless concept today,” Gellman said of SORs. “Admittedly, there’s a problem there.”
EPIC’s Barnes said that while both Congress and agencies now take a very narrow view of what an SOR is, that wasn’t Congress’s intent when it passed the law four decades ago.
“Congress’s intent was that agencies would not be able to ingest records of individuals” and that individuals would have rights when it came to those records, she said, adding that of course we're now “in an environment very different from when the Privacy Act was enacted.”
Jonathan Cantor, deputy chief privacy officer at the Department of Homeland Security, built upon the point about how technology has changed the law's effectiveness: “The regular failing of the law is on the tech side,” Cantor said.
Gellman agreed, recalling that when the act was passed, computers were run “by a high priesthood of specialists.” That model has since changed as databases have become pervasive and algorithms have taken over the tasks humans used to do manually.
Panelists resoundingly agreed that exactly which disclosures should be allowed and which shouldn’t be is a problem that no one has solved. Also drawing consensus: “Routine use” is too broad a term. Abuse of it is pervasive, Gellman said, specifically calling out the Department of Justice and the Inspector General as being the “worst offenders.”
But just because the SOR concept is antiquated doesn’t mean there shouldn’t be a process for maintaining accurate and timely information, Gellman said.
For all of its flaws, the Privacy Act still offers a level of protection that’s worth noting, Cantor said.
“Forcing people to go through that methodology is still an incredibly important exercise,” he said. “It gets people thinking about why they do what they want to do; the purposes for which we’re going to use and collect the data … It forces an analysis, forces people to engage in conversations that would not otherwise happen.”
While some of the law’s provisions don’t always work the way they were intended, having a general rule that there’s “no disclosure without consent” is “extremely helpful,” Cantor said.
He did concede, however, that besides falling short on technology provisions, the act also doesn’t address vendor management, contracted employees or the degree to which agencies are sharing data.
“The silence in that area is deafening,” he said of the latter.
Editor’s Note: Look for part two of this story in tomorrow’s Daily Dashboard for proposed changes to the Privacy Act from the panelists mentioned in the story above as well as Deirdre Mulligan of UC Berkeley, Alan Raul of Sidley Austin, Peter Swire of Georgia Tech Scheller College of Business, Paul Ohm of the University of Colorado Law School and Alvaro Bedoya of Georgetown Law’s Center on Privacy and Technology.