The Privacy Law Scholars Conference held its ninth annual gathering in Washington at the beginning of this month, bringing together academics and practitioners to present papers that are still in development. The workshop environment is a closed circuit – no tweeting or blogging about what happens there is allowed, and papers may or may not ever be published.
However, papers and ideas inevitably rise to the top, and the IAPP recognizes two of those with its annual IAPP Papers Award, voted on by attendees. This year brought both repeat winners and ideas that build on earlier work, underscoring how the conference fosters communal, iterative scholarship.
Much of Kim's past work has focused on employment law, and she brings that background to privacy by looking at the issue of discrimination. “The whole use of big data in employment has raised some concerns from privacy folks,” she said, “considering the sensitivity of the information … There are those people who write about discrimination and those who write about privacy, and those two things are really coming together. They need to start talking to each other more and more. This paper is the start of that conversation in the employment context.”
Building on the work Barocas and Selbst did to demonstrate the way that algorithms can discriminate, Kim looked at “why this use of data is so different from other uses of data.” Essentially, traditional data analysis focuses on making sure variables are chosen correctly, and parsimonious models are valued. “The use of data mining is the opposite of that,” she said. “Parsimony is not valued; we don’t have a theory, we just see what comes out of it.” The result can be some ambiguity about correlation and causation.
In employment decisions, this can be a major discriminatory issue, as it’s impossible to see how those not selected would have done the job.
In the second half of the paper, Kim looks at whether Title VII of the Civil Rights Act of 1964 is able to handle this discrimination issue, as it is normally applied to employment rights. “There’s a provision in Title VII,” she noted, “that’s been largely ignored and speaks in terms of employers not limiting, segregating or classifying in ways that tend to deny opportunities … I think that could be developed to address algorithms and I start to spin that out and look at what it would mean to use that language to develop a coherent theory, rather than take direct doctrine that was developed in the context of industrial psychology and workplace testing.”
There is clearly a large difference between a test administered to a potential employee that evaluates their effectiveness for a job and an algorithm that might determine that people with certain browsing habits online are better fits than others.
“Employers cannot defend an algorithm just by saying it’s job-related,” Kim argued. “That might have made sense when there was a skills test. That doesn’t make sense when it’s just correlation spit out by a machine.”
In fact, if you look to history, you’ll see that it was a group of 10 attorneys general who first went to Silicon Valley in 1998 to talk to companies about not having privacy policies. “When the FTC was arguing self-regulation,” Citron noted. “The state AGs said, ‘no, no, we think it’s an unfair practice.’”
Former Assistant AG of California Susan Henrichsen recalled for Citron in an interview that the AGs were hoping a consultative approach would get the burgeoning internet sector to respond to AG fears of deception and lack of transparency.
“And Susan said the companies said, ‘Come on, people trust us! They know that they can trust us,’” said Citron. “So those negotiations were a failure … and so in 1998 the state AGs brought enforcement actions against DoubleClick and others, arguing that if you don’t tell people what you’re doing, that’s unfair and deceptive, and that’s the first time we started talking privacy notices in an enforcement way.”
Through an extensive series of interviews and freedom-of-information requests, Citron has built in her paper an argument for “ways state attorneys general can function more effectively through informal and formal proceedings,” while also urging “state enforcers to act more boldly in the face of certain shadowy data practices.” Along the way, she identifies the ways that states team up to build on each other’s strengths and bodies of knowledge, and documents their various enforcement styles.
All of which serves to shine a spotlight on the AGs’ role in the bigger picture of enforcement and policy-making. Everyone focuses on the FTC’s role as the enforcer of consumer privacy in the United States, said Citron, “and they are the mother ship now in many ways, but the state AGs have been important leaders and partners and we sort of forget them.”
Look for Kim’s work to be presented at the Privacy. Security. Risk conference in San Jose this fall, and for Citron’s work to appear at the Global Privacy Summit in Washington this coming spring.