If the phrase "privacy by design" hadn't already been coined for its special purposes, Woodrow Hartzog might well have taken it as the title of his smart new book instead of the rather oblique one it now bears. In "Privacy's Blueprint: The Battle to Control the Design of New Technologies," Hartzog, who is on the faculty of the Northeastern University School of Law and a prolific writer for legal journals, addresses design in the usual sense of the term: how things (products, interfaces, websites) look to those who use them. In an informal, jargon-free style, he argues that elements of design such as buttons, links, the shape of screens, and especially language determine how we interact with the technologies that pervade modern American life, leading us to share information unwittingly and unwisely. Design, he concludes, therefore needs to be subjected to stronger law, regulation, and standards.
To Hartzog, design is not trivial but decisive. It shapes our use of information technologies in ways that inevitably promote or degrade human values: "Design is everywhere; design is power and design is political," he writes. And yet design gets almost no attention in current systems of regulation, which rest on three principles Hartzog considers vague and outdated: follow the Fair Information Practices (the so-called "FIPs," centered on notice and consent, on which the Federal Trade Commission has relied for decades), do not lie, and do not harm. Examples drawn from popular media, legal opinions, and technical journals show how inadequate these guidelines are; they fail utterly to ensure that we make informed decisions about the information we allow to be taken from us. Weakly regulated as it is today, the design of information-hungry technology leads us to accept unreadable privacy policies, to trust companies that share data with who knows what partners and processors, and to accede to default settings that may cause us unpredictable forms of harm months or years in the future.
Hartzog outlines a design agenda that can mitigate these risks. This is his "blueprint." It features interventions on a scale from soft standards to robust laws, all intended to establish the three values he believes are most painfully absent from our interactions online: trust, obscurity, and autonomy. We should be able to trust the digital companies we interact with just as we trust the stores we do business with offline. We should be able to expect that some of our online activities will remain obscure, not easily findable through search engines or other means, granting us the freedom to experiment and play in protected online environments, as we all wish to do at times. And we should be granted autonomy, which requires truly effective, informed control over what data we give up and how it is used. As a writer, Hartzog is at his frequent best when illustrating how these values suffer under a regulatory regime built on the outdated FIPs and inadequate notions of harm.
The book ends with an extended discussion of how the blueprint can reduce the privacy harms inherent in three specific forms of connected life: social networks, surveillance mechanisms (like facial recognition), and the internet of things. The problems posed by these technologies vary, and Hartzog is suitably wide-ranging in his solutions, which are sometimes legal and sometimes technical. To improve social media, for example, he suggests the creation of a "privacy box," a repository for sensitive information that would require anyone wishing to view its contents to accept enhanced confidentiality terms. This extra step would impose a transaction cost that Hartzog likens to a speed bump, slowing the exposure or exploitation of the data contained in the box.
At times the book suffers, or rather the reader suffers, from an excessive explicitness of style. Hartzog tells you what he is about to say, says it, and then tells you what he has just said, a technique that paradoxically makes the argument harder, not easier, to follow. And could anyone reading this book really need its description of how a speed bump works?
The arguments are strong enough to stand without this kind of rhetorical overkill. They will repay the attention of designers, privacy professionals, and anyone who wants to learn how design guided by strengthened laws and regulations might help us emerge from today's swirl of privacy problems.