
Privacy Perspectives | Engineers and Lawyers in Privacy Protection: Can We All Just Get Along?


In March 2013 we participated in a panel titled “Re-Engineering Privacy Law” at the IAPP Privacy Summit. The topic of the panel closely matches the topic of this book, how to bring together and leverage the skill sets of engineers, lawyers, and others to create effective privacy policy with correspondingly compliant implementations. As a software engineering professor (Antón) and a law professor (Swire), we consider four points: (1) how lawyers make simple things complicated; (2) how engineers make simple things complicated; (3) why it may be reasonable to use the term “reasonable” in privacy rules but not in software specifications; and (4) how to achieve consensus when both lawyers and engineers are in the room.

1. How lawyers make simple things complicated. A first-year law student takes Torts, the study of accident law. A major question in that course is whether the defendant showed “reasonable care.” If not, the defendant is likely to be found liable. Sometimes a defendant has violated a statute or a custom, such as a standard safety precaution. More often, the answer in a lawsuit is whether the jury thinks the defendant acted as a “reasonable person.” The outcome of the lawsuit is whether the defendant has to pay money or not. We all hope that truth triumphs, but the operational question hinges on who can prove what in court.

The legal style is illustrated by the famous Palsgraf case. A man climbs on a train pulling out of the station. The railroad conductor assists the man into the car. In the process, the man drops a package tucked under his arm. It turns out the package contains fireworks, which explode, knocking over some scales at the far end of the platform. The scales topple onto a woman, causing her injury.

From teaching the case, here is the outline of a good law student answer, which would take several pages. The answer would address at least four issues. For each issue, the student would follow IRAC (Issue, Rule, Analysis, Conclusion) form, discussing the issue, the legal rule, the analysis, and the conclusion: (1) Was the man negligent when he climbed on the moving train? (2) When the railroad conductor helped the man up, was the conductor violating a safety statute, thus making his employer, the railroad, liable? (3) When the man dropped the fireworks, was it foreseeable that harm would result? (4) Was the dropping of the package the proximate cause of knocking over the scales? In sum, we seek to determine whether the railroad is liable. The law student would explain why it is a close case; indeed, the actual judges in the case split their decision 4-3.


Engineers design and build things. As such, they seek practical and precise answers. Instead of an IRAC form, engineers seek to apply scientific analytic principles to determine the properties or state of the “system.” The mechanisms of failure in the Palsgraf case would be analyzed in isolation: (1) The train was moving; therefore, the policy of allowing boarding only while the train is stopped was not properly enforced, thereby introducing significant safety risk into the system. (2) The scales were apparently not properly secured, thus a vibration or simple force would have dislodged the scales, introducing safety risk into the system. Is the railroad liable? An engineer would conclude that the compliance violation and the unsecured scales mean the railroad is liable. The engineering professor would congratulate the engineering student for the simple, yet elegant, conclusion based on analysis of isolated components in the system. In engineering, simplicity is the key to elegance.

The lawyer may agree in theory that simplicity is the key to elegance, but law students and lawyers have strong reasons to go into far more detail. The highest score in a law school exam usually spots the greatest number of issues; it analyzes the one or two key issues, but also creates a research plan for the lawyers litigating the case. For example, the railroad has a safety rule that says the conductor shouldn’t help a passenger board when the train is moving, but surely there are exceptions? In the actual case (or the law school exam), the lawyer would likely analyze what those exceptions might be, especially because finding an applicable exception will free the railroad from liability. The good exam answer may also compare the strange chain of events in Palsgraf to other leading cases, in order to assess whether the plaintiff can meet her burden for satisfying the difficult-to-define standard for showing proximate cause.

In short, lawyers are trained to take the relatively simple set of facts in Palsgraf and write a complex, issue-by-issue analysis of all the considerations that may be relevant to deciding the case. The complexity becomes even greater because the lawyer is not seeking to find the “correct” answer based on scientific principles; instead, the lawyer needs to prepare for the jury or judge, and find ways, if possible, to convince even skeptical decision-makers that the client’s position should win.

2. How engineers make simple things complicated. A typical compliance task is that our company has to comply with a new privacy rule. For lawyers, this basically means applying the Fair Information Practice Principles (FIPPs), such as notice, choice, access, security, and accountability. The law is pretty simple.

The engineer response is: How do we specify these rules so that they can be implemented in code? Stage one: specify the basic privacy principles (FIPPs). Stage two: specify commitments expressed in the company privacy notice. Stage three: specify functional and nonfunctional requirements to support business processes, user interactions, data transforms and transfers, security and privacy requirements, as well as corresponding system tests.
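To make the contrast concrete, here is a minimal sketch of how a single stage-two commitment from a privacy notice might be turned into a stage-three requirement with executable tests. The recipient names, requirement ID, and function are our own illustrations, not taken from any real system:

```python
# Hypothetical sketch: the notice commitment "we share personal data only with
# processors named in the notice" expressed as a functional requirement plus tests.
# The recipient names and requirement ID below are illustrative only.

APPROVED_RECIPIENTS = {"payment_processor", "shipping_partner"}  # drawn from the privacy notice

def may_transfer(recipient: str) -> bool:
    """REQ-NOTICE-07 (hypothetical): personal data may be transferred only to
    recipients explicitly named in the published privacy notice."""
    return recipient in APPROVED_RECIPIENTS

def test_unlisted_recipient_is_blocked():
    assert not may_transfer("ad_network")

def test_listed_recipient_is_allowed():
    assert may_transfer("payment_processor")
```

Even this toy version shows the engineering pattern: the legal commitment becomes a named requirement, and the requirement becomes something a test suite can pass or fail.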


As an example, some privacy laws have a data minimization requirement. Giving operational meaning to “data minimization,” however, is a challenging engineering task, requiring system-by-system and field-by-field knowledge of which data are or are not needed for the organization’s purposes. Stuart Shapiro notes that an implementation of data minimization in a system may have 50 requirements and 100 associated tests. Input to the system is permitted only for predetermined data elements. When the system queries an external database, queries are permitted only against the approved data fields. And there must be executable tests: apply them to test data first, then confirm that data minimization is achieved under various scenarios.

For the lawyer, it is simple to say “data minimization.” For the engineer, those two words are the beginning of a very complex process.
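A minimal sketch, assuming an allowlist-style control and invented field names, of what one of those many requirements and tests might look like:

```python
import pytest  # assumes the tests run under pytest

# Hypothetical data-minimization control: input is accepted only for
# predetermined data elements, and executable tests confirm that anything
# outside the approved fields is rejected. Field names are invented.

ALLOWED_FIELDS = {"patient_id", "visit_date", "diagnosis_code"}

def accept_record(record: dict) -> dict:
    """Reject records carrying fields the system has no approved purpose for."""
    unexpected = set(record) - ALLOWED_FIELDS
    if unexpected:
        raise ValueError(f"fields not permitted by the data model: {sorted(unexpected)}")
    return dict(record)

def test_unapproved_field_is_rejected():
    with pytest.raises(ValueError):
        accept_record({"patient_id": "123", "ssn": "000-00-0000"})

def test_approved_fields_pass_through():
    assert accept_record({"patient_id": "123"}) == {"patient_id": "123"}
```

Multiply this by every input path, every external query, and every downstream use, and Shapiro's 50 requirements and 100 tests start to look conservative.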

3. Why it may be reasonable to use the term “reasonable” in privacy rules. Swire was involved in the drafting of the HIPAA medical privacy rule in 1999–2000. Antón, the engineer, has long chastised Swire for letting the word “reasonable” appear over 30 times in the regulation. Words such as “promptly” and “reasonable” are far too ambiguous for engineers to implement. For example, consider HIPAA §164.530(i)(3): “the covered entity must promptly document and implement the revised policy or procedure.” Engineers can’t test for “promptly.” They can, however, test for 24 hours, 1 second, or 5 milliseconds. As for reasonable, the rule requires “reasonable and appropriate security measures”; “reasonable and appropriate policies and procedures” for documentation; “reasonable efforts to limit” collection and use “to the minimum necessary”; a “reasonable belief” before releasing records relating to domestic violence; and “reasonable steps to cure the breach” by a business associate.
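The testability gap can be shown in a few lines. In the sketch below, the 24-hour deadline is our own assumption for illustration; it is not a value the HIPAA rule supplies, which is exactly the engineer's complaint:

```python
from datetime import datetime, timedelta

# "Promptly" cannot be asserted in a test; a concrete deadline can. The 24-hour
# figure below is an assumption made for this sketch, not a value from the rule.
DOCUMENTATION_DEADLINE = timedelta(hours=24)

def documented_in_time(policy_revised_at: datetime, documented_at: datetime) -> bool:
    """Testable only because a number has been substituted for 'promptly'."""
    return documented_at - policy_revised_at <= DOCUMENTATION_DEADLINE

def test_documentation_deadline():
    revised = datetime(2024, 1, 1, 9, 0)
    assert documented_in_time(revised, revised + timedelta(hours=23))
    assert not documented_in_time(revised, revised + timedelta(hours=25))
```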


The engineer’s critique is: How do you code for “promptly” and “reasonable”? The lawyer’s answer is that the HIPAA rule went more than a decade before being updated for the first time, so the rule has to apply to changing circumstances. The rule is supposed to be technology neutral, so drafting detailed technical specs is a bad idea even though that’s exactly what engineers are expected to do to develop HIPAA-compliant systems. There are many use cases and business models in a rule that covers almost 20% of the US economy. Over time, the Department of Health and Human Services can issue FAQs and guidance, as needed. If the rule were more specific, it would produce the wrong results for at least some of those use cases as technology and practice change. In short, lawyers believe there is no better alternative in the privacy rule to saying “reasonable.”

The engineer remains frustrated by the term “reasonable,” yet accepts that the term is intentionally ambiguous because it is for the courts to decide what is deemed reasonable. If the rule is too ambiguous, however, it will be inconsistently applied, and the organization risks legal sanctions when the systems its engineers develop are not deemed HIPAA compliant. In addition, “promptly” is an unintentional ambiguity that could have been prevented in the crafting of the rule. By allowing engineers in the room with the lawyers as they decide the rules that will govern the systems the engineers must develop, we can avoid a lot of headaches down the road.

4. How to achieve happiness when both lawyers and engineers are in the same room. Organizations today need to have both lawyers and engineers involved in privacy compliance efforts. An increasing number of laws, regulations, and cases, often coming from numerous states and countries, place requirements on companies. Lawyers are needed to interpret these requirements. Engineers are needed to build the systems.

Despite their differences, lawyers and engineers share important similarities. They both are very analytic. They both can drill down and get enormously detailed in order to get the product just right. And, each is glad when the other gets to do those details. Most engineers would hate to write a 50-page brief. Most lawyers can’t even imagine specifying 50 engineering requirements and running 100 associated tests.


The output of engineering and legal work turns out to be different. Engineers build things. They build systems that work. They seek the right answer. Their results are testable. Most of all, it “works” if it runs according to spec. By contrast, lawyers build arguments. They use a lot of words; “brief” is a one-word oxymoron. Lawyers are trained in the adversary system, where other lawyers are trying to defeat them in court or get a different legislative or regulatory outcome. For lawyers, it “works” if our lawyers beat their lawyers.

Given these differences, companies and agencies typically need a team. To comply, you need lawyers and engineers, and it helps to become aware of how to create answers that count for both the lawyers and the engineers. To strike an optimistic note, in privacy compliance the legal and engineering systems come together. Your own work improves if you become bilingual, if you can understand what counts as an answer for the different professions.

We look forward to trying to find an answer about how to achieve happiness when both lawyers and engineers are in the room. Antón presumably is seeking a testable result. Swire presumably will settle for simply persuading those involved. However, we both agree that the best results come from collaboration because of the value, knowledge, and expertise that both stakeholder groups bring to the table.

6 Comments


  • comment Yves Le Roux • Jan 14, 2014
    You may be interested to have a look at my IAPP Europe 2013 presentation entitled "Disruptive Technologies: Can the Legislators Keep Up?", which is available at https://www.privacyassociation.org/media/presentations/13DPC/DPC13_Disruptive_Technologies_PPT.pdf
  • comment Estella Cohen • Jan 15, 2014
    Excellent and timely article by Peter Swire and Annie Antón ... Couldn't agree more about getting the best results that come from both lawyers and engineers collaborating in privacy compliance efforts. They can no longer operate in silos. I am pleased to let IAPP folks know that Dr. Ann Cavoukian, Information and Privacy Commissioner of Ontario, will soon be releasing a new white paper, entitled “Privacy Engineering: Proactively Embedding Privacy, by Design.” Written with co-authors Jason Cronk and Stuart Shapiro, this paper surveys the emerging discipline of privacy engineering. Privacy engineers require multidisciplinary knowledge and skills. To be effective, they need to have an understanding of both technical and non-technical considerations. Privacy engineers are tasked with managing risks. The paper reviews several risk models that they can adopt, some emphasizing Fair Information Practice Principles and legal compliance, others focusing on harms and contextual integrity. Privacy engineers must then apply systematic risk analyses, using tools such as privacy impact assessments, to measure and quantify identified risks. Finally, privacy engineers must design controls to mitigate those risks, including privacy-respecting architectures, effective privacy policies, and a range of data management methods including minimization, anonymization, aggregation, and the use of privacy-enhancing technologies.
    
  • comment Eric Lachaud • Jan 17, 2014
    I would just add, to complete your interesting analysis, that IT projects normally include people specifically in charge of translating business needs into technical rules. Involving these business analysts, if they are trained to understand the challenges of privacy regulation, could improve and accelerate the process.
  • comment Jim Adler • Jan 17, 2014
    Exactly right. Geeks, suits, and wonks must work together -- a theme I've discussed in my Usenix talk here: http://jimadler.me/post/61023316747/video-my-invited-talk-at-the-usenix-security-symposium.
  • comment Ian Oliver • Jan 21, 2014
    A bit of a long reply, apologies, but yes, engineers and lawyers HAVE to communicate - not just talk.
    
    Probably one of the best articles I've seen on the apparent dichotomy between privacy lawyers and the engineers who must build the systems. Actually I'd add a third group, privacy advocates, who tend to side with the lawyers and believe that the engineers are there to do their bidding.
    But let's take a specific example as mentioned in the text, that of data minimisation. Actually it turns out that [software] engineers rarely gather too much data - it really isn't in our nature to overcomplicate the already complicated process of building information systems. Indeed one of the ways to upset engineers is to repeatedly tell them not to collect data they aren't collecting in the first place. Each new data point usually involves additional validation, verification, tests and obviously more code. More code => more bugs => more tests => more time => more expense etc...
    But we also see other problems, such as the emergence of the privacy cabal - a group of predominantly lawyers and advocates who do not understand the discipline and complexities of engineering. Again the article above explains this quite well. Indeed Jim Adler in his PII2012 talk called this the Privacy-Industrial Complex: a mechanism for churning out policies, guidelines, and edicts on the topic of privacy without any basis in engineering reality. The engineer's role is reduced to that of a mere bystander, a group of people to be handed orders from the cabal and privacy priesthood on high.
    When things go wrong, it is invariably the engineers' fault, while the priests of privacy claim that their policies conform to best practice and take solace in the Privacy by Design Commandments. Take Target for example: their POS system collected only necessary data - credit card numbers - was this an example of applying Fair Information Practice Principles? Is collecting and processing credit card numbers necessary for processing credit card numbers? Do our privacy principles accurately capture subtle requirements such as caching data in memory, the types of encryption algorithm required, the human-computer interface, etc.? Aside: OK, Target failed in many respects, and the results of that investigation remain to be seen.
    Requirements such as don't collect PII ... is an IP address PII? How do I stop collecting this when the very protocols of the internet require such addresses, just as the postal system relies upon physical addresses?
    The position of the emerging privacy engineer will become one of the most important positions in the field of privacy. A group of people who understand the fundamentals of information systems, from their engineering to their mathematical foundations, is critically required. Ignore these foundations and privacy becomes a hand-waving, PowerPoint-generating inconvenience to be humored and tolerated rather than an integral part of the business ecosystem.
    When lawyers and engineers work together AS EQUALS we get some truly AMAZING work done. It happens rarely, and it takes a lot of work from both sides just to build a common framework of understanding. Engineers have the ability to properly deconstruct and understand the finer workings of privacy compliance - ignore this and privacy will remain what we described above: a tolerated inconvenience.
    
  • comment Debra • May 15, 2015
    Great article! Most privacy issues are related to how personal data are collected and used across an organization's business processes (rather than systems and networks - which involve security and IT). Therefore, I think there is huge value in a role that is not mentioned in this article - Privacy Operations - the person who translates legal requirements into business functional requirements and works with Product and Engineering to develop product requirements and acceptance testing criteria. In many organizations, I think that the Chief Privacy Officer should focus on operationalizing privacy, as opposed to traditional attorney responsibilities. The CPO should work with Privacy Counsel, business managers, product managers, and engineers to translate requirements.