Flying is, statistically speaking, the safest way to travel. It’s the safest because of the methods the aviation industry uses to develop its tools and procedures, and because of how it investigates and acts upon accidents. This holistic yet extremely precise attitude to safety is evident throughout the whole development, maintenance and operational lifecycle, and it is accomplished through standard operating procedures, training and even the psychology of human behavior, especially relating to how humans make mistakes.
The safety theories and resultant practices developed in aviation have found their way into civil engineering, the nuclear industry, anesthesia and surgery. It’s high time they made their way into privacy as well.
I’ll explain.
If an accident or incident occurs in any of the above industries, it is normally reported in the news as a major event. This is partly a product of sensationalism, but mainly it is because accidents are remarkably rare and thus newsworthy. These industries mandate detailed reporting and investigation, and they must act upon the findings of such investigations.
So then, I'll pose a question: Should privacy be considered a safety-critical discipline?
We as privacy professionals tend to work in a reactive mode. When a privacy incident occurs—a data breach typically—we focus primarily on how to stem and mitigate the breach. Subsequent analysis will generally be limited to the security failures of the hosting system, and we will continue as normal, with the repercussions fading from the news while the chain of events leading to the incident remains woefully underinvestigated.
Rarely, if ever, do we take the approach of the safety-oriented industries and investigate why the privacy incident occurred from the perspective of the communication channels between lawyers and engineers, the employees and the company culture. James Reason, arguably the “father” of modern safety thinking, describes this with his Swiss Cheese Model: Despite multiple layers of protection, you can still find an obscure path of events where the holes in the Swiss cheese line up.
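To make the idea concrete, here is a minimal, purely illustrative sketch in Python. The layer names and failure rates are invented for the example; the point is only that an incident requires every layer of protection to fail at once, so no single layer is ever the whole story:

```python
import random

# Hypothetical layers of protection, each with an invented chance of
# failing on a given occasion (the "hole" in that slice of cheese).
LAYERS = {
    "privacy policy wording": 0.10,
    "legal review": 0.10,
    "engineering requirements": 0.10,
    "code review": 0.10,
    "access controls": 0.10,
}

def incident_occurs() -> bool:
    """An incident only happens when every layer fails at the same time."""
    return all(random.random() < p for p in LAYERS.values())

# Over many trials, incidents are rare -- yet they still occur,
# and when they do, the cause is the alignment of holes, not one layer.
trials = 1_000_000
incidents = sum(incident_occurs() for _ in range(trials))
print(f"{incidents} incidents in {trials:,} trials")
```

The lesson of the model is that blaming whichever layer failed last (usually the engineer or the operator) misses the chain of smaller failures that lined up behind it.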
We must seriously consider our response to any privacy incident. Imagine if we responded the way aviation accident investigators do. We’d have a designated, multidisciplinary investigation team of experts (lawyers, engineers, system administrators, programmers, etc.), be required to produce detailed public reports through a country’s legal system, demonstrate accountability, and apply legally enforced mitigations across the whole industry thereafter.
Now, with that in mind, let’s focus on how aircraft are operated, and in particular on a training video made by American Airlines called Children of the Magenta. Although aircraft are highly automated, in situations where things appear to be going wrong it is better to step down a level of automation and for the pilot to take direct control. If you think this is a little abstract, consider how you use cruise control in your car. Do you let it manage the speed of the car in heavy traffic, or do you switch it off and take direct control?
In the privacy world, much of the time we operate in a highly automated fashion: We produce documents explaining our privacy policies, and we write guidelines on cookie usage and on the effective design of the user interface when presenting such choices to the user.
Such policies and requirements are then given to the compliance, design and engineering teams and developed through code into systems, products and services. This whole process happens somewhat automatically (or even automagically).
When we as privacy professionals investigate a privacy incident of any nature, do we ever consider that it could have stemmed from the whole process of mapping policy into code rather than a failure in one specific step?
When was the last time, post-breach, that the fault was traced to the particular wording of a privacy clause, combined with the management and communication structures within a company, and followed right down to the lines of code involved?
If we feel something isn't right, we must take responsibility, step down our level of automation and get much more involved in how our products are implemented. Indeed, I would even advocate that privacy officers and lawyers spend time sitting down with the very programmers who interpret requirements and policy (after many interactions and transformations) into C, Java, Python or whatever coding language is used.
Make no mistake: regardless of how rigorous the software engineering process is, writing code is extremely hard. Even seemingly unrelated decisions, such as the wording of a clause in the privacy policy or the choice of a management system, can have subtle repercussions that expose our systems to privacy failures.
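As a purely hypothetical illustration (the clause, field names and threshold below are all invented), consider how a single sentence such as “personal data is deleted within 30 days” can be read two different ways, only one of which ends up in the code:

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=30)  # "within 30 days" -- but 30 days from what?

def should_delete(record: dict, now: datetime) -> bool:
    # The engineer read the clause as "30 days after last activity".
    # The lawyer may have meant "30 days after collection".
    # Both are defensible readings of the same sentence; only one is coded.
    return now - record["last_activity"] > RETENTION

now = datetime.now(timezone.utc)
record = {
    "collected_at": now - timedelta(days=90),   # collected three months ago
    "last_activity": now - timedelta(days=5),   # user active last week
}
print(should_delete(record, now))  # False: data kept well past 30 days from collection
```

Neither reading is obviously wrong, which is exactly why any fault found afterward belongs to the end-to-end chain from wording to implementation rather than to the individual programmer.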
When a failure does occur, do we blame the programmer or do we fix the whole system, end-to-end?
In some ways we are acting like the aviation industry in the 1920s and 1930s, in that we are automatically blaming the pilot.
As privacy professionals, we must realize that privacy incidents aren’t the fault of poor engineering, bad security, or overlooked typos in the privacy policy document, but result from a complex interaction of all—and I really mean ALL—parties within an organization.
In fact, the insight that made the big difference in aviation was the understanding that each individual action, regardless of its seeming irrelevance or innocuous nature, is a contributory factor.
As a profession, we might not be ready for this level of scrutiny, but the attitude and culture of the whole privacy discipline, especially now that we are talking so much about privacy engineering, need to develop in exactly this manner.
Aside: When writing my book on privacy engineering, and despite being well versed in software engineering and safety-critical systems, I was constantly shocked by how naïve we are as a profession about how everything from our policies to requirements engineering to risk management to code fits together. However, as a profession we have the benefit of hindsight and the teachings of other industries and disciplines. We ignore these at our peril.