
Privacy Tech | A Safety-Critical Approach to Privacy: Lessons from the Aviation Industry



Flying is, statistically speaking, the safest way to travel. It is the safest because of how the aviation industry develops its tools and procedures, and how it investigates and acts upon accidents. This holistic yet extremely precise attitude to safety is evident throughout the entire development, maintenance and operational lifecycle, accomplished through standard operating procedures, training and even the psychology of human behavior, especially in relation to how humans make mistakes.

The safety theories and resultant practices developed in aviation have found their way into civil engineering, the nuclear industry, anesthesia and surgery. It’s high time they made their way into privacy as well.

I’ll explain.

If an accident or incident occurs in any of the above industries, it is normally reported in the news as a major event. This is partly a product of sensationalism, but mainly it is because accidents are remarkably rare and thus newsworthy. These industries require detailed reporting and investigation, and they must act upon the findings of such investigations.

So then, I'll pose a question: Should privacy be considered a safety-critical aspect?

We as privacy professionals tend to work in a reactive mode. When a privacy incident occurs, typically a data breach, we focus primarily on how to stem and mitigate the breach. Subsequent analysis is generally limited to the security failures of the hosting system, and we carry on as normal: the repercussions fade from the news, while the chain of events leading to the incident remains woefully underinvestigated.

Rarely, if ever, do we take the approach of the safety-oriented industries and investigate why a privacy incident occurred from the perspective of the communication channels between lawyers and engineers, the employees and the company culture. James Reason, arguably the “father” of modern safety thinking, terms this the Swiss Cheese Model: Despite multiple layers of protection, you can still find an obscure path of events where the holes in the Swiss cheese line up.
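The Swiss Cheese Model can be made concrete with a toy calculation: if each defensive layer independently misses a hazard with some small probability, an incident requires every layer to miss at once, so its probability is the product of the per-layer failure rates. The probability is tiny, but never zero. The layer names and failure rates below are invented purely for illustration:

```python
# Toy illustration of the Swiss Cheese Model: an incident occurs only
# when the "holes" in every defensive layer line up at once.
# Layer names and per-layer failure probabilities are hypothetical.
layers = {
    "privacy policy wording": 0.05,
    "requirements review":    0.10,
    "code review":            0.08,
    "security controls":      0.02,
}

# Probability that every layer fails simultaneously, assuming the
# layers fail independently of one another.
p_incident = 1.0
for name, p_fail in layers.items():
    p_incident *= p_fail

print(f"P(all layers fail) = {p_incident:.6f}")  # small, but not zero
```

The point of the model, of course, is that the layers are rarely independent: a confusing policy clause weakens every layer downstream of it at once.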

We must seriously consider our response to any privacy incident. Imagine if we responded the way aviation accident investigators do. We would have a designated, multidisciplinary investigation team of experts (lawyers, engineers, system administrators, programmers, etc.) and be required to produce detailed public reports through a country’s legal system, demonstrate accountability, and apply legally enforced mitigations across the whole industry thereafter.

Now with that in mind, let’s focus on how aircraft are operated and in particular refer to a training video made by American Airlines, called Children of the Magenta. While we have highly automated aircraft, in situations where things appear to be going wrong, it is better to step down a level of automation and for the pilot to take direct control. If you think this is a little abstract, consider how you use cruise control on your car. Do you let it manage the speed of the car in heavy traffic or do you switch it off and take direct control?

In the privacy world, much of the time we operate in a highly automated fashion: We produce documents explaining our privacy policies, and we make guidelines on cookie usage as well as the effective design of the user-interface when presenting such things to the user.

Such policies and requirements are then given to the compliance, design and engineering teams and developed through code into systems, products and services. This whole process happens somewhat automatically (or even automagically).

When we as privacy professionals investigate a privacy incident of any nature, do we ever consider that it could have stemmed from the whole process of mapping policy into code rather than a failure in one specific step?

When was the last time, post-breach, that a fault was traced to the particular wording of a privacy clause, combined with the management and communication structures within a company, right down to the lines of code involved?

If we feel something isn't right, we must take responsibility, step down our level of automation and get much more involved in how our products are implemented. Indeed, I would even advocate that privacy officers and lawyers spend time sitting down with the programmers who, after many interactions and transformations, turn requirements and policy into C, Java, Python or whatever coding language is used.
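To see why that conversation matters, consider how a single sentence of policy ends up as a line of code. The clause, constant and function below are entirely hypothetical, a sketch of the kind of mapping a privacy officer and a programmer would review together:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical policy clause: "Account data is deleted 30 days after
# account closure." The constant below is the direct translation of
# that wording; if the lawyers later change "30 days" to "one month",
# the policy and the code silently diverge unless someone traces the
# clause down to this very line.
RETENTION_AFTER_CLOSURE = timedelta(days=30)

def is_due_for_deletion(closed_at: datetime, now: datetime) -> bool:
    """Return True once the policy's retention window has elapsed."""
    return now - closed_at >= RETENTION_AFTER_CLOSURE

closed = datetime(2015, 1, 1, tzinfo=timezone.utc)
print(is_due_for_deletion(closed, closed + timedelta(days=31)))  # True
print(is_due_for_deletion(closed, closed + timedelta(days=29)))  # False
```

Nothing in this sketch is sophisticated; that is the point. The privacy failure mode lives in the gap between the clause's wording and the constant, not in any individual line.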

Make no mistake, regardless of the whole software engineering process, writing code is extremely hard. Even seemingly unrelated decisions, such as the wording of a clause in the privacy policy or the choice of a management system, can have subtle repercussions that expose our systems to privacy failures.

When a failure does occur, do we blame the programmer or do we fix the whole system, end-to-end?

In some ways we are acting like the aviation industry in the 1920s and 1930s, in that we are automatically blaming the pilot.

As privacy professionals, we must realize that privacy incidents aren’t the fault of poor engineering, bad security, or overlooked typos in the privacy policy document, but result from a complex interaction of all—and I really mean ALL—parties within an organization.

In fact, what made the big difference in aviation was the understanding that each individual action, regardless of its seeming irrelevance or innocuous nature, is a contributory factor.

As a profession, we might not be ready for this level of scrutiny, but this is the attitude and culture we need to foster and develop across the whole privacy discipline, especially now that we talk so much about privacy engineering.

Aside: When writing my book on privacy engineering, despite being versed in software engineering and safety-critical systems, it constantly came as a shock how naïve we are as a profession about how everything from our policies to requirements engineering to risk management to code fits together. However, as a profession we have the benefit of hindsight and the teachings of other industries and disciplines. We ignore these at our peril.



  • Allen • Jul 7, 2015
    Terrific idea, thanks for sharing. Even with an aviation background, I never looked at privacy in the holistic way you describe, but instead limited my look to the privacy or InfoSec controls in place, not the entire organization.
    Great thought.
  • Kwame • Jul 7, 2015
    I love this! Very motivating and inspiring. A great eye-opener too. Thank you for sharing!
  • John • Jul 7, 2015
    Airline safety as an analog for delivering adequate privacy controls has a visceral appeal, but it may not fly.  Keeping airplanes in the sky is a very specific, high-stakes, essential commercial activity that benefits from a tightly integrated supply chain and shared commitment of all players.  Does that sound like the systems that either have been or could be designed or grown to support privacy?
    Think about the privacy supply chain for a minute.  Even those in it can't identify all the others; some don't want to be seen and others, whether software or hardware or services players may no longer exist.  Load a webpage and count the number of companies building the UI and planting code.  Third parties may be recognized at a distance, but what about the fourth, fifth and sixth parties?
    The holistic methodology that leads to an understanding of failure works in an industry with reasonable regulation, consistent practices and high barriers to entry, and achieves great results there.  Again, does that sound like privacy?
    In fact, based only on what we know about the source of data breaches, the biggest problems will always include a healthy portion of, let's call it "passenger error."  How much more do we need to hear about easy and unchanged passwords, delayed patches, lost laptops and social engineering to confirm some of the weakest links in the privacy supply chain?
    Much but not enough is likely said about such things.  I mean, who wants to alienate their customers when there are always other options?  When an airplane crashes, no airline benefits, except future passengers who are safer because of the industry's demand to understand failure.
    Safety is critical, but aviation is not so much a model as it is a measure of how far we have to go.
  • Ian • Jul 21, 2015
    I'm not saying that we become an analog of the aviation industry, but rather we look closely at the lessons in "safety" that have been made through that industry's experiences.
    Indeed the same can be said of many other industries with a strong "safety" component. Note that the definition of safety can be taken in many ways, not just keeping aircraft flying and passengers arriving at their destinations (with or without their baggage).
    I chose aviation because it has become a very visible manifestation of safety practices. Not just in terms of the supply chain, but in terms of management, auditing and basic, everyday practices. Because of aviation the humble checklist is no longer a list of things to do but a very deep process that ties together both technology and human nature.
    Indeed it is this last aspect that I would like to see develop more within the privacy community and our practices: that we should become much more disciplined as an industry/technology/practice in how we approach systems.
    The statement:
    "When an airplane crashes, no airline benefits, except future passengers who are safer because of the industry's demand to understand failure."
    is very illuminating of how we don't work in privacy...when a data breach happens we patch up and hope that the media forgets over time. There isn't any industry-wide change in privacy that prevents the next similar breach. We don't change our software development practices with an eye on prevention, for example.
    Safety is critical, and aviation does show a model - now whether it is *the* model that we wish to follow is debatable, I will agree with you there. In fact, my particular choice of model to follow is more aligned with how safety practices have emerged in areas such as surgery and anaesthesia - a much more "agile" industry with much in common with software and privacy engineering. 
    However it was the learnings from aviation that spurred the direction that safety - in its various forms - took in those areas.
    That, to me, is where we need to look for our lessons.