
The Privacy Advisor | An Introduction to Privacy Enhancing Technologies




By Steve Kenny

As increasing amounts of personal data are being held, the risk of breaching data protection legislation and regulation has grown ever greater. At the same time, data protection laws are tightening across the world in response to consumers' and citizens' concerns. As part of a broader information governance strategy, some organisations are making greater use of more automated controls to manage data protection.

More than a decade ago, the Dutch and Ontario Data Protection Authorities recognised the role of technology in protecting privacy and coined the term Privacy Enhancing Technologies (PET). Today, European Data Protection Authorities routinely refer to PET as an approach to help achieve compliance with data protection legislation. Here, Steve Kenny, former PET expert with the European Commission and the Dutch regulator, provides a quick introduction to PETs and how they can contribute to the management of privacy risk.

There is no uniform definition of PET, but the term typically refers to the use of technology to help achieve compliance with data protection legislation. The business case for adopting PETs is frequently not limited to the confidentiality of personal information; for example, many of the technologies referred to as PETs can also protect corporate confidential information and protect revenues by securing the integrity of data. Seven areas of PET can be distinguished:

  1. Encryption. Encryption today is a relatively mature technology, though still in a state of advancement. Encryption supports the security and proportionality principles of data protection law. In the past two years we have seen an increasing trend for regulators to become more prescriptive in their approach to encryption, for example, in the PCI DSS standard for credit card data, and in a recent announcement by the UK data protection regulator. Encryption is relatively simple to implement and can be a highly effective tool.
  2. Metadata and Digital Rights Management. Metadata and Digital Rights Management are far newer technologies than encryption. Metadata is data about data, providing a framework to describe semantics—the meaning of different types of data. Metadata can be very useful for achieving compliance with data protection legislation because it allows one to differentiate between data, personal data and sensitive personal data. Certain data types, such as religious information, can be 'tagged' as sensitive so that different rules of processing are automatically triggered. More advanced applications of metadata revolve around uses of Digital Rights Management (DRM). Originally intended to protect electronic copyright, DRM technology can be adapted to give users a very high level of control over how their personal information is used. When deployed on trusted infrastructure, it provides strong controllability, auditability and transparency ('CAT')—the privacy equivalent of information security's confidentiality, integrity and availability ('CIA')—over the governance of personal information. These technologies particularly support the purpose binding (controlling secondary use) and transparency (informing data subjects of their rights) principles of data protection law. Metadata schemes are relatively simple to implement for structured data sets and can be highly effective. Digital Rights Management technologies are less mature and more complicated to implement, but in terms of effectiveness they offer significant support for privacy.
  3. Application programming. Software packages processing personal data execute rules which determine how personal data is processed. It follows that those rules should uphold data protection regulatory requirements. When the implemented rules are binary, for instance 'opt-in'/'opt-out,' as is typically the case with a permission management system, this is relatively straightforward, conceptually at least, and can be configured by the user organisation. Where rules are 'fuzzy,' a term which can be used to describe evolving privacy jurisprudence, the solution is more involved: unlike a binary yes/no rule, a given rule can effectively take on multiple different values because of its inherent imprecision. Embedding such 'fuzzy' rules into code requires a series of steps: translating a given body of jurisprudence into a formal logic, implementing that logic in a given programming language, and then verifying that the implementation maps to its logical specification through the use of formal methods. These activities typically fall to the software provider rather than the data controller.
  4. System development governance. IT governance is a moderately mature area, and while many organisations include data protection controls in their system development lifecycle frameworks, those controls are often 'check-list' orientated. The inherent weakness of this approach is that, to be effective, legal privacy risks need to be translated into language that makes sense to an IT professional, and the controls to manage those risks need to be technologically prescriptive yet still make sense to a lawyer. These two conditions are rarely met today. The result is that a large amount of non-compliance is designed into applications. Governance over system development can be a highly effective and efficient approach to supporting compliance.
  5. User interface. Many professional user interface designers have only recently started to apply the discipline of 'engineering psychology' in their work. This applies principles of how the human mind constructs reality to the experience a user has of the two-dimensional screen with which they interact. The primary legal requirement compelling the use of engineering psychology is the spirit of the transparency principle, primarily defined in Article 10 of Directive 95/46/EC, the directive covering all personal data processing in EU member states. Engineering psychology influences include font sizes, colours, sequencing of tasks, shapes, spatial movement and use of imagery. Data protection requirements for a given situation (e.g. using an HR system, a CRM system or a social networking site) can be integrated into workflow through the use of engineering psychology. This strongly supports individuals' cognition of their rights.
  6. Identity management. A fundamental principle of data protection is data minimisation. This can be applied to separate authorisation (do you have the right to perform this action?) from identification (who are you?). Encryption can be selectively applied to partition access to particular categories or tables of personal data, keeping separate the different identities users adopt for different spheres of their lives. This tends to empower people with the cryptographic property of 'conditional linkability': users consent to any aggregation of their identities, while cryptography makes it computationally infeasible to deduce one identity from another in some contexts. Identity management tends to support the purpose binding (control over secondary uses), security and proportionality (restricting data usage) principles of data protection law.
  7. Architecture. Architects can have a fundamental effect on data protection compliance. Design decisions on inbound and outbound system interfaces, use of cryptographic acceleration, database design, incorporation of anonymizer front-ends to Web servers and the application of trusted infrastructure are just a few situations where architectural choices can support privacy requirements and expectations. Architecture can become a PET strategy in itself, as it has the potential to support all principles of data protection law.
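The encryption idea in point 1 can be sketched in a few lines. This is an illustrative example only, using the third-party Python 'cryptography' package's Fernet recipe; the record layout is hypothetical, and in practice keys would be held in a key-management system rather than generated in application code.

```python
# Illustrative field-level encryption of personal data (point 1).
# Assumes the third-party 'cryptography' package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in production: fetched from a KMS/HSM
cipher = Fernet(key)

# Hypothetical customer record: encrypt the personal field before storage.
record = {"customer_id": 42, "email": "alice@example.com"}
record["email"] = cipher.encrypt(record["email"].encode())  # ciphertext bytes

# Only holders of the key can recover the plaintext.
plaintext = cipher.decrypt(record["email"]).decode()
print(plaintext)  # alice@example.com
```

Encrypting individual fields rather than whole databases supports the proportionality principle: staff who do not hold the key can still work with the non-personal parts of the record.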
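The metadata tagging described in point 2 can be illustrated with a minimal sketch: each field in a hypothetical schema carries a sensitivity tag, and the processing layer consults the tag so that different rules of processing are triggered automatically. The field names and rules are illustrative assumptions, not a real DRM system.

```python
# Minimal sketch of metadata-driven processing rules (point 2):
# each field carries a sensitivity tag, and processing logic consults
# the tag before deciding how the value may be handled.
from enum import Enum

class Sensitivity(Enum):
    DATA = "data"                      # no reference to an individual
    PERSONAL = "personal"              # identifies a living individual
    SENSITIVE = "sensitive personal"   # e.g. religion, health: special rules

# Hypothetical schema: field name -> sensitivity tag
SCHEMA = {
    "invoice_total": Sensitivity.DATA,
    "email": Sensitivity.PERSONAL,
    "religion": Sensitivity.SENSITIVE,
}

def may_use_for_marketing(field: str, explicit_consent: bool) -> bool:
    """Different rules of processing triggered automatically by the tag."""
    tag = SCHEMA[field]
    if tag is Sensitivity.DATA:
        return True
    if tag is Sensitivity.PERSONAL:
        return explicit_consent
    return False  # sensitive data: never reused for marketing in this sketch

print(may_use_for_marketing("invoice_total", False))  # True
print(may_use_for_marketing("religion", True))        # False
```

The point of the sketch is that the rule lives with the data description, not scattered through application code, which is what makes metadata schemes relatively simple to apply to structured data sets.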
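The binary case in point 3 can also be sketched: an 'opt-in'/'opt-out' flag per processing purpose, configurable by the user organisation. The purposes and the opt-in default shown are illustrative assumptions.

```python
# Sketch of a binary permission-management rule (point 3).
# Each (subject, purpose) pair holds an explicit opt-in/opt-out choice.
permissions: dict = {}  # (subject_id, purpose) -> True/False

def record_choice(subject_id: str, purpose: str, opted_in: bool) -> None:
    permissions[(subject_id, purpose)] = opted_in

def is_permitted(subject_id: str, purpose: str) -> bool:
    # Binary rule: absent an explicit opt-in, processing is refused.
    return permissions.get((subject_id, purpose), False)

record_choice("subject-42", "newsletter", True)
print(is_permitted("subject-42", "newsletter"))  # True
print(is_permitted("subject-42", "profiling"))   # False
```

A 'fuzzy' rule, by contrast, could not be reduced to a single boolean lookup like this, which is why the article assigns that harder problem to the software provider.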
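The separation of identities in point 6 can be sketched with a keyed hash: a different pseudonym is derived for each sphere of a person's life, so records in one sphere cannot be linked to another without the corresponding secret keys. This is a deliberate simplification; the sphere names and keys are illustrative, and real identity-management schemes achieving conditional linkability use dedicated cryptographic protocols rather than a bare HMAC.

```python
# Sketch of per-sphere pseudonyms (point 6): a keyed hash derives a
# different identifier for each sphere of life, so data sets keyed on
# one pseudonym cannot be joined to another without the secret keys.
import hashlib
import hmac

SPHERE_KEYS = {            # one secret key per sphere (illustrative)
    "health": b"key-held-by-health-provider",
    "retail": b"key-held-by-retailer",
}

def pseudonym(identity: str, sphere: str) -> str:
    key = SPHERE_KEYS[sphere]
    return hmac.new(key, identity.encode(), hashlib.sha256).hexdigest()[:16]

p_health = pseudonym("alice@example.com", "health")
p_retail = pseudonym("alice@example.com", "retail")
print(p_health != p_retail)  # True: pseudonyms differ across spheres
```

Linking the two pseudonyms would require both keys, so aggregation of the identities can be made conditional on the user's consent to release them.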

Driving an effective and efficient allocation of responsibilities for privacy requires the CIO to adopt an active custodianship role for data protection controls. While the controls should be owned by the business, they normally have to be executed by IT staff. Organisations need to evolve their approaches to information governance to improve the contribution their IT function makes to managing privacy risk. Orchestrating the seven areas of PET as defined in this article provides the CIO organisation with a technology strategy to deliver PET for business benefit. Market incentives to do this have never been as strong as they are today.

Steve Kenny is KPMG's Privacy Services Leader, responsible for strategy definition and execution. Previously, he was appointed to the European Commission to advise all EU member state data protection commissioners. Contact Steve at

