As businesses across the world have begun adjusting to life under the EU General Data Protection Regulation, an important question continues to crop up: Should encrypted data be treated as personal data? The answer to this question has significant ramifications for the modern e-commerce world.

At its most basic, encryption is a way of protecting the privacy of your data. It actually spans much of human history. Cryptography dates back to the ancient Greeks, who used simple ciphers to encode messages.

Conceptually, modern encryption is not too complicated an idea, although the technology behind it is complex. It works like this: The party doing the encryption runs the data through an algorithm using an “encryption key,” scrambling it so that it appears unintelligible. The recipient uses the corresponding key to make it readable again. The key itself is a long, effectively unique string of data, and without it, the encrypted data cannot be accessed. As long as the encryption scheme is well designed and the key is kept secret, the encrypted data is safe.
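To make the idea concrete, here is a toy sketch in Python. It uses a simple XOR scheme purely for illustration (real systems use vetted ciphers such as AES, and the function name here is invented): the same key that scrambles the data unscrambles it, and without the key the ciphertext is just noise.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte of the data with the key.
    Applying the same key a second time reverses the operation.
    Illustration only -- NOT real cryptography."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The data holder generates a random, effectively unique key and encrypts.
key = secrets.token_bytes(32)
plaintext = b"Alice Example, alice@example.com"
ciphertext = xor_cipher(plaintext, key)

# Without the key, the ciphertext is unintelligible.
assert ciphertext != plaintext

# With the key, the recipient recovers the original data exactly.
assert xor_cipher(ciphertext, key) == plaintext
```

The point the sketch captures is the one that matters for the legal analysis: whoever holds the key can read the data; whoever does not, cannot.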

The GDPR generally follows a binary approach to data in that it’s either personal or it’s not. If data is considered to be personal data, the full weight of the GDPR’s regulatory regime applies to any entity processing that information.

Whether data is considered personal depends on whether it relates to a person who “can be identified, directly or indirectly[.]” The crucial question then becomes: How much effort would a potential data controller have to expend to identify a person so that the data would be considered personal?

The more logical answer is that there is some sort of “reasonableness” limitation, which appears to be the accepted view. This approach lines up with the GDPR itself: Recital 26 directs that account be taken of “all the means reasonably likely to be used” to identify a person.

Where this “reasonableness” line is drawn has huge implications for encryption and, by extension, the future of online commerce. If, under the GDPR, encrypted data is regarded as personal data, subjecting any business that processes it to regulation and potential liability, the growth of the digital economy will be hampered.

Today, the question of how encrypted data would be viewed under the GDPR is an open one.

The GDPR is clearly in favor of encryption. For example, Article 34, Section 3(a), frees data controllers from having to notify affected individuals about a personal data breach if the controller has implemented protection measures, “in particular those that render the personal data unintelligible to any person who is not authorised to access it, such as encryption.”

However, the issue of how encrypted data should be treated gets knotty due to the GDPR’s inclusion of two other concepts: anonymization and pseudonymization.

In short, “anonymized” data is that which has been irreversibly stripped of any way of identifying the underlying individual, even by the organization that did the anonymizing. It is like locking the data up and throwing away the key.

Pseudonymization, on the other hand, involves “the processing of personal data in such a way that the data can no longer be attributed to a specific data subject without the use of additional information.” While it does provide additional safeguards, pseudonymized data, unlike anonymized data, is still unequivocally considered personal data under the GDPR, as noted in Recital 26.

Commentators have often concluded that because encryption is conceptually more similar to pseudonymization, encrypted data would also be considered personal data under the GDPR.

The problem with that approach is that encryption is neither pseudonymization nor anonymization. The fundamental difference is: Who holds the metaphorical key?

In anonymization, no one does. The key is gone. In pseudonymization, the same party who pseudonymizes the data usually does. However, with encryption, many of the parties who are processing the data, such as cloud storage providers, do not have the encryption key to unscramble that data. The encryption key stays with the generator or the end user of the data.
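The three regimes can be sketched side by side. The snippet below is a toy Python illustration (the record, the lookup table, and the XOR “encryption” are all invented for this example and are not real cryptography): pseudonymization keeps a lookup table, anonymization destroys it, and encryption leaves the key with the end user rather than the processor.

```python
import secrets

record = {"name": "Alice Example", "purchase": "laptop"}

# Pseudonymization: swap the identifier for a token, but KEEP the
# lookup table -- the "additional information" that permits
# re-identification. The data therefore remains personal data.
lookup = {}
token = secrets.token_hex(8)
lookup[token] = record["name"]
pseudonymized = {"name": token, "purchase": record["purchase"]}
assert lookup[pseudonymized["name"]] == "Alice Example"

# Anonymization: the same swap, but the table is destroyed.
# The metaphorical key is thrown away; no one can re-identify.
lookup.clear()
assert pseudonymized["name"] not in lookup

# Encryption, as with a cloud provider: the provider stores only
# ciphertext (toy XOR scheme), while the key never leaves the end user.
key = secrets.token_bytes(16)
name_bytes = record["name"].encode()
ciphertext = bytes(b ^ key[i % len(key)] for i, b in enumerate(name_bytes))
stored_by_provider = {"name": ciphertext, "purchase": record["purchase"]}

# The end user, who holds the key, can still recover the name;
# the provider, holding only `stored_by_provider`, cannot.
recovered = bytes(b ^ key[i % len(key)] for i, b in enumerate(ciphertext))
assert recovered.decode() == "Alice Example"
```

In the pseudonymization case the processor itself can reverse the mapping; in the encryption case the processor holds nothing that lets it do so, which is the distinction the argument below turns on.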

This difference is crucial. For all intents and purposes, a cloud provider that hosts encrypted data is not processing personal data. The provider cannot access that data, and even if its servers were breached, data subjects would face little privacy risk because the data would be just as unintelligible to the wrongdoers. The GDPR itself recognizes this concept, albeit in a slightly different context: as noted above, it frees organizations from having to notify data subjects of a breach if the data was encrypted. It therefore also makes sense that a cloud provider should not have to comply with all of the GDPR’s requirements if it is not really processing personal data.

This argument rests, of course, on the assumption that the encryption algorithm is well designed and that the encryption is properly carried out. One possible way to balance data protection against the growth of the digital economy would be for the European Data Protection Board to maintain an up-to-date list of proven encryption technologies. Any personal data encrypted with an EDPB-vetted technology would be considered “not personal” for any party that did not hold the encryption key.

While this is one possible solution, there are certainly others that could be explored. However, one thing is certain. As more of everyday life and business shifts to the digital realm, encryption will continue to grow in importance. If encrypted data is regarded as personal data under the GDPR, thus subjecting any businesses that process the data to regulation and potential liability, it will hamper both the growth of the digital economy and the motivation for companies to encrypt their data. Let’s hope that the supervisory authorities and European Data Protection Board see the light and officially conclude that, when processed by parties that do not have access to the encryption key, encrypted data should not be considered personal data under the GDPR.