Impediments to Data Transfer
The possibility of foreign government access to personal information is often cited as a concern with respect to the international transfer of data. Even Canadian governments were not immune to these concerns, notwithstanding the close trading and political relationship between Canada and the United States. In an early response to what was considered overbroad government access, the province of British Columbia adopted amendments to its Freedom of Information and Protection of Privacy Act (BC FIPPA) in 2004 prohibiting public bodies from storing personal information outside of Canada or allowing access to that information outside of Canada, with very limited exceptions. Nova Scotia followed suit with the Personal Information International Disclosure Protection Act. A similar legislative provision exists with respect to personal health information in New Brunswick in its Personal Health Information Privacy and Access Act. Even in the absence of a statutory restriction, it is not uncommon for public and private sector organizations in Canada to have concerns regarding transfers of personal information to the United States because of the perception of inadequate protections from government access.
Formal statutory prohibitions and informal objections to the transfer of personal information to the United States pose a significant barrier to organizations taking advantage of cloud-based services. Recently, Elizabeth Denham, the information and privacy commissioner of British Columbia, released a very thoughtful analysis of “tokenization” technology that may provide a path forward. In the same report, Commissioner Denham signals a fresh willingness to consider approaches to de-identification, which could have broader implications for thinking about the role of de-identification in balancing the protection of privacy against the use of new technologies.
What Is Tokenization?
In essence, tokenization is the process of replacing a meaningful data value with a randomly generated token that has no meaning. This process can be used to replace sensitive data, in this case personal information, with data that is not sensitive because it has no meaning. However, in order to ensure that the tokened data can be de-tokened, the process includes the creation of a lookup table to cross-reference the token with the original data.
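To make the mechanics concrete, the following is a minimal sketch of tokenization in Python. The function names and the in-memory dictionary are illustrative assumptions only; a production system would keep the mapping in a hardened, access-controlled data store.

```python
import secrets

# Lookup table cross-referencing tokens with the original values.
# Illustrative only: a real deployment would persist this in a protected store.
token_table = {}

def tokenize(value: str) -> str:
    """Replace a sensitive value with a random token and record the mapping."""
    token = secrets.token_hex(16)  # random; not derived from the value itself
    token_table[token] = value
    return token

def detokenize(token: str) -> str:
    """Recover the original value from the lookup table."""
    return token_table[token]

token = tokenize("Jane Doe")
print(token)              # e.g. 'a3f91c...', which reveals nothing about the name
print(detokenize(token))  # 'Jane Doe'
```

Because each token is generated independently of the value it replaces, possession of the tokened data alone provides no route back to the original information; that route exists only through the table.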
The process of tokenization is different from encryption because the token is random and is not derived from the original data by an algorithm. The concern with encryption is that a person with access to the encrypted data set could, with enough computing power, recover the key or otherwise break the encryption and decrypt the data. Of course, with strong encryption methods, the risk may be more theoretical than practical. Nevertheless, to date, encrypted data has not been exempt from the application of BC FIPPA’s restrictions on international data transfer.
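The contrast can be illustrated with a short sketch; the use of the third-party cryptography package here is an assumption made purely for illustration.

```python
from cryptography.fernet import Fernet  # assumes the cryptography package is installed

key = Fernet.generate_key()
f = Fernet(key)

ciphertext = f.encrypt(b"Jane Doe")
# The ciphertext is mathematically derived from the plaintext and the key, so
# anyone who obtains or recovers the key can decrypt it, wherever it is stored.
print(f.decrypt(ciphertext))  # b'Jane Doe'

# A token, by contrast, is pure randomness: without the lookup table there is
# no mathematical relationship to attack, however much computing power is available.
```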
The Proposed Scheme
The issue of whether tokenization could prove to be a solution to the restrictions of BC FIPPA arose in the context of a privacy impact assessment conducted by the Ministry of Social Development and Social Innovation for its Services to Adults with Developmental Disabilities (STADD) program. A critical objective of the STADD program is to coordinate the numerous professionals who might interact with and support an individual with a developmental disability. To accomplish this, the ministry proposed to develop an integrated network with a common assessment platform and an online collaborative planning space for these professionals.
Information used in the STADD program would be hosted outside of Canada. However, the blocking provision contained in section 30.1 of BC FIPPA states that “[a] public body must ensure that personal information in its custody or under its control is stored only in Canada and accessed only in Canada” unless one of the exceptions applies. The exceptions include express consent in a prescribed form, which is usually impractical for a large government initiative such as STADD. Other exceptions are where the information is stored or accessed in another jurisdiction for the purpose of a disclosure permitted by BC FIPPA or where the public body is collecting a payment. None of the exceptions applied to the STADD program.
The ministry proposed to comply with section 30.1 of BC FIPPA by replacing all information that would have the potential to identify an individual with random tokens prior to the information being transferred outside of Canada. The table used for de-tokening would remain in Canada and would not be accessible outside of Canada. Only Canadian government employees would have access to the de-tokened information.
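A hypothetical sketch of that flow follows; the field names and the record structure are invented for illustration and make no claim to reflect the ministry’s actual design. The point is simply that identifying values are replaced before a record leaves Canada, while the lookup table never does.

```python
import secrets

IDENTIFYING_FIELDS = {"name", "address", "health_number"}  # assumed field names

token_table = {}  # the de-tokening table: would remain on systems located in Canada

def tokenize_record(record: dict) -> dict:
    """Return a copy of the record in which identifying values are tokenized."""
    outbound = {}
    for field, value in record.items():
        if field in IDENTIFYING_FIELDS:
            token = secrets.token_hex(16)
            token_table[token] = value   # mapping stays in Canada
            outbound[field] = token
        else:
            outbound[field] = value      # non-identifying data passes through unchanged
    return outbound

record = {"name": "Jane Doe", "health_number": "123456", "support_plan": "respite care"}
print(tokenize_record(record))  # only tokenized values would be hosted outside Canada
```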
Not Personal Information
Whether the ministry’s proposal complied with section 30.1 of BC FIPPA depended on whether the tokened data continues to be personal information. The prevailing test in Canada for whether data is personal information is whether there is a reasonable expectation that the individual could be identified when the data in issue is combined with information from sources otherwise available.
Arguably, the token remains personal information since the token could be combined with the information in the token table to create de-tokened data. However, Denham stated that the test for what constitutes personal information should be applied contextually. The purpose of section 30.1 of BC FIPPA was to protect against disclosure to foreign governments and their agencies. Therefore, in applying the test for whether the information was personal information for the purposes of section 30.1 of BC FIPPA, the question was whether anyone outside of the ministry who gained access to the tokened information would be accessing information about an identifiable individual.
Denham concluded that the token would not be personal information if adequate tokenization was applied and the table used to de-token the data was protected and not accessible outside of Canada. If these conditions were met, the data could be transferred to the United States without violating section 30.1 of BC FIPPA. This would be a major advance in the ability of public bodies in British Columbia to access and use cloud-based services.
Lingering Concerns
Denham was, however, cautious in her report. She noted that not all information would be tokened under the STADD program. Denham concluded that the non-tokened information required analysis to ensure that it would not be capable of being used to re-identify the tokened information. In other words, the surrounding information in the records might be able to be combined to identify the individual notwithstanding the tokenization of the more sensitive information.
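A toy sketch illustrates that linkage risk; the fields and records are invented for illustration only. Even with direct identifiers tokenized, a combination of the remaining fields that is unique within the data set can act as a fingerprint for one individual.

```python
from collections import Counter

# Invented non-tokenized fields that might accompany a tokenized record.
records = [
    {"birth_year": 1987, "postal_prefix": "V6B", "service_region": "Vancouver"},
    {"birth_year": 1987, "postal_prefix": "V6B", "service_region": "Vancouver"},
    {"birth_year": 1992, "postal_prefix": "V8W", "service_region": "Victoria"},
]

# Any combination of surrounding values that occurs only once singles out a
# record and could be matched against outside sources to re-identify the person.
combos = Counter(tuple(sorted(r.items())) for r in records)
unique = [dict(c) for c, n in combos.items() if n == 1]
print(unique)  # [{'birth_year': 1992, 'postal_prefix': 'V8W', 'service_region': 'Victoria'}]
```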
Another concern for Denham was whether a U.S. court or government agency could order a U.S. company to provide access to the table. Obviously, if the table were solely within the control of the ministry, this would minimize the risk of the extraterritorial reach of U.S. laws. In other cases, Denham noted that a number of considerations would come into play, including the degree of control a U.S. company may have over the entity holding the table. If that entity is a subsidiary of a U.S. company, the degree of control over the Canadian entity, including its board of directors, would have to be analyzed.
The Path Forward
Commissioner Denham’s report is significant. Although Denham noted that there were still issues to be resolved, the report suggests that, in theory, tokenization could be a process to successfully de-identify personal information for the purposes of complying with BC FIPPA. The process may also satisfy the requirements of the Nova Scotia and New Brunswick legislation and provide an alternative method of addressing concerns of organizations with respect to using cloud computing services hosted in the United States (leaving aside whether those concerns are justified, which is a matter of debate).
But perhaps even more importantly, Denham’s openness to considering this method of de-identification illustrates a practical commitment on the part of Canada’s data protection authorities to revisiting the issue of de-identification, which could have broader implications for data processing and data use.