Last week, while privacy professionals in Canada were still contemplating Bill C-26 on cybersecurity, the federal government introduced Bill C-27, the much-anticipated Digital Charter Implementation Act, 2022. It is a reintroduction of, and some may argue an improvement on, Bill C-11, which was first introduced in 2020 and died on the order paper as a result of the 2021 federal election.

The new statutory framework in Bill C-27 governs private sector personal information protection practices and, if passed, would enact the following three new statutes:

  • The Consumer Privacy Protection Act would repeal and replace Part 1 of the Personal Information Protection and Electronic Documents Act. Part 2 of PIPEDA would be renamed "An Act to provide for the use of electronic means to communicate or record information or transactions," or the Electronic Documents Act.
  • The Personal Information and Data Protection Tribunal Act would establish an administrative tribunal to review certain decisions made by the Privacy Commissioner of Canada and make orders for contraventions of the CPPA.
  • The Artificial Intelligence and Data Act, which is new and perhaps unanticipated by many, would regulate international and interprovincial trade and commerce in artificial intelligence systems by establishing common requirements, applicable across Canada, for the design, development and use of these systems.

In this article, we aim to highlight some key elements of the new bill. We don't try to cover Bill C-27 in its entirety and we do not attempt to compare it with its predecessor, Bill C-11. Many folks sharing red-lined documents that compare the bills on social media have already done a good job of that!

Consumer Privacy Protection Act 

The CPPA grants the Privacy Commissioner of Canada broad order-making powers and prescribes significant administrative monetary penalties of up to the higher of $10 million CAD and 3% of global revenue. Fines are augmented for serious contraventions, which constitute offenses that may attract a maximum penalty of the higher of $25 million CAD and 5% of global revenue. Additionally, a new private right of action is conferred on any individual who suffers loss or injury as a result of a contravention of the CPPA.

Consent continues to be an important gatekeeper, but the CPPA lifts some burden from the individual to understand and give consent by focusing more on the organization's accountability and transparency. For example, there is a new requirement for an organization to implement a privacy management program. In developing such a program, the organization must consider the volume and sensitivity of the personal information under its control. The commissioner may access the policies, practices and procedures developed under the privacy management program and, after reviewing them, provide guidance on or recommend corrective measures for the organization about its privacy management program.

To allow businesses flexibility, the CPPA permits the following exceptions to the consent requirement under the heading of business operations:

  • Business activities.
  • Transfer to a service provider.
  • Deidentification of personal information.
  • Research, analysis and development.
  • Prospective business transaction.
  • Information produced in employment, business, or profession.
  • Employment relationship: federal work, undertaking or business.
  • Disclosure to lawyer or notary.
  • Witness statement.
  • Prevention, detection or suppression of fraud.
  • Debt collection.

Of particular interest is the "legitimate interest" exception for conducting business activities. An organization may collect or use an individual's personal information without the individual's knowledge or consent if the legitimate interest in an activity outweighs any potential adverse effect on the individual resulting from that collection or use. There are two conditions: a reasonable person would expect such collection or use, and the personal information is not collected or used for the purpose of influencing the individual's behavior or decisions. The crucial element is the organization's assessment of the associated risks and benefits and how it may mitigate those risks. The organization will have to demonstrate to the commissioner that it has been diligent in its assessment efforts.

There are exceptions to the exceptions — no, that's not a typo! Some of the above exceptions do not apply if an individual's electronic address is collected using a computer program designed or marketed primarily for generating, searching for and collecting electronic addresses. The exceptions also do not apply when personal information is collected by accessing a computer system or causing a computer system to be accessed. In these cases, express consent is required.

The CPPA clarifies, to a degree, the concepts of "deidentified" and "anonymized" information. Personal information that has been anonymized is carved out of the legislation. "Anonymize" is defined as "to irreversibly and permanently modify personal information" such that it is impossible to identify an individual from it, in accordance with "generally accepted best practices." On the other hand, "deidentified" personal information is still considered personal information. "Deidentify" is defined as "to modify personal information so that an individual cannot be directly identified from it, though a risk of the individual being identified remains." An organization may use "deidentified" personal information without consent for internal research, analysis and development purposes. Additionally, an organization may disclose "deidentified" personal information for a socially beneficial purpose, without consent, to a limited list of bodies: a government institution, a health care institution or any organization that is mandated to carry out a socially beneficial purpose. The CPPA requires organizations to ensure deidentification measures are proportionate to the purpose for which the information is deidentified and to the sensitivity of the personal information.

The CPPA explicitly addresses the protection of the personal information of minors. Under the CPPA, all personal information of minors is sensitive information. However, "minor" is not defined in the federal legislation; the definition is deferred to the provinces, where the age of majority is either 18 or 19.

Currently, an individual may request access to or correction of their personal information under an organization's control. The CPPA now grants the individual the right to request, in writing, the disposal of their personal information. If the request is granted, the organization must also inform any service provider to which it has transferred the data and ensure the service provider has disposed of the information. If the organization refuses to dispose of the individual's personal information, it must inform the individual of the reason for the refusal in writing. It's also interesting that deidentification has been added to the definition of disposal in the new law.

Special note to federally incorporated businesses: the CPPA would amend the Canada Business Corporations Act so that, six years after an individual ceases to be an "individual with significant control," the corporation must dispose of that individual's personal information in the register within one year. The register of "individuals with significant control" has been a record-keeping requirement for Canadian federal corporations since 2019.

The CPPA also creates a new portability right with respect to personal information held by organizations subject to a "data mobility framework." On the request of an individual, an organization must disclose the personal information it has collected from the individual to another organization designated by the individual, provided both organizations are subject to such a framework.

Finally, the commissioner has a new power to approve a "code of practice" and "certification programs." Any private or public organization may apply to the commissioner for approval of a code of practice or a certification program that provides substantially the same or greater protection of personal information as the CPPA. However, compliance with the requirements of a code of practice or a certification program does not relieve an organization of its obligations under the law.

Personal Information and Data Protection Tribunal Act

The Personal Information and Data Protection Tribunal Act establishes the Personal Information and Data Protection Tribunal to hear appeals of the Privacy Commissioner of Canada's decisions, orders and recommendations. The tribunal is also given the power to impose penalties under the CPPA. The tribunal consists of three to six full-time or part-time members, and at least three of the members must have experience in the field of information and privacy law.

Artificial Intelligence and Data Act 

Although the definition of "automated decision system" in the CPPA includes "artificial intelligence systems" as defined in the AIDA, the focus of the two statutes on this subject is distinct. The CPPA protects individuals' rights and captures a broad range of automated decision systems that are not necessarily "autonomous" in predicting outcomes. In other words, the CPPA encompasses sophisticated computational systems with deterministic algorithms. An individual has the right to request an explanation of how a prediction, recommendation or decision having a "significant impact" on them was made.

On the other hand, the AIDA targets "real" AI systems, whose outcomes may be difficult to trace or explain from the inputs to the system. Some describe this difficulty as the "black box" of AI systems. As explainable AI is difficult to achieve, and many would argue "explainable" does not mean more accurate, governing "real" AI systems will have to rely on the organization's code of ethics.

The AIDA is principles-based, and industries are expected to comply voluntarily and demonstrate the deployment of "responsible AI." The purpose of AIDA is two-fold. First, it is to protect Canadians by ensuring "high-impact" AI systems are developed and deployed in a way that identifies, assesses and mitigates the risks of harm and biases. And second, it is to prohibit conduct concerning AI systems that may result in serious harm to individuals or their interests.

The AIDA requires an organization to establish measures with respect to the manner in which data is anonymized and the use or management of anonymized data. There are also record-keeping requirements to demonstrate those measures.

What qualifies an AI system as "high-impact" remains to be prescribed by regulation. For any "high-impact" system, an organization must publish in plain language a description of the system, including how it is used; the types of content it generates and the decisions, recommendations or predictions it makes; the mitigation measures in place; and any other information that may be prescribed by regulation.

It is unclear what measures must be implemented to identify and mitigate risks to human health and safety and how to reduce biases relating to AI systems. More clarity may be provided by regulation.

The AIDA prohibits the use of unlawfully obtained data in AI development, the deployment of an AI system where there is reckless disregard for a risk of serious harm to individuals, and deployment with fraudulent intent to cause substantial economic loss.

Finally, notification to the Minister of Innovation, Science and Industry is required if the use of the AI system is likely to result in material harm.

A senior official called the Artificial Intelligence and Data Commissioner may be designated to assist the minister in the administration and enforcement of the AIDA. Meanwhile, the minister may audit an organization and produce a report at the organization's expense and make orders where necessary. 

An organization found guilty of an indictable offense under the AIDA may be liable for a fine of up to the higher of $25 million CAD and 5% of global revenue or, on summary conviction, up to the higher of $20 million CAD and 4% of global revenue. However, an organization cannot be found guilty of an offense if it establishes that it exercised due diligence to prevent the commission of the offense. Note that an organization is vicariously liable for its employees, agents or mandataries.

An individual found guilty of an indictable offense under the AIDA may face a fine in the court's discretion and/or imprisonment of up to five years less a day or, on summary conviction, a fine of up to $100,000 CAD and/or imprisonment of up to two years less a day.

The three-pronged Bill C-27 aims to modernize the current federal privacy framework and recognize individuals' privacy rights and the benefits of data collection and use. PIPEDA was enacted at the turn of the century and has been in effect for the past 20 years; technology has rapidly progressed with the advancements in electronic storage and computing power, enabling Big Data, breakthroughs in artificial intelligence and the wide adoption of digital social platforms. The proposed legislation responds to many of these new changes.

The fate of Bill C-27, however, remains to be seen in the coming months.

nNovation's Kris Klein, CIPP/C, CIPM, FIP, Anne-Marie Hayden and Shaun Brown contributed to this article. 
