
The Privacy Advisor | An update of C-27 since its reintroduction in Parliament

Since its introduction in Canadian Parliament last year, Bill C-27, the Digital Charter Implementation Act, has been front and center in regulatory conversations among privacy professionals in Canada.

The omnibus legislation passed the House of Commons on second reading 24 April and contains three separate pieces of legislation: the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act.

Although privacy legislation has come to the forefront in a number of jurisdictions around the world, stakeholders in Canada are generally not optimistic about C-27's chances of passing this year, with some saying the legislation still needs work to get over the finish line.

"There is virtually no chance that C-27 will be passed this year," said nNovation Partner Shaun Brown in an email. "The parliamentary committee responsible for reviewing C-27 hasn't even started. I assume this will not begin until after the summer break, so I am doubtful that the committee review will even be done this year."

Brown said he doubts the bill will pass before Canada's next federal election, which must take place by October 2025. However, he said there could be an opportunity to remove the AIDA from the larger C-27 package so lawmakers could attempt to pass it as standalone legislation.

"I'm sure it is possible that the Artificial Intelligence Data Act could be carved out," Brown said, though he's unsure if it will happen. "I would understand the inclination to do so, because regulating AI is probably even harder than regulating personal information. But, if AI is the pressing policy issue that many people believe, that should be the priority over privacy law amendments."

OPC and stakeholders outline current positions

On 11 May, Privacy Commissioner of Canada Philippe Dufresne shared recommendations to the House of Commons' Standing Committee on Industry and Technology, the committee C-27 was referred to this session.

Dufresne's recommendations include recognizing privacy as a fundamental right for Canadian citizens and providing them the right to have their data deleted, regardless of whether the entity storing it has a retention schedule. Overall, he told the parliamentary committee the CPPA "is a step in the right direction, but it can and must go further" to protect Canadians' privacy.

"I'm looking forward to bringing (these recommendations) and discussing them with the committee in its important study of the bill," said Dufresne, who will deliver the opening keynote address at the IAPP Canada Privacy Symposium 24 May. "I know that there will be many stakeholders who will be bringing their own perspective … and it'll be an important discussion that takes place in the committee."

One major sticking point between industry and the government is the wording around deidentified and anonymized data, particularly how the legislation should treat the likelihood that anonymized data could be reidentified.

Rogers Communications Chief Privacy Officer Deborah Evans, CIPP/C, said her company's view is that the Office of the Privacy Commissioner's push for provisions defining anonymized data to require a firm "no risk of reidentification" sets too high a bar for companies to meet.

"As a company, we think the framework needs some enhancements and the Privacy Commissioner sees it a little bit more on the opposite side," Evans said. "We want the definition for anonymized data to say 'no reasonably foreseeable (risk of reidentification),' and he wants a firm no real risk of reidentification). That doesn't mean we're worlds apart, and we can't find some sort of common ground."

One of Dufresne's recommended improvements to C-27 was strengthening the "framework for deidentified and anonymized information." He said the definition of deidentified data is a lower threshold that is kept "within the scope" of the CPPA, whereas, with the way anonymized data is currently defined in the proposed legislation, some anonymized data could be processed outside the scope of the bill.

"That is going to be one of the many important issues at the committee, and it's important that the perspective of industry be presented and be heard," Dufresne said. "We've made recommendations on both of those definitions, so with deidentification we want the issue of the risk of reidentification to be a factor to be considered in terms of the measures that are required. On anonymization we've recommended to remove one element of the definition so that the focus really is on there being no possibility of reidentification."

Overall, Evans said, Rogers supports the government's goals outlined in C-27. Her company views the bill as "housekeeping legislation" and agrees with 98% of it, and she hopes to see wider approval among stakeholders and broader consensus among lawmakers. Once C-27 ultimately passes, Evans said a critical component of the law will be its implementation period for businesses.

"Particularly for large companies like us, we have complex (information technology) systems and need a runway in order to get our IT ready to make these changes," she said. "Domestically, here with Quebec, they had a three-phased approach over three years. The GDPR had a similar approach. The California Bill had a similar approach. For us, that's one of the most important aspects with this bill."

C-27 adequacy questions

However, as the Standing Committee on Industry and Technology reviews C-27 later this session, some maintain the bill still needs wholesale changes.

Carleton University Privacy Manager Pierce White-Joncas, CIPP/C, CIPM, said C-27 still poses issues for Canadian citizens because it "gives too much power to businesses." He called the legislation incomplete and is concerned it will not meet adequacy standards in the EU and the U.K.

He said a sometimes-overlooked element of the current Canada-EU adequacy agreement is that Canada's Personal Information Protection and Electronic Documents Act was only granted adequacy by the EU for commercial purposes. Noncommercial entities, such as universities, are left to sign standard contractual clauses or rely on GDPR Article 49 derogations, which are not to be used for "continuous and repeated processing."

"I'm fearful that our trading partners are not going to want to continue to do business with us because we are stepping away from adequacy and not moving towards it," White-Joncas said. "Anything outside of a commercial context currently doesn't have adequacy to do business in Europe, which means that organizations, (as) 'creatures of the province and creatures of provincial legislation,' don't get to leverage PIPEDA for transfers of personal data to European institutions."

Brown said he views the adequacy concern as unfounded. Institutions such as universities, he said, would still rely on a separate set of provincial laws governing their ability to transfer data beyond Canada's borders and would not be subject to the CPPA.

"I don't see it moving away from adequacy at all," Brown said. "The introduction of the tribunal and penalties addresses the main concerns privacy advocates have had for years."

Proposed Data Protection Tribunal: unnecessary oversight or effective check?

For White-Joncas, another point of contention with C-27 is the establishment of a three-member body to hear appeals of OPC decisions under the PIDPTA. He said the act creates an additional layer of bureaucracy for citizens to navigate when their right to privacy is violated.

"Why is there more administrative burden for Canadians to have to achieve their privacy rights by filing a complaint with the company, filing a complaint with the Privacy Commissioner and then having to go to this tribunal?" White-Joncas said.

Brown said he believes the creation of an independent tribunal is necessary to counterbalance the OPC's power, though he would like to see a requirement that one member of the tribunal have relevant technology business expertise.

"We've seen examples in Canada, and around the world, of what can happen when investigatory and adjudication functions are rolled into a single regulatory body. It looks like a competition among regulators of who can impose the biggest fine," Brown said. "Many regulators have lost all perspective over what's a stake in a lot of these cases."

Dufresne said that, in many instances, a federal court gives final approval of OPC enforcement decisions, and he therefore does not see the creation of a tribunal as an encroachment on his office's independence. However, he recommended the process for citizens lodging complaints about privacy violations be made "more expeditious and economical by streamlining the review" of OPC decisions.

"There are going to be reviews of the decisions of the commissioners and that's necessary. That's appropriate. Currently, a federal court plays that role," Dufresne said. "If there is going to be an additional step in the process, we should remove one step so it doesn't make it longer and more expensive. If there needs to be a tribunal, the appeals from that tribunal should go directly to the Federal Court of Appeal."

AI and Data Act too vague? 

As for the final piece of legislation contained within C-27, White-Joncas said the AIDA is another component of C-27 that needs work. He said it is confusing to some privacy pros that Parliament is pushing the bill to the finish line, but lawmakers are not working to develop the specific regulations prior to passing the legislation.

"Aren't you opening up the door for a lot of malicious activity to happen if the regulations aren't in place, but the bill is in force?" White-Joncas asked.

Brown said that, while AIDA is little more than a basic framework outlining the government's goals in regulating AI, the technology's rapid development may necessitate passing imperfect regulations if the end result is that the Canadian government is better prepared to respond to negative outcomes produced by AI.

The act is "arguably undemocratic, because regulations are made by the government and not Parliament," Brown wrote. "At the same time, however, AI is an extremely fast evolving and difficult area to regulate, and regulations are much more flexible and adaptable to keep up with changes. If the government is required to back to Parliament every time a change is required, that process tends to take years."

