
The Privacy Advisor | Parting thoughts: A Canadian privacy one-on-one with Daniel Therrien



Canada's federal privacy regime is on the cusp of an overhaul with the recent introduction of Bill C-27, an omnibus legislative package for data protection and artificial intelligence. Many have called for this reform in recent years to address growing privacy issues. Former Privacy Commissioner of Canada Daniel Therrien was one who saw the need and strongly advocated for change during the final years of his eight-year term.

Therrien completed his tenure as commissioner June 3, spending some time at the IAPP Canada Privacy Symposium during his final days on the job. He sat down with The Privacy Advisor for a one-on-one interview to expand on his keynote speech regarding the future of Canadian privacy, going in-depth on his hopes for reform — which were discussed before Bill C-27's introduction — and other work to combat specific issues the country faces.

The Privacy Advisor: You’re a clear proponent for updating or replacing Canada's Personal Information Protection and Electronic Documents Act. Just briefly outline why we’ve reached a point where detailed reform is absolutely essential.

Therrien: It took a few years before I talked about law reform. I started by setting certain priorities … trying to understand practices and how they affected privacy in these areas. We developed policy on consent and consulted many stakeholders, including industry stakeholders, to get a sense of the issues. Then the investigations started, and that's where we had the clearest demonstration that something needs to be addressed because the regime of the current laws is insufficient. We investigated Facebook's Cambridge Analytica issues and the smaller British Columbia-based company AIQ that used data from Cambridge Analytica. … To me, that matter illustrated for the first time — something that came across again later — the clear and immediate link between privacy protection and the protection of other rights.

People are now more aware, and I'm not saying people have privacy discussions at the dinner table every day, but I think people are more aware today than they were 10 years ago that privacy is not something arcane and esoteric that very few people are interested in. There is a link between privacy protection and real stuff. … We often hear about privacy in terms of risks. I think we're beyond that now. We've seen there have been demonstrated harms through privacy violations — not only to privacy itself, which is bad enough, but to other interests like democracy or not being under constant surveillance. That clearly demands a better legal framework, including certain sanctions for companies that breach the law.

The Privacy Advisor: How disappointing was it to see Bill C-11 — Canada's previously proposed data protection reform bill introduced November 2020 — fall off the table without action, and now seeing the government have to restart its drafting of the bill? Was Bill C-11 going to be effective legislation as proposed?

Therrien: It would've been good for that bill to go through parliamentary scrutiny and public debate so that the distance between what it appeared to be or was sold as versus what it actually was could be discussed and improved. But I've said the bill was a step backwards. I'm not happy it didn't become law; I would've preferred for the debate to continue. … I haven't had many meetings with (Minister of Innovation, Science and Industry of Canada François-Philippe Champagne), but there have been a few about (Bill C-27). He says, and I have to accept, that his department is consulting to improve the bill (compared to Bill C-11). From the conversations I've had, there will be some improvements. Less than I would've hoped, but some perhaps on the rights of children, for instance. Regardless, the next step is to have a piece of legislation before Parliament and, even if it is imperfect, have a public debate around it so that we have new legislation as soon as possible.

The Privacy Advisor: Your office recently published some updated recommendations for federal privacy law reform in anticipation of a new bill being presented to Parliament this year. What are a couple of the most meaningful or essential suggestions that you hope will be taken into consideration or adopted in the new bill?

Therrien: The document we published was a summary of the longer set of submissions we made a bit more than a year ago. There were six or seven themes in there and they're all important, but if I had to pick I think you'd need to start with balance, meaning the law needs to permit, if not promote, responsible innovation. Privacy should not be an impediment to innovation or economic growth. The balance to that is it needs to do all this in a way that respects privacy as a human right. Surveillance à la Clearview AI should be prohibited. For instance, violating privacy rights leading to unwanted political messages should be prohibited. It's a balancing proposition.

When you get into the mechanics, some are more important than others. Of course, data protection authorities seek to have order-making powers and consequential fines. Frankly, a number of companies want to do the right thing, but there are a number that see the profits that can be made by violating privacy rights. The sanctions need to be proportional to the profits that can be made.

The Privacy Advisor: As far as potential increased enforcement powers with any reform, what types of increased responsibilities or capabilities would you like to see OPC get and why is it important for the office to have them?

Therrien: Perhaps something I'm spending less time on that is equally important is the authority I think we should have, and many data protection authorities have, to do proactive auditing. The principle in common-law countries is that a regulator, prosecutor or police can only investigate if there are reasonable grounds to believe a violation has occurred. That really doesn't work in the technology world, where the technologies themselves and the business models around them are very complex.

What exists in many jurisdictions, including the U.K., Canadian provinces and EU countries under the (EU General Data Protection Regulation), is for the regulator to — based on knowledge of where the risks may lie, for instance facial recognition or AI as areas of concern — identify sectors or practices that it wants to audit or investigate without necessarily having grounds to believe a sector or company has violated the law. It's not to be nosy or to be in the way of business, but essentially to reassure a public that has no other way of having trust that practices do in fact comply with the law. In order to do that, you have to have this auditing authority to "look under the hood" with certain practices and arrive at assurance that rights have been respected or, in some cases, identify a violation that needs to be remedied.

The Privacy Advisor: Your former office's enforcement work against Clearview AI's facial recognition technology is well known. Other countries have begun enforcing against the company as well, yet it appears set on continuing its work. Looking at facial recognition in the broader context, is a federal ban something that needs to be considered or are there alternative ways in the current regulatory environment to uphold individual privacy rights and hold technology providers accountable?

Therrien: I'm not a fan of complete bans, particularly in the public sector for police. I know there are some jurisdictions that have banned the use of FRT for their forces and I don't agree because I do think, if properly used, FRT can be helpful in solving crimes and other problems. But FRT and biometrics generally raise the stakes of privacy risks tremendously for obvious reasons. You can change a password that's been hacked, but you can't change your biometrics with FRT.

This is going to be a problem for a long, long while, so to me that means the level of regulation and scrutiny needs to be heightened. In the technology world there's a consensus that it's best to work with principles-based laws applicable to all sectors and technologies, and I agree with that. With FRT, the risks are elevated to such an extent that I think there needs to be laws specific to FRT that heighten some safeguards. For instance, limiting the use of FRT to defined circumstances, as opposed to a general principle that a company can use whatever technology with consent so long as it is not "inappropriate" under a vague standard. Laws that would define permitted uses and prohibit other uses — Clearview's mass surveillance, for instance — would make sense.

Then there's the question of whether the same laws would apply to the private and public sectors. The context here is very different. With police, misuse can lead to loss of liberty, so the stakes are very high and the protections should be extremely high. For the private sector, the stakes look less dramatic, but at the same time, if you provide your biometrics to a company and it is breached and your identity becomes fair game to all kinds of bad players, then the stakes are pretty high there too.

The Privacy Advisor: What's the best way to approach quickly regulating on children's privacy to curb the fast-emerging issues that revealed themselves with kids' increased online presences since the start of the COVID-19 pandemic?

Therrien: On this issue, as with many others, you start with high-level principles such as the Convention on the Rights of the Child. So when dealing with the data of children, it should be from the perspective of what is in the best interest of the child. That seems really plain and simple, but you have to start at that high level and then look to something like the U.K. Age Appropriate Design Code, which goes into a bit more detail. For individual sectors, you go even further into the weeds as far as company practices and how the principles on the child's interest apply. It's a tough business. Things move all the time, but it's about having a combination of the right principles, working through details, having conversations, and ultimately having codes or decisions by regulators on what general principles should mean on the ground.

The Privacy Advisor: Do you have any regrets about your time as commissioner? Anything you would’ve done differently, tried to get ahead of or advocated harder for?

Therrien: Obviously, I would've liked to see our proposals debated in Canadian Parliament so that we have modernized laws. Even if the proposal we submitted was met halfway, I would've been happy. But we need to make progress. It's not right that we have this 20-year-old law and then see problems — not theoretical risks but real harms to real people — that are not properly addressed. But that's not really under my control.

I'm proud of the fact that we identified issues, not because we wanted to find problems but because we were honestly looking to understand privacy in relation to various practices, the economy, et cetera. We were able to help raise the level of awareness among Canadians as to the importance of privacy. It does truly affect people, and the population has an understanding due to a number of factors, but we did contribute to this, and that has to eventually lead politicians to act and reassure people.
