As the privacy profession continues to grow and evolve, it's clear our job descriptions are also changing. Last year, the IAPP made a big splash by establishing the link between a privacy pro's job and that of an artificial intelligence governance pro. And it's not just AI that is being placed on our desks, but the entire world of digital data governance.

To this point, I hope you don't mind if I take this space every once in a while to write about these newer topics — even if they don't fall squarely into what was considered within the privacy box 25 years ago.

In Canada, Bill C-27 proposes to regulate how AI systems are created and deployed in the country, and I have already written a fair amount about it. This week, I want to bring to everyone's attention another federal bill that has started to make its way through Parliament: Bill C-63, which would enact the Online Harms Act.

In principle, I think everyone would agree we should not tolerate when someone posts something online that causes harm to either another individual or to a group of people. But my reading of the proposed law has me thinking the government may be overcomplicating things a tad.

The new law would create a complex set of regulatory offices — at least three of them — that will be responsible for investigating and punishing any platform that allows someone to post harmful content. I think it's an interesting policy position to take: that platforms will bear the responsibility and the risk when someone using them posts harmful material.

Most social media platforms have terms of use that prohibit members from posting harmful or discriminatory content. Many also have various mechanisms to remove anything that violates these terms, including using AI to scan and flag problematic posts and having human content moderators watching out for bad actors posting harmful stuff.

Bill C-63, again, in principle, seems like a good idea, with good intentions. For example, it amends the Criminal Code to make it clearer that individuals who do the actual posting of harmful material are committing offenses that carry severe penalties.

If the bill gets to committee before the next election, I'm sure there will be a ton of witnesses wanting to appear to make representations. So, if this falls anywhere near your responsibility, you may want to look more closely at the law and start thinking about what your organization's position is going to be.

Kris Klein, CIPP/C, CIPM, FIP, is the managing director for Canada at the IAPP.