An Irish data protection commissioner said privacy and data protection folks may not have to go too far afield to bring their institutions up to speed on EU artificial intelligence literacy requirements.
Commissioner Dale Sunderland pointed to several parallels with the EU AI Act's literacy requirement, which states entities must ensure staff who use AI have a sufficient understanding of how it works and how to prevent any harms arising from its use. Those provisions, Sunderland said, are not too different from the EU General Data Protection Regulation's terms requiring understanding of technologies and how they can affect people's rights and freedoms. Data protection officers are also familiar with the GDPR's mandate that they raise awareness and train staff on best practices.
"Data protection has always been aimed at assuming a baseline understanding at an organizational level, which can be a safeguard measure in itself, and this concept can be leveraged for AI knowledge and literacy," he said. "I don't think we're starting from scratch. They're very different, the context is a bit different, the complexity of technology is absolutely different, but the core foundational principles are very much the same."
Given the similarities between the AI Act and the GDPR, Sunderland said, "it makes no sense to me, as a regulator, for any organization, and even for regulators to treat them as entirely separate matters."
The comments came during the IAPP AI Governance Global Europe 2025 conference and shed some light on how Ireland, which regulates some of the biggest companies in the technology industry, may view the task of AI literacy, one of the first requirements of the bloc's landmark AI regulation. The European Commission has laid out detailed guidelines and is maintaining a repository of information on how to meet this requirement.
There is no fine explicitly linked to AI literacy in the law. While Sunderland said it is possible Article 4 could be enforced on its own, he said it would more likely be a factor AI regulators consider when assessing violations of the law once it starts taking broader effect this August. He pointed to Article 15, which regulates the level of accuracy high-risk systems are supposed to achieve, and whether designers or developers of AI are able to provide suitable literacy training.
"It could potentially be ... seen as a contributing factor to a lack of appropriate ... for example, cybersecurity, or lack of due diligence in dealing with bias in arguments," he said.
AI literacy is a foundational concept, Sunderland later added, saying it also speaks to an organization's ability to understand why it is using AI and how specific uses can benefit the company in the long run. The risks of not understanding that go beyond the business.
"I think the issue with AI and its enormous potential, and I would think quite low understanding of how the technology works, is that when something goes wrong, it can go really badly wrong, with huge impacts for individuals," he said.
Not being able to demonstrate good AI literacy could also affect how customers view an enterprise, said Jasmien César, AIGP, CIPT, senior managing counsel for privacy, AI and data at Mastercard. Customers will be looking to understand what value AI adds to their lives and are more likely to choose a company whose AI governance they have confidence in, she said.
"I think it's something that we saw with GDPR, is customers, consumers, investors, they really start focusing on, what are you doing in terms of privacy and data protection? What do you have in place?" César said.
There are ways to make AI literacy implementation more approachable. César said it can be helpful to start by considering what baseline knowledge all employees need to have about AI. She advised the audience to consider role-specific training, noting a data scientist does not need machine learning explained to them, but those interacting with high-risk systems, such as HR staff, need a baseline understanding of how the EU AI Act works and how to apply it when using AI.
But, she said, AI literacy can be an upskilling opportunity for employees, and she encouraged people to see it as a chance to teach employees about responsible AI as a whole.
"Don't approach it as, 'it's an EU thing, so I'm going to make it EU focused, and I'm only going to train my EU employees or people who are interacting with the EU,'" César said.
Caitlin Andrews is a staff writer for the IAPP.