Editor's note: The IAPP is policy neutral. We publish contributed opinion and analysis pieces to enable our members to hear a broad spectrum of views in our domains.

"It is an ethical responsibility towards the next generation." Cyprus, Denmark, France, Greece, Slovenia and Spain are upping discourse on the protection of minors online.

Digital, policy and artificial intelligence ministers from these countries recently shared a non-paper on "Protecting Minors from Online harms and risks," which addresses age verification, age-appropriate design and a pan-European digital age of majority.

"Poorly designed digital products and services can lead to heightened anxiety, depression, and self-esteem issues as minors are constantly exposed to trivial or comparative contents, while excessive screen time can limit the development of critical skills, alter cognitive capacities, weaken human relationships, and diminish the ability to collaborate effectively," they wrote.

This push comes against the backdrop of multiple instruments in the EU digital rulebook that already address the topic, at least in part, whether directly or tangentially: the EU General Data Protection Regulation, the Digital Services Act, the AI Act, the European Digital Identity Framework, the General Product Safety Directive, the Audiovisual Media Services Directive and rules to prevent and combat child sexual abuse, to name a few.

A 2024 compendium provides a comprehensive view of EU formal texts concerning children in the digital world, including those in force and proposed, as well as non-legislative initiatives. The compendium does not reflect the European Commission's most recent initiatives, such as the anticipated Digital Fairness Act, which may bear on children's online experience, nor the jurisprudence and best practices developing in the space.

It is also worth noting that implementation of some of these texts is still in progress. For example, the Digital Services Act has already been in force for two years, yet the European Commission is only now producing detailed guidance on how platforms should protect minors online.

Greece, France and Spain highlighted three primary challenges for the European community: the absence of a digital age of majority for online social networks; the lack of "proper and widespread age-verification mechanisms that are properly enforced"; and the need for mandatory technical standards that make platform interfaces safe, private and non-addictive by design.

These three countries are promoting new approaches and tools at the national level. The ideas range from better informing parents about existing parental controls, to mandating a legal and technical framework that conditions access to a service on a "robust and privacy-compliant age-verification mechanism," to considering mandatory parental controls and digital education programs.

The non-paper's authoring countries are calling for EU-level regulatory action toward three specific objectives:

  • "Mandatory and built-in age verification solutions and parental control software to all devices with Internet access that will be available in the European market.
  • The introduction of a European 'Digital Majority Age' for access to online social networks.
  • European Norms that will mandate age-appropriate designs, thus minimizing addictive and persuasive architectures (eg. Pop-ups, profile personalization, video autoplay)."

Some voices in the industry are already cautioning against the rapid development of new rules, suggesting the EU should instead let existing rules play out, particularly at a time when legislation like the DSA is not yet fully implemented.

Isabelle Roccia, CIPP/E, is the managing director, Europe, for the IAPP.

This article originally appeared in the Europe Data Protection Digest, a free weekly IAPP newsletter. Subscriptions to this and other IAPP newsletters can be found here.