The European Commission's “Have your say” portal is always an interesting source of insights into its thinking on initiatives that may lie ahead, as well as on the implementation of existing laws. For the next few weeks, Brussels is seeking input focused on the protection of minors online, specifically in the context of the Digital Services Act and reporting of child sexual abuse materials.
Under the Digital Services Act, the Commission has to develop guidelines to help online platforms in scope meet requirements to provide a high level of privacy, safety and security for minors. “These guidelines will provide a non-exhaustive list of good practices and recommendations for online platform providers to help them mitigate risks related to the protection of minors. … The guidelines will also help the Commission and the Digital Services Coordinators to supervise platforms and to enforce the DSA.”
The development of the guidelines, writes the Commission, should be guided by the best interests of the child and a risk-based approach to online harm, suggesting a “child-specific impact assessment.” Interestingly, though such an assessment might be considered de facto necessary to meet the DSA's Article 35 obligations related to minors, it is not specifically required by the DSA as approved by the co-legislators.
The feedback period is open through 30 Sept., and the guidelines are expected in the second quarter of 2025, with their effectiveness to be assessed in 2027.
In parallel, the Commission's Directorate-General for Migration and Home Affairs aims to improve reporting of child sexual abuse material under the CSAM legislation. The CSAM file was one of the most contentious debated during the past year, and its reform has not yet been finalized. To ensure progress on reporting requirements under the current legislative framework, the Commission developed a standard form aimed at providing more uniformity across the region on the data to be included in reports by companies and organizations combating online child sexual abuse.
The form would require detailed reporting on nine primary categories, including the types and volumes of data processed, retention policies and safeguards applied, the specific grounds relied on for processing and transfers under the EU General Data Protection Regulation, and the number of cases of CSAM identified.
Data points across these categories include, for instance, the number of bytes of text processed to detect grooming in online exchanges involving non-EU users, the average time needed to decide whether to restore or maintain the suspension of user accounts in the EU, and error rates in the automatic flagging of user accounts.
The feedback period is open through 5 Sept., and there will be more to come on this file in the new legislative term.
Isabelle Roccia, CIPP/E, is the managing director, Europe, for the IAPP.