Last May, the European Commission presented a proposal to fight child sexual abuse material (CSAM). The proposed legislation has spurred controversy as it touches upon the delicate issue of private interpersonal communications and might affect end-to-end encryption.
The issue of child pornography has exploded in recent years: abusive online content has increased 60-fold over the last decade. While content production mainly takes place in the "global south", notably Africa, Latin America and developing parts of Asia, Europe holds the hideous record of hosting 86% of the CSAM discovered by the Internet Watch Foundation in 2020.
In this context, the EU's regulation to fight CSAM introduces an obligation for technology companies providing hosting or communication services to conduct risk assessments and adopt corresponding mitigation measures to prevent the dissemination of child pornography.
The relevant national authorities would be in charge of assessing whether such measures would provide adequate protection for minors online or if further corrective measures would be needed.
In this regard, the legislation follows the same path as the recently adopted Digital Services Act, which will mandate the same mitigation measures for societal risks on large online platforms, namely those with more than 45 million users in the EU — 10% of the bloc's population.
The main difference is that the CSAM regulation provides EU authorities with the possibility to ask a judge to issue a detection order if there is evidence the service is being used to abuse children online and when "the reasons for issuing the detection order outweigh negative consequences for the rights and legitimate interests of all parties affected."
The detection order varies according to the nature of the online abuse, which can involve two types of content: multimedia content, such as photos and videos, and text content aimed at "grooming," the practice of luring a child into a meeting in an attempt to commit a sexual offense.
Civil society organizations have accused the commission of choosing the wrong approach to tackle the problem, disproportionately opening the door to surveillance practices that could then be redeployed to, for instance, identify political opponents.
By contrast, NGOs focused on child protection have welcomed the proposal, especially as it puts in place preventive measures, introduces obligations to assess risks and mandates the use of safe technology to tackle this crime at scale.
State of play
Currently, EU legislation allows online platforms to put in place voluntary measures to detect known CSAM under the ePrivacy Derogation, a temporary measure that expires in 2024. This framework derogates from the ePrivacy Directive, which was meant to be repealed by the ePrivacy Regulation, a proposal indefinitely stuck in legislative limbo.
These voluntary measures usually take the form of automated content recognition tools that compare hashes, strings of data derived from a video or image, against databases of known material. These automated tools have been criticized for having a high error rate and are therefore subject to human oversight.
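To illustrate the principle, here is a minimal sketch of hash-based matching. It is a deliberate simplification: real deployments use perceptual hashes, such as Microsoft's PhotoDNA, that survive resizing and re-encoding, whereas the cryptographic hash used here for brevity only catches byte-identical copies. The database and function names are hypothetical.

```python
import hashlib

# Minimal sketch of hash-based matching against a database of known content.
# SHA-256 is used only for simplicity; production systems rely on perceptual
# hashes that tolerate re-encoding and resizing.

KNOWN_HASHES: set[str] = {
    "<placeholder hex digest>",  # hypothetical entry, not a real hash
}

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of an uploaded file."""
    return hashlib.sha256(data).hexdigest()

def is_known_material(data: bytes) -> bool:
    """Flag an upload whose hash appears in the known-content database."""
    return file_hash(data) in KNOWN_HASHES
```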
Moreover, technology companies like Meta have been working for years on detecting suspicious users via behavioral patterns. For instance, a user who sends many messages to strangers whose numbers they do not have, or who gets very few replies back, would be flagged as anomalous.
"There is a general misconception that to keep people safe online, you have to read every message that they send, we fundamentally disagree with that idea. We have been responding to legal requests and effectively cooperating with law enforcement for years without breaking end-to-end encryption," a Meta spokesperson said.
Behavioral analysis is less intrusive than content detection as it is based on metadata: information about how, when and where a communication was sent. No tech company carrying out this sort of operation makes many of these details public, since malicious actors would otherwise learn how to circumvent them.
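As a purely illustrative sketch of the kind of rule such a system might apply, the following flags accounts that mass-message strangers and receive almost no replies. All field names and thresholds are invented for this example; real systems combine far more signals and, as noted above, keep them confidential.

```python
from dataclasses import dataclass

@dataclass
class UserStats:
    messages_sent: int        # messages sent in a given time window
    unique_recipients: int    # distinct accounts contacted
    replies_received: int     # messages received back
    in_contacts_ratio: float  # share of recipients who saved the sender

def is_anomalous(stats: UserStats) -> bool:
    """Flag accounts that mass-message strangers and get almost no replies."""
    reply_rate = stats.replies_received / max(stats.messages_sent, 1)
    return (
        stats.unique_recipients > 50
        and stats.in_contacts_ratio < 0.1
        and reply_rate < 0.05
    )
```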
The encryption issue
However, the commission was not satisfied with this voluntary approach.
"What we have put in place is to try to be technology neutral in the legislation. Because we see how quick technology is developing," European Commissioner for Home Affairs Ylva Johansson said.
Yet in the impact assessment accompanying the proposal, the EU executive assessed how the proposal would apply to the different technologies, envisaging two ways to bypass end-to-end encryption, a security method whereby only the sender and the receiver can read the content of a communication.
One option sees the message sent not only to the messaging service's server but also to a separate "scanning" server. Security experts warn that this backdoor would create a vulnerability that hackers and other malicious actors would eagerly exploit.
The other scenario is on-device scanning, which tries to detect the material on the sender's end. Where grooming is concerned, the receiver also enters the picture, as the platform would need to know that a child is involved, but the technology in this area is still at an embryonic stage.
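A minimal sketch of the on-device scenario follows, with every name and routine invented for illustration. The point is simply that the check runs on the sender's device before encryption, so the scan sidesteps the encrypted channel rather than decrypting it; critics argue this defeats the purpose of end-to-end encryption all the same.

```python
import hashlib
from typing import Callable

KNOWN_HASHES: set[str] = {"<placeholder digest>"}  # on-device hash database

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Stand-in stub for a real end-to-end encryption routine."""
    return plaintext  # illustration only, performs no actual encryption

def send_message(plaintext: bytes, key: bytes,
                 transmit: Callable[[bytes], None]) -> None:
    # The local scan happens before the message is encrypted and sent.
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in KNOWN_HASHES:
        pass  # in the proposal's logic, a report would be triggered here
    transmit(encrypt(plaintext, key))
```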
All these options essentially break end-to-end encryption, as they defeat its very purpose. Moreover, they face huge practical problems when it comes to new CSAM, as the idea is to employ machine learning techniques to train classification algorithms to spot similarities with known child pornographic content.
"The Commission says this is not about encryption but child sexual abuse material. It's actually about both because they motivate the proposal stating the difficulty of detecting CSAM when companies use end-to-end encryption. Otherwise, most of the proposal is unnecessary," Irish Council for Civil Liberties Technology Fellow Kris Shrishak said.
False flags
In August, Commissioner Johansson wrote in a blog post that these sorts of tools have an accuracy rate above 90%, in some cases even 99%. Following a Freedom of Information request, published documents revealed this figure was taken from Meta's percentage of content removed before a user reported it.
The lack of an independent assessment of this precision rate prompted a backlash, as false positives containing legitimate intimate content, such as sexting, would be passed on for human review.
The accuracy of these automated tools was also among the 61 questions that Germany sent to the commission in June. In the coalition agreement underpinning the new German government, the three parties committed to defending encryption against proposals that would introduce generalized scanning.
The commission replied that the planned EU Centre on Child Sexual Abuse would verify the illegal nature of flagged content and send the material to law enforcement agencies. Still, it remains unclear how the EU body would cope with the massive amounts of false positives that even a 10% error rate would entail.
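To see the scale of the problem, here is a back-of-the-envelope calculation with deliberately invented volumes; only the 10% error rate comes from the discussion above.

```python
# All volumes below are hypothetical, chosen only to show orders of magnitude.
daily_messages = 10_000_000_000  # assumed EU-wide messages scanned per day
flag_rate = 0.001                # assumed share of messages flagged
error_rate = 0.10                # the 10% error rate cited above

flagged = daily_messages * flag_rate    # 10,000,000 flags per day
false_positives = flagged * error_rate  # 1,000,000 wrong flags per day

print(f"{flagged:,.0f} flagged, {false_positives:,.0f} false positives daily")
```

Even under these cautious assumptions, a million pieces of legitimate content a day would land in front of human reviewers.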
Legislative process
Progress on this file has so far been minimal, precisely because of the controversial nature of the proposal. The Czech Presidency of the Council of the European Union circulated two partial compromise texts in the Council, the body that gathers the EU member states.
A significant change introduced the possibility for authorities to issue delisting orders, requiring search engines to remove websites hosting CSAM from their search results; in practice, search engines like Google already do this. Moreover, the EU countries are discussing removing the provisions requiring the national authority to be independent.
In the European Parliament, the work has barely started. The only decision taken so far has been to assign the file to the Committee on Civil Liberties, Justice and Home Affairs, which was the home of the General Data Protection Regulation and gathers many privacy-sensitive lawmakers.
At the same time, the lead has been taken by center-right MEP Javier Zarzalejos, who is sympathetic to the needs of law enforcement. He previously spearheaded the work on revising the mandate of Europol, the EU's law enforcement agency, to crunch big data to detect suspicious behavior.
Zarzalejos has already indicated that, for him, there is a "tradeoff between privacy and security." His work will mainly focus on the technologies used, substantive and procedural safeguards, the procedure for detection orders and the role of the EU Centre.
"We need to find a proper balance between fighting CSAM and limiting fundamental rights," said progressive lawmaker Paul Tang, anticipating that the center-left is ready to defend end-to-end encryption and prevent the risk of disproportionate and generalized monitoring obligations.