Europe Data Protection Digest | Notes from the IAPP Europe Managing Director, 18 Sept. 2020

Greetings from Brussels!

The fight against child sexual abuse and exploitation is an area of increasing concern in Europe (as elsewhere). Last week, the European Commission proposed a new Interim Regulation to enable online communications service providers to continue detecting child sexual abuse material online. This interim measure was deemed necessary because, with the full application of the European Electronic Communications Code from 21 Dec., certain online communication services, such as webmail or messaging services, will fall within the scope of the ePrivacy Directive. The proposal comes in the wake of a recent Europol report of a shocking 106% increase in the amount of child sexual exploitation online over the last year, particularly during the current COVID-19 pandemic.

The European Commission decided to act fast to close a legislative gap in the EU telecoms regulatory framework. For context, the ePrivacy Directive does not provide an explicit legal basis for the voluntary processing of content or traffic data for the purpose of detecting such harmful material. Technically, OTT service providers would have to discontinue any monitoring efforts to that end unless individual member states had specific national legislative measures in place to address the issue. The interim proposal is designed to address this gap by temporarily limiting the confidentiality of communications services so that OTT service providers are not prevented from voluntarily monitoring their users to detect abuse after December. The provisions as proposed are set to last until 2025.

In a press statement, the commission said that “the proposed regulation provides guarantees to safeguard privacy and protection of personal data. It has a narrow scope limited to allowing current voluntary activities to continue, subject to the General Data Protection Regulation, and data processing will be limited to what is necessary to detect and report suspicious cases.”

The commission began work this year on supporting industry efforts in the fight against child sexual abuse online under the “EU Internet Forum.” The forum brings together all EU Home Affairs Ministers, high-level representatives of major technology companies, the European Parliament and Europol. Since 2015, the model has proved successful as a cross-sector collaboration in the fight against terrorist content online and has now expanded its mission to cover child sexual abuse online.

One specific initiative under the forum was the establishment of a technical process group to map and assess possible solutions to allow companies to deploy detection mechanisms in end-to-end encrypted communications, in full respect of fundamental rights and without creating new vulnerabilities. Technical experts will examine possible solutions focused on devices, server networks and encryption protocols that could continue to ensure the privacy and security of electronic communications and the protection of children from abuse and exploitation. A report addressing both the regulatory and operational challenges is expected by the end of 2020.

On the surface, this is a fairly complex conundrum to solve: identifying technical solutions for E2E encryption that allow selective detection of electronic content while preserving the benefits the encryption is intended to provide. It seems somewhat incompatible, even paradoxical, from the outset. Invariably, data would still need to be filtered to detect harmful content prior to encryption, and it may be inevitable that encryption services are undermined whatever the outcome. A potential concern from a privacy perspective is the perception that the right to privacy is left increasingly vulnerable as we look to reconcile it with fighting crime. There can be no doubt that a more robust approach to combating such online abuse is necessary. It will be of interest to see whether viable AI technology can be developed in this context while respecting privacy and without resorting to "back doors."
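To make the "filter before encryption" idea concrete, here is a minimal, purely illustrative Python sketch of client-side scanning: content is checked against a list of known-bad hashes before it enters the E2E encryption pipeline. The hash list, the `should_block` helper and the `encrypt`/`transmit` callables are assumptions for illustration only; real systems rely on perceptual hashing and classifiers rather than exact digests, and none of this reflects any specific proposal under discussion.

```python
# Hypothetical sketch only: client-side filtering *before* end-to-end
# encryption, via an exact match against a list of known image hashes.
# All names and values here are illustrative assumptions.
import hashlib

# Assumed to be distributed to the client by a trusted authority (placeholder digests).
KNOWN_ABUSE_HASHES: set[str] = {
    "placeholder-digest-1",
    "placeholder-digest-2",
}


def should_block(attachment: bytes) -> bool:
    """Return True if the attachment matches a known-bad hash."""
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in KNOWN_ABUSE_HASHES


def send_message(plaintext: bytes, encrypt, transmit) -> None:
    """Scan content client-side, then hand it to the E2E pipeline.

    `encrypt` and `transmit` stand in for the messenger's existing
    encryption and delivery functions.
    """
    if should_block(plaintext):
        # A real deployment would trigger a report, not simply refuse to send.
        raise ValueError("content matches known abuse material")
    transmit(encrypt(plaintext))
```

Even in this simplified form, the tension is visible: the scanning step necessarily sees the unencrypted content on the device, so the confidentiality guarantee only holds against parties downstream of that check, which is precisely the privacy question the forum's technical experts are grappling with.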

It is now for the European Parliament and Council to adopt this proposal.
