
The Privacy Advisor | Age verification and data protection: Far more difficult than it looks

The French government published Decree No. 2021-1306 on Oct. 7, 2021, concerning the implementation of measures to protect minors from accessing sites broadcasting adult content. The decree provides an opportunity to take a closer look at how technical processes to check the age of users online can be implemented.

At the European Union level, the Audiovisual Media Services Directive requires the adoption of appropriate measures to protect children from harmful content, including age verification. In addition, Article 8.2 of the EU General Data Protection Regulation implicitly establishes the need for some controllers to verify age by setting a minimum age requirement for children to be able to provide valid consent to the processing of their own data, in the context of the provision of information society services, where consent is the lawful basis for processing.

Age verification systems aim to protect those under 18 in the EU, who make up 18% of the bloc’s total population. According to research led by the Age Verification Providers Association, little to no age verification is implemented in most EU countries for goods, content or services and, where it is, self-declarations are used, which are far from reliable. Some countries have implemented age verification systems (e.g., Germany, where the Kommission für Jugendmedienschutz vetted and published a list of age verification vendors), while others have taken steps toward future implementation (e.g., the U.K.’s Draft Online Safety Bill and Draft Age Assurance (Minimum Standards) Bill).

On June 3, 2021, France’s data protection authority, the Commission nationale de l'informatique et des libertés, issued an opinion on the draft decree. The interplay between the decree and the CNIL’s opinion, which sets out certain privacy criteria, limits what companies can do in this area, given both those criteria and the technologies currently available to them.

The decree and the opinion are relevant for organizations that run websites and/or mobile applications offering adult content in France. The opinion and the CNIL’s broader guidance are also relevant to organizations offering other online services that the law or the organizations themselves deem unsuitable for children (such as online betting).

In short, providing a fully compliant age verification system is complex: such a system needs to respect the data minimization principle (the publisher should not collect personal data directly from users solely for age verification, and no biometric data should be collected) and must be accurate (self-declaration is not enough).

Context

The decree was published under powers granted by Article 23 of Law No. 2020-936, which aims to protect victims of domestic violence. Article 23 and the decree set out a scaled procedure, which applies when the operator/publisher of an online public communication service allows minors to have access to adult content.

The procedure is presented in the flowchart below. At each of these steps, the president of the Superior Audiovisual Council, known as the CSA, can act either on their own initiative or upon referral by the public prosecutor or any natural or legal person with an interest in the matter.

Scope of the obligation to implement a reliable technical process for age verification

Article 23 applies to any person whose activity is the publication of “online public communication services” (e.g., websites, mobile applications). In its opinion, the CNIL emphasized that the scope of Article 23 can extend to a broad range of sites that publish adult content, even when it is not their main activity.

However, according to the CNIL, a general obligation for people to identify themselves before visiting any site offering adult content is not justified by the legitimate purpose of protecting minors. Indeed, the CNIL highlights that being able to use online public communication services without having to identify oneself, or by using pseudonyms, contributes to the freedom of information and to the protection of users’ private lives.

Use of personal data: privacy criteria to comply with

In assessing whether minors have access to pornographic content or not, the president of the CSA is required to take into account the reliability of the technical process put in place (see Article 3 of the decree).

The CNIL recalls in its opinion that implementing technical processes to verify users’ age is likely to involve the processing of personal data, which would need to comply with the GDPR. Such technical processes would have to comply with Article 5.1c of the GDPR, meaning they must be proportionate to the purpose pursued. The CNIL further refers to the European Data Protection Board’s guidelines on consent, which point out that online service providers have an obligation to check age and parental consent and must make “reasonable efforts” to do so, “taking into account the technologies available.”

In its opinion, the CNIL sets out further criteria for a technical process for verifying age to be fully compliant with data protection laws:

A. No personal data should be collected directly from the users by the publisher solely for age verification purposes.

The CNIL notes such processing would be contrary to the GDPR as it would present significant risks for the data subjects since their sexual orientation — real or assumed — could be deduced from the content viewed and directly linked to their identity.

In any case, such a database would pose serious risks if it were compromised by a third party, with a very significant impact on the data subjects concerned.

B. Use of proof-of-age systems.

The CNIL notes it is difficult to reconcile data protection principles with any age control mechanism involving prior identification of minors.

These systems would be based on a trusted third party incorporating a double anonymity mechanism that prevents it from (i) identifying the website or application at stake and (ii) sharing users’ personal data with that website or application. The third party would also have to comply with all data protection rules, in particular by informing users of the risks and rights related to the processing of their data. A minimal sketch of what such a flow could look like follows the criteria below.

C. Data minimization principle should be paramount.

The CNIL regards the methods listed below as contrary to data protection rules:

    • The collection of official identification documents. This is due to the risk of identity theft in case of disclosure and misappropriation. In other words, it should be possible to prove your age online without having to disclose your full identity.
    • The use of systems designed to estimate a user’s age based on an analysis of the browsing history.
    • The collection of biometric data within the meaning of Article 9 of the GDPR. This is because consent could not be freely given, as it would be a condition of access to the content.
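
To make criterion B’s double anonymity idea more concrete, the sketch below shows one hypothetical way a proof-of-age flow could work: the trusted third party signs a bare “over 18” claim without learning which site it will be presented to, and the site verifies the signature without ever learning the user’s identity. The class names, token format and validity window are illustrative assumptions, not part of the decree, the CNIL opinion or any existing provider’s API; the example relies on the third-party Python "cryptography" package for Ed25519 signatures.

```python
# Illustrative sketch of a "double anonymity" proof-of-age flow (not any real provider's API).
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


class AgeVerificationProvider:
    """Trusted third party: checks age out of band, then signs a bare 'over 18' claim.

    It never learns which site the token will be shown to, and it retains no
    identity data once the check is done.
    """

    def __init__(self):
        self._signing_key = Ed25519PrivateKey.generate()

    @property
    def public_key(self):
        # Published (e.g., through a certification scheme) so any site can verify tokens.
        return self._signing_key.public_key()

    def issue_token(self, user_is_over_18: bool) -> bytes | None:
        # How the provider establishes age (document check, estimation, etc.) is out of
        # scope here; only the outcome is encoded, never the user's identity.
        if not user_is_over_18:
            return None
        claim = {"over_18": True, "iat": int(time.time()), "ttl": 300}
        payload = json.dumps(claim, sort_keys=True).encode()
        return payload + b"." + self._signing_key.sign(payload)


class AdultSite:
    """Publisher: verifies the signed claim without learning who the user is."""

    def __init__(self, provider_public_key):
        self._provider_public_key = provider_public_key

    def admit(self, token: bytes | None) -> bool:
        if not token:
            return False
        payload, _, signature = token.partition(b".")
        try:
            self._provider_public_key.verify(signature, payload)
        except InvalidSignature:
            return False
        claim = json.loads(payload)
        return claim.get("over_18") is True and time.time() < claim["iat"] + claim["ttl"]


if __name__ == "__main__":
    provider = AgeVerificationProvider()
    site = AdultSite(provider.public_key)        # the site only ever holds the public key
    token = provider.issue_token(user_is_over_18=True)
    print("admitted:", site.admit(token))        # True, yet no identity data changed hands
```

In a real deployment the verification keys would presumably be distributed through a certification scheme, and the user’s browser or app, not the site, would carry the token between the two parties, so neither side learns more than it needs to.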

Are there any enforcement actions?

Before the adoption of Article 23 and the decree, the associations e-Enfance and la Voix de l’enfance lodged a complaint with the Judicial Court of Paris, alleging that nine of the main adult content sites did not take sufficient measures to prevent minors from viewing their content and requesting that the court ask ISPs to block access to these sites.

On Oct. 8, 2021, the Judicial Court of Paris rejected their request because ISPs do not edit or control adult content and do not have to justify the lack of measures taken to prevent minors from having access to such content. In other words, the two child protection associations should have lodged their complaint against the publishers of the nine sites.

However, this complaint was lodged before the adoption of Article 23 and the decree. Now, the president of the CSA has the power to request such measures. Following the new procedure, the two associations would need to lodge their complaint with the president of the CSA.

On Dec. 13, 2021, the president of the CSA issued injunctions to the publishers of Pornhub, Tukif, xHamster, Xvideos and XNXX to take, within a 15-day period, any measures to prevent minors from accessing said websites. The president of the CSA found these websites were using a self-declaration process, which does not guarantee that only an adult audience can access the content.

Where does this leave us?

The CNIL claims third-party intervention with a double anonymization system may be one solution to verify age with a high degree of certainty and also protect children’s privacy, but the implementation of such a system might be complex.

In any case, in recommendation 7 of its June 2021 recommendations to strengthen the protection of minors online, the CNIL acknowledges that age verification is a complex matter and considers the solutions currently on offer mostly unsatisfactory. Age verification mechanisms on the market generally either collect excessive data (e.g., facial recognition) or are easily circumvented (e.g., self-declaration or verification by email).

Other solutions can be used if they respect the following principles: proportionality (e.g., a facial recognition mechanism would be disproportionate), minimization, robustness, simplicity, standardization and third-party intervention (as held in the CNIL’s opinion).

These questions are also being considered elsewhere. On Oct. 14, 2021, the U.K. Information Commissioner’s Office published an opinion on Age Assurance for the Children’s Code where it recommends following similar principles. This opinion came with a call for evidence on the use of age assurance, where the ICO sought evidence of age assurance technology, including details on existing or proposed age estimation approaches, novel approaches to age assurance, systems where data protection by design has been applied and the type of economic impact of age assurance approaches.

Furthermore, the euCONSENT consortium was awarded European Commission funding in April 2021 to create a child rights-centered, cross-border system for online age verification and parental consent. The project, which will run until the end of 2022, aims to demonstrate an interoperable technical infrastructure dedicated to the implementation of child protection mechanisms (such as age verification) and parental consent mechanisms, as required by relevant EU legislation.
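
To illustrate what interoperability between certified providers could mean in practice, here is a purely hypothetical sketch in which a relying site accepts an “over 18” assertion signed by any provider listed in a shared registry, so a check performed once can be reused across the network. The registry, provider IDs and assertion format are assumptions for illustration only and do not describe euCONSENT’s actual architecture.

```python
# Hypothetical sketch of cross-provider reuse of an age check via a shared trust registry.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# Imagined registry maintained by a certification scheme: provider ID -> verification key.
CERTIFIED_PROVIDERS: dict[str, Ed25519PublicKey] = {}


def register_provider(provider_id: str) -> Ed25519PrivateKey:
    """Enrol a provider in the network and publish its verification key."""
    key = Ed25519PrivateKey.generate()
    CERTIFIED_PROVIDERS[provider_id] = key.public_key()
    return key


def accept_assertion(provider_id: str, message: bytes, signature: bytes) -> bool:
    """A relying site accepts an 'over 18' assertion from any certified provider."""
    public_key = CERTIFIED_PROVIDERS.get(provider_id)
    if public_key is None:          # unknown or decertified provider
        return False
    try:
        public_key.verify(signature, message)
        return True
    except InvalidSignature:
        return False


if __name__ == "__main__":
    key_a = register_provider("provider-a")      # user proved their age to provider A once
    assertion = b"over_18"
    sig = key_a.sign(assertion)
    # Any site in the network can rely on that same check without repeating it.
    print(accept_assertion("provider-a", assertion, sig))   # True
    print(accept_assertion("provider-x", assertion, sig))   # False: not in the registry
```

One design consequence of such a registry is that decertifying a misbehaving provider is simply a matter of removing its key, after which its assertions are no longer accepted anywhere on the network.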




1 Comment

  • Iain Corby • Jan 26, 2022
    Thank you Mihnea for this excellent summary of a complex situation.
    
    As the director of the Age Verification Providers Association (www.avpassociation.com) and ex officio, project manager for www.euCONSENT.eu I wanted to add a couple of points if I may.
    
    CNIL's endorsement of the use of independent third parties to conduct age verification (AV) is welcome and by far the most straightforward way of applying age checks while complying with data protection legislation.  All forms of age check require the use of personal data, whether that is conventional identity documents or biometric data such as a "selfie" from which artificial intelligence can estimate age (with astonishing accuracy!).  But critically, there is no need to retain that PII once an age check is achieved, and AV providers can also simply reply "yes" or "no" to the website a user wishes to access when asked if that user is an appropriate age.  The website need never know the full identity of the user, and the AV provider does not record which site was making the enquiry - this is the double-blind protection described above.
    
    By not retaining any sensitive data, the risks from hacking are effectively mitigated.
    
    But AV providers do need to be closely regulated to ensure compliance, and prevent bad actors impersonating legitimate services to harvest personal data.
    
    We achieve that through certification (see https://ico.org.uk/for-organisations/age-check-certification-scheme-accs/) and by ensuring well-known platforms use certified providers, so the due diligence is done by the website not left to the consumer.
    
    euCONSENT creates a network of certified AV providers so once you prove your age to one, that same check is recognised by any other member of the network.  Not only does this massively reduce the inconvenience to the user, but because there is an audit process to join the network, it further embeds the safeguards imposed by certification schemes.  (The architecture mirrors the eIDAS solution redirecting users to the provider of their original age check to release it for re-use.)
    
    This squares the circle of delivering rigorous age checks without compromising privacy and data security, keeping both Arcom and CNIL happy.