On Sept. 2, Ireland’s Data Protection Commission (DPC) issued a landmark decision concerning the processing of children’s data in connection with the Instagram social media service. The decision, issued to Meta Platforms Ireland Limited (Facebook Ireland Limited at the time of commencement of this inquiry) as the controller in question, found multiple infringements of the EU General Data Protection Regulation (GDPR) across legal basis, data minimization, transparency, data protection by design and default, controller responsibility and data protection impact assessment (DPIA) obligations.
Much attention has been focused on the size of the fine imposed on Meta: a whopping 405 million euros, the highest fine imposed by the DPC and the second highest ever under the GDPR. However, while the decision represents a watershed moment, that is arguably due to far more than its (potential) financial consequences for the controller (note Meta has indicated it will appeal).
Firstly, it marks a critical staging post on the evolving map for protecting children as users of digital services: it is the first cross-border processing case under the GDPR in which all EU/EEA data protection authorities were involved in the Article 60 co-decision-making process. It also sends a clear signal that the DPC is leading the way among DPAs in placing children’s privacy at the top of the agenda.
Secondly, the scale of the fine and the attendant global publicity around the decision will inevitably cause other digital companies processing children’s data to reassess their own practices in light of the clear markers from the DPC about how the “specific protection” children deserve (Recital 38 GDPR) should be given effect in the design of an online service.
Thirdly, and not least, the DPC’s pursuit of this inquiry has had real-life impact for child users of Instagram: during the course of the inquiry, the service implemented key changes to the account settings under examination by the DPC, including giving child users the option to choose between a public or private account. The DPC also directed a range of remedial actions in the decision, some of which Instagram had already anticipated; once implemented, they will further enhance protections for child users on the Instagram service, and indeed on any other digital service that takes this decision as its cue to reconfigure user settings in light of the DPC’s findings.
The spotlighted issues
The decision focused on two distinct types of processing carried out in connection with the provision of the Instagram service: (1) the processing of child users’ contact information (phone numbers and email addresses), which was automatically made public to “the world at large” (and was also visible in the HTML source code of profile webpages for business accounts) when a child user switched from a regular to a business Instagram account; and (2) the public-by-default setting applied to children’s Instagram accounts generally.
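To make the HTML source point concrete: when contact details are interpolated server-side into a profile page’s markup, they are readable by anyone who views the page source or scrapes it, regardless of how the page renders on screen. The following is a simplified, hypothetical sketch of that pattern, not Instagram’s actual code:

```python
# Hypothetical illustration only -- not Instagram's actual implementation.
# A server-side template that interpolates contact fields directly into the
# page markup exposes them to anyone who views the page source or scrapes it,
# even if the rendered UI never displays them prominently.

PROFILE_TEMPLATE = """
<html>
  <body>
    <h1>{display_name}</h1>
    <!-- These fields end up verbatim in the HTML served to every visitor: -->
    <a href="mailto:{email}">Contact</a>
    <a href="tel:{phone}">Call</a>
  </body>
</html>
"""

def render_business_profile(display_name: str, email: str, phone: str) -> str:
    # Every visitor (and every scraper) receives the raw contact details.
    return PROFILE_TEMPLATE.format(
        display_name=display_name, email=email, phone=phone
    )

print(render_business_profile("example_teen", "teen@example.com", "+353000000"))
```

The remediation is equally simple in principle: omit such fields from the served markup entirely unless the user has affirmatively chosen to publish them.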
While the decision is focused solely on processing of children’s personal data, it also provides a useful blueprint for controllers on the DPC’s expectations around accountability issues such as legitimate interest assessments, DPIAs and measures to mitigate risks and ensure and demonstrate GDPR compliance. The decision also illustrates the rigorous, evidence-based approach by which the DPC will assess risks to users and measures to counteract those risks.
Legal basis — a narrowing of the parameters for processing children’s data?
Aside from the other aspects of the decision discussed below, the final position adopted in the decision on the legal basis for the processing of children’s data (here, the publication of contact details) will no doubt be a source of consternation for many organizations that also process children’s data.
Meta had relied on Article 6(1)(b), processing necessary for the performance of a contract, in those EU/EEA member states where national laws allowed children to enter into contracts, and, as an alternative, Article 6(1)(f) (legitimate interests) where national laws prevented this. The DPC, having carried out a detailed analysis, accepted the legitimacy of both positions but was ultimately overruled by the European Data Protection Board at the dispute resolution (Article 65) stage. (This process had been triggered by the DPC because it was unable to reach a consensus position on objections raised on this and other points by certain DPAs.)
In particular, the EDPB adopted an expansive approach to its assessment of how reliance on a given legal basis should be analyzed, applying an arguably onerous interpretation of what could be said to be “necessary” and finding that data minimization (amongst other factors) should be taken into account. Because the DPC had no choice under the GDPR but to incorporate the EDPB’s determination on the disputed issues, this approach formed part of the DPC’s final decision, necessitating findings that neither legal basis could be relied on and that the processing therefore infringed Article 6(1).
However, the EDPB’s approach to the interpretation of both legal bases seems to further restrict the circumstances in which they can be relied on, particularly in connection with the processing of children’s data, and will be concerning to controllers in this sphere.
Transparency — no room for nuance
The weight the DPC attributes to providing transparency to child users is underscored in this case by the fact that the largest components of the total fine, together amounting to 195 million euros, related to the transparency infringements.
The DPC identified a range of transparency deficiencies under Articles 5 and 12 relating to both the public-by-default setting for children’s accounts (which applied up to March 2021, when the option was introduced for child users to choose between private and public accounts) and the publication of contact details for business accounts (which applied up to September 2019, when an option to refuse the publication of contact details was introduced). In connection with the former, the DPC pointed to an obligation on a controller to provide information on the purposes of processing where an account is set to public, noting the lack of explanation during the user registration process (prior to the changes made to Instagram’s transparency information in 2021) around the difference between public and private accounts, and the fact that accounts would be public by default.
While information on this was provided elsewhere (including via links from the Data Policy), the fact that it appeared neither in the Data Policy itself nor at the registration stage meant that, during the period in question, Meta had not complied with the requirement in Article 12(1) for such information to be provided in a clear and transparent form for child users.
In connection with the publication of contact details for business accounts, the DPC noted that children might not be aware that the consequence of switching from a regular to a business account was the resulting publication of contact details, which had previously been collected only as part of the account registration process. In finding that Meta had not complied with its obligation to inform child users of the recipients of their personal data (i.e., the world at large), the DPC made clear that transparency information directed at children must be blunt and spelled out clearly; there is no room for nuance in the information provided.
For example, the DPC’s view was that stating that “people” might be able to contact a user did not sufficiently distinguish between other Instagram users and the world at large; there was no concrete indication to the child user that their contact details would be displayed to the world at large, whereas the later amendment of the phrasing to “[t]his information will be displayed on your profile publicly so people can contact you” was sufficient.
In addition, Meta’s failure to alert existing child users of business accounts to an opt-out (introduced in September 2019) from the mandatory publication of contact details infringed Article 5(1)(a), as it was neither fair nor transparent.
Risk-related obligations and data protection by design and default
As a precursor to its assessment of whether Meta had complied with its obligations to assess and address the risks to child users arising from the two types of processing, the DPC undertook an in-depth assessment of the nature, scope, context and purpose of the processing. In both scenarios, the DPC found that the processing posed high risks to the rights and freedoms of child users. In particular, the DPC, referring to a broad range of external sources, pointed to the risks to child users of communications from dangerous individuals, fraud and impersonation, while noting also that certain features of Instagram business accounts (analytics around content viewing and the age and location of followers, which would potentially be attractive to children) incentivized child users to switch from regular to business accounts.
The clear message here is that organizations need to be mindful in their service/product design about features that will be especially appealing to child users and their associated risks.
In light of the high risks to child users the DPC identified from the processing, and the fact that more information than necessary was processed for the purposes in question, the DPC concluded that Meta had infringed its obligations in relation to data protection by design and default (Article 25) and data minimization (Article 5(1)(c)). As Meta had not conducted a DPIA in relation to these types of processing, this constituted an infringement of Article 35. The lack of a DPIA was also a factor in the finding that Meta had not complied with the obligation under Article 24 to take account of the risks of varying likelihood and severity arising from the processing, despite certain technical and organizational measures being in place (e.g., measures promoting safety and preventing bullying on the platform).
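For engineering teams wondering what data protection by design and default can look like in practice, the findings suggest one concrete pattern: encode the protective choice as the default and require an affirmative user action to relax it. The following is a minimal sketch of that pattern; all names are hypothetical, the single age threshold of 18 is assumed for illustration only, and nothing here reflects Meta’s actual systems:

```python
# Minimal sketch of privacy-by-default account settings keyed to age.
# All names are hypothetical; this does not reflect Meta's implementation.
from dataclasses import dataclass

ADULT_AGE = 18  # illustrative assumption; the relevant age varies by member state

@dataclass
class AccountSettings:
    is_private: bool
    publish_contact_details: bool

def default_settings(age: int) -> AccountSettings:
    # Article 25(2) requires that, by default, personal data is not made
    # accessible to an indefinite number of people without the individual's
    # intervention -- so the child-protective setting is the starting point.
    if age < ADULT_AGE:
        return AccountSettings(is_private=True, publish_contact_details=False)
    # Even for adults, publishing contact details stays opt-in.
    return AccountSettings(is_private=False, publish_contact_details=False)

def switch_to_business_account(settings: AccountSettings) -> AccountSettings:
    # Switching account type must not silently widen disclosure:
    # contact details remain unpublished until the user opts in.
    return AccountSettings(
        is_private=settings.is_private,
        publish_contact_details=settings.publish_contact_details,
    )

print(default_settings(15))  # AccountSettings(is_private=True, publish_contact_details=False)
```

The design point is that changing account type carries the user’s existing choices forward rather than resetting them to a more public state, which is precisely the silent widening of disclosure the decision censured.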
More to come
This decision, while the first, will by no means be the last of its kind.
News that a further children-related draft decision had been submitted by the DPC to the Article 60 co-decision-making process, this one concerning TikTok, followed hot on the heels of this decision. In the meantime, there is much to digest in the Instagram decision, not only for privacy professionals but also, importantly, for the digital engineering and design teams who are grappling with how to give practical effect to the specific protection for child users called for by the GDPR.
Anna Morgan is slated to discuss children’s privacy at the upcoming IAPP Europe Data Protection Congress 2022.