Following the European Data Protection Board's dispute resolution decision, Ireland's Data Protection Commission in September 2023 adopted its final decision against TikTok Technology Limited (TTL) concerning the company's processing of children's personal data.
The findings build on many positions established in the DPC's September 2022 decision concerning Instagram's processing of children's personal data. For example, regarding transparency information for child users, the DPC found that stating "people" might be able to contact a user did not sufficiently distinguish between other Instagram users and the world at large.
Similarly, in its decision against TikTok, the DPC found the use of certain terms — including "everyone" and "anyone" — to describe who could see a user's account was not sufficiently clear as to whether that meant all registered TikTok users or anyone who could access the platform. The largest component of the total 345 million euro fine against TikTok — 180 million euros — was for this transparency infringement.
The decision also reflects the prior determination in the Instagram decision that making children's accounts public by default is inconsistent with the EU General Data Protection Regulation's data protection by design and default obligations.
However, the decision is novel from a pan-European/EDPB perspective insofar as it is the first to examine age-verification measures against the backdrop of the GDPR. While the EDPB's dispute resolution procedure, somewhat unusually, directed the DPC to reach an inconclusive outcome on this point, there are some important markers digital services with a mixed user population should note, as they may indicate future regulatory approaches to age verification.
Other key takeaways:
- The decision reaffirms the strong focus of European regulators, and of the DPC in particular as a leader in this area within the bloc, on children's data. This is a topic we expect to feature increasingly often in regulatory investigations and enforcement decisions.
- The DPC's findings regarding the risks to children from the processing of their data are informative of how the DPC will expect organizations to assess such risks in relation to their own processing operations.
- Finally, the decision also signals the EDPB's willingness to use the fairness principle to bolt on additional findings of infringement at the dispute resolution stage, even where the lead supervisory authority's investigation did not include such an issue within its scope.
Brief background
The decision is the result of an own-volition inquiry launched by the DPC in September 2021. The inquiry covered solely the period from 31 July to 31 Dec. 2020. Since then, TTL has made several service modifications addressing most of the criticisms in the decision.
TikTok's terms do not allow users under age 13 to use the platform. The decision focuses on the processing of personal data relating to users aged 13-17, but also examines TTL's compliance regarding personal data of children under 13 in the context of the company's age-verification measures.
As mentioned, the case went through the GDPR's dispute resolution mechanism under Article 65. While there was general consensus on the DPC's proposed findings in its draft decision, objections were raised by the Italian and Berlin supervisory authorities. Although these objections represented a minority view within the EDPB, even a single unresolved objection is enough to trigger the Article 65 process, and the objections were therefore referred to the EDPB for determination.
The EDPB adopted its binding decision on these objections 2 Aug. 2023, requiring the DPC to include a new finding of infringement of the fairness principle and an order to bring the relevant processing operations into compliance, and to amend its draft conclusion on whether TTL's age-verification measures complied with the GDPR.
It should be emphasized that the period relevant to the DPC's inquiry pre-dated the DPC's guidance on children's data, The Fundamentals for a Child-Oriented Approach to Data Processing. The decision therefore assesses TTL's compliance by reference to the GDPR itself and does not refer to the Fundamentals, although the DPC is careful to clarify that the Fundamentals introduce "child-specific data protection interpretative principles" and that it remains permissible to refer to principles derived from the GDPR.
Public-by-default settings
TikTok is available as a mobile app and through a web version. Registration is required for the mobile app, but the web version can also be accessed, with limited functionality, by nonregistered users, who can view certain content on a user's profile page or on the TikTok "For You" homepage. Such access to content by nonregistered users had an important bearing on the DPC's findings.
During the period relevant to the DPC's decision, child users were presented with a pop-up notification at registration where they could select "Go Private," to make their account private, or "Skip," to continue with a public account. The pop-up explained that with a private account only approved followers could view the user's content on TikTok, while public accounts were viewable by anyone. Contrary to TTL's position, the DPC considered that the "Go Private" and "Skip" options amounted to public-by-default processing, because selecting "Skip" meant that, without purposefully opting for a private account, the account would be public.
When a user with a public account sought to publish a public video, a pop-up notification explained the implications of doing so, giving the user the option to "Cancel" or "Post now."
The DPC considered the public-by-default account setting also had a "series of cascading implications" for other platform settings, enabling features such as public video posting by default despite the existence of more granular settings. The DPC also considered that allowing any registered user, whether child or adult, to post comments on a child user's content by default increased the potential for abuse by bad actors.
Consequently, this processing posed several risks to child users, including loss of autonomy and control over their data, and the risk of becoming targets for bad actors.
The DPC thus took the view that TTL did not implement measures to ensure the content of child users was not made accessible by default, without the user's intervention, to an indefinite number of individuals, contrary to Article 25(1) and (2) of the GDPR and the data minimization principle under Article 5(1)(c).
The DPC also concluded that, by implementing a public-by-default setting and expecting child users to have sufficient technical knowledge to change it, TTL created conditions in which unnecessary publication of child users' content might occur, for instance, more extensive processing of content than the user intended. Consequently, TTL did not properly account for the associated risks to child users, nor did it implement appropriate technical and organizational measures to ensure GDPR compliance, contrary to the controller's responsibilities under Article 24 of the GDPR.
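To make the data protection by design and default point more concrete, the following minimal sketch, using hypothetical setting names rather than TikTok's actual configuration, shows the kind of private-by-default starting point for child accounts the decision contemplates: the most protective options apply automatically, and any relaxation requires a deliberate choice by the user.

```python
# Minimal sketch, not TikTok's code: hypothetical setting names used purely to
# illustrate "data protection by default" for accounts identified as belonging
# to children. The protective options are the starting state; relaxing them
# requires a deliberate change by the user.
from dataclasses import dataclass


@dataclass
class AccountSettings:
    account_public: bool
    comments_allowed_from: str  # "no_one", "friends" or "everyone"
    videos_public_by_default: bool


def default_settings(age: int) -> AccountSettings:
    """Return the starting settings applied to a newly registered account."""
    if age < 18:
        # Child user: private account, comments limited to approved friends,
        # and no public posting unless the user later changes these settings.
        return AccountSettings(
            account_public=False,
            comments_allowed_from="friends",
            videos_public_by_default=False,
        )
    # Adult user: a more open starting point is a design choice, subject to
    # the same GDPR principles.
    return AccountSettings(
        account_public=True,
        comments_allowed_from="everyone",
        videos_public_by_default=True,
    )
```

The GDPR does not prescribe particular settings; the DPC's point is that the protective configuration should be the starting state, rather than something a child user must know how to switch on.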
TTL pointed to measures it had in place to mitigate the identified risks and submitted that the purpose of the platform, a global community where people can create and share, and the related context would have informed child users' expectations. The DPC did not accept that these factors would ameliorate the risks to child users, on the basis that they did not prevent contact via public comments, nor could they prevent nonregistered users from accessing child users' content on the TikTok website.
Family pairing and direct messaging
The "Family Pairing" feature gave certain parental-type controls over the child user's account to another user. Notably, the DPC acknowledged that, in general, the "Family Pairing" options allowed the paired account user to make privacy settings more strict for the child user's account — by narrowing available content, disabling search and direct messages, making the account private and limiting comments.
However, the paired user could also enable direct messages for child users over the age of 16 (although, based on TTL's submissions quoted in the decision, only with regard to "Friends") even where the child user had themselves switched off this feature. To pair the accounts, a QR code was generated for the nonchild user, which the child user then had to scan before confirming whether they wished the accounts to be linked. The DPC took the view that, despite this process, there was no verification of the relationship between the two users.
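For illustration, the pairing flow described above might look roughly like the sketch below. The function and field names are hypothetical and this is not TTL's implementation; it simply mirrors the flow the decision describes, in which the child's confirmation of the scanned code is the only check and the claimed parent or guardian relationship is never verified.

```python
# Minimal sketch, not TikTok's implementation: hypothetical names illustrating
# the pairing flow described in the decision. A one-time code is generated for
# the adult account (presented as a QR code), the child scans it and confirms
# the link. Nothing in the flow verifies that the adult is actually the
# child's parent or guardian, which is the gap the DPC identified.
import secrets

pending_pairings: dict[str, str] = {}  # pairing token -> adult account ID


def generate_pairing_code(adult_account_id: str) -> str:
    """Step 1: create a one-time token, shown to the adult user as a QR code."""
    token = secrets.token_urlsafe(16)
    pending_pairings[token] = adult_account_id
    return token


def link_accounts(adult_account_id: str, child_account_id: str) -> None:
    """Placeholder for whatever account-linking step a real service performs."""
    print(f"Paired {adult_account_id} with {child_account_id}")


def confirm_pairing(token: str, child_account_id: str, child_confirms: bool) -> bool:
    """Step 2: the child scans the code and confirms (or declines) the link."""
    adult_account_id = pending_pairings.pop(token, None)
    if adult_account_id is None or not child_confirms:
        return False
    # The link is created on the child's confirmation alone; the relationship
    # between the two users is never verified.
    link_accounts(adult_account_id, child_account_id)
    return True
```

The DPC's criticism attaches to the gap flagged in the comments: confirmation by the child establishes agreement to the pairing, but not that the paired user is in fact a parent or guardian.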
The DPC considered that allowing a user who was not a verified parent or guardian to enable direct messages in this way for child users over age 16 posed risks. It enabled third parties to contact the child user and thereby amounted to unauthorized processing of the child's personal data, since the child had not chosen to have their data processed in this manner.
On this basis, the DPC concluded that TTL failed to apply appropriate technical and organizational measures to effectively implement the integrity and confidentiality principle and to integrate safeguards to meet GDPR requirements.
This finding again demonstrates the heightened risk the DPC associates with the ability to contact children directly, whether through the comments function or via direct messaging.
Transparency: What does 'anyone' mean?
TTL had in place several transparency resources including "just-in-time" notifications, its Privacy Policy and a Summary of the Privacy Policy for users under 18. However, the DPC determined these resources did not inform users that their public accounts were viewable through the TikTok website by an indefinite number of persons, including nonregistered users.
The DPC regarded references in TTL's resources to "public," "everyone," and "anyone" as ambiguous. These terms could refer to both registered and nonregistered users and TTL had failed to specify this distinction, although, according to the DPC, it could have easily done so. On this basis, the DPC considered that TTL did not provide child users with appropriate information on the categories of recipients of their personal data, in violation of Article 13(1)(e) of the GDPR.
Additionally, the DPC found TTL did not meet the requirements of the GDPR's Article 12(1) to provide information in a concise, transparent, intelligible and easily accessible form, using clear and plain language. However, the DPC considered that these "information deficits" were not sufficiently extensive to infringe the general transparency principle under Article 5(1)(a). As noted, the largest individual fine imposed under the decision related to transparency infringements, reflecting the DPC's statement that it considered them to be the most grave of the infringements.
Age verification
TikTok had age-verification measures in place to prevent users under 13 from accessing the platform. These consisted of an age gate requesting the user's birth date, along with technical measures to prevent users from resubmitting an older age and ex-post measures to identify and remove accounts of underage users.
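For readers less familiar with such mechanisms, the sketch below illustrates, in simplified form, one way a neutral age gate with a resubmission control might work. The names used (MINIMUM_AGE, first_submission_by_device and so on) are assumptions for illustration and do not reflect TTL's actual implementation.

```python
# Minimal sketch, not TikTok's implementation: hypothetical names illustrating
# a neutral age gate. The user is asked for a birth date, sign-ups under 13
# are refused, and the first birth date submitted from a device is remembered
# so a simple retry with an older age is ignored.
from datetime import date

MINIMUM_AGE = 13
first_submission_by_device: dict[str, date] = {}  # device ID -> first birth date entered


def age_on(birth_date: date, today: date) -> int:
    """Whole years between birth_date and today."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years


def passes_age_gate(device_id: str, entered_birth_date: date, today: date) -> bool:
    """Accept the sign-up only if the first birth date entered on this device
    corresponds to an age of at least MINIMUM_AGE."""
    birth_date = first_submission_by_device.setdefault(device_id, entered_birth_date)
    return age_on(birth_date, today) >= MINIMUM_AGE
```

Even in this simplified form, the limits the EDPB later points to are visible: anything keyed to a device can be defeated by switching devices or clearing state, and nothing stops a user from entering a false birth date at the first attempt.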
TTL's data protection impact assessment on age-appropriate design did not identify the specific risks of users under age 13 accessing the platform, or the further risks arising from this. The DPC viewed this as a failure to implement appropriate measures to ensure and demonstrate compliance with the GDPR, contrary to Article 24(1). This is an important indicator of the regulatory expectation that digital services with minimum user ages must still account for risks to users below the permitted minimum age, including via DPIAs.
The DPC proposed in its draft decision to find that TTL's age-verification measures otherwise complied with the GDPR. Following an objection from the Italian supervisory authority, the EDPB analyzed the issue, concluded there was insufficient information to conclusively assess TTL's compliance and instructed the DPC to amend its finding accordingly.
As such, the DPC's decision included a statement to the effect that it could not be concluded that the age-verification measures deployed by TikTok infringed the GDPR. In other words, the positive statement in the draft decision expressing the DPC's view that TikTok had complied with Article 25 in this regard was removed at the direction of the EDPB.
The decision also contains a comment on requiring hard identifiers as a method of age verification. The DPC accepted TTL's contention that this would be a disproportionate measure. In the DPC's view, because children are unlikely to hold or have access to hard identifiers, such a requirement would exclude or lock out child users who would otherwise be able to use the platform.
EDPB's view on age verification
For organizations whose services impose minimum user ages but in practice have mixed populations of adult and child users, the portion of the decision relating to age verification is potentially the most significant. This is because the EDPB carried out a lengthy analysis of TTL's age-verification measures, taking into account, as required by Article 25(1) of the GDPR, the nature, scope, context and purposes of processing, the risks to child users, the state of the art and the costs of implementation.
In its analysis, the EDPB pointed out that, for the technical and organizational measures required under Article 25 to be "appropriate," they must be effective, which in turn requires a degree of robustness. The EDPB expressed serious doubts about the effectiveness of TTL's neutral age gate as an age-verification solution given the high risk of the processing, noting that:
- The age gate could be easily circumvented.
- Presenting the age gate in a neutral manner does not itself sufficiently discourage users from entering an incorrect birth date.
- Once a method of circumvention is known, it can easily be shared with peers.
- Since TikTok was rated 12+ in Apple's App Store, users could easily infer they had to enter a birth date indicating an age above 12 to use the platform.
Similarly, the EDPB expressed doubts about the effectiveness of TTL's ex-post measures to identify and remove users under age 13 from the platform. Despite these concerns, the EDPB considered it did not have sufficient information to conclusively assess the state-of-the-art element of TTL's age-verification measures and, as such, could not conclusively assess TTL's compliance with data protection by design.
While the EDPB's decision does not explain why it felt there was not enough information to reach a conclusion here, it is worth noting that its analysis concerned the period between 31 July and 31 Dec. 2020, and the need to carry out a historical examination some three years after the fact may have been a factor.
Also notable is the EDPB's view that the appropriateness of age-verification measures changes regularly, given its link to the state of the art and the associated risks, and that a controller must periodically review whether such measures remain appropriate.
Overall, though, controllers should not read the lack of an infringement finding concerning the use of the age gate in this case as a green light to use this means of age verification.
Fairness and design choices
The last finding in the DPC's decision, regarding the infringement of the fairness principle, was not a finding originally proposed by the DPC. It was instead mandated by the EDPB's binding decision and is the result of an objection raised by the Berlin supervisory authority on behalf of both itself and the Baden-Württemberg supervisory authority.
More specifically, the EDPB concluded that the design of the "Registration Pop-Up," with its "Go Private" and "Skip" options, and the "Video Posting Pop-Up," with its "Cancel" and "Post Now" options, nudged users toward a certain decision, "leading them subconsciously to decisions violating their privacy interests." The EDPB took into account the chosen language, sharing the DPC's view that the word "Skip" seemed to incentivize users to bypass, and even to trivialize, the decision to opt for a private account, which it regarded as evidence of nudging. It also considered the placement of the "Skip" and "Post Now" buttons on the right-hand side of the screen, where users are accustomed to clicking to continue and would therefore be led to choose that option, as well as the different color gradient of each option: light gray for "Cancel" and black for "Post Now."
The EDPB's direction in its binding decision on this point, requiring the DPC to insert a finding of an infringement of the fairness principle, demonstrates the EDPB's propensity to use the general principles provision in Article 5(1)(a) as a route for finding additional umbrella-type infringements, even where the lead supervisory authority's investigation did not include such an issue within its scope.
Corrective powers
With regard to the above infringements, the decision exercises the following corrective powers:
- A reprimand.
- An order to bring TTL's processing into compliance with the GDPR within three months, to the extent (if any) that TTL is conducting ongoing processing operations as described in the decision. TTL made several service modifications, both during and after the relevant period, which the DPC also considered a mitigating factor.
- Three administrative fines totaling 345 million euros, as follows: A fine of 100 million euros for TTL's infringement of Articles 5(1)(c) and 25(1), (2), regarding the public-by-default account setting; a fine of 65 million euros for the infringement of Articles 5(1)(f) and 25(1) regarding the "Family Pairing" feature; and a fine of 180 million euros for infringement of Articles 12(1) and 13(1) regarding transparency.
The DPC did not impose a fine for infringements of Article 24, regarding the public-by-default setting and TTL's age-verification measures, given that the GDPR does not provide for an administrative fine for this infringement. The DPC also did not impose a fine for the infringement of the fairness principle — although this was requested by the Berlin supervisory authority in its objection. Instead, TTL was ordered to bring its processing into compliance.
Finally, it should be noted that TTL has appealed the DPC's decision to the Irish High Court and has also issued annulment proceedings before the Court of Justice of the European Union against the EDPB in relation to its binding decision.