It is summer recess here in Washington, D.C., the only legislative break where work truly seems to stop at all levels of the Hill. The rest of us "beltway insiders" persevere, but with fewer causes for excitement or distraction.

Today we are under a tornado watch as we wait for much-needed rain from Hurricane Debby. Though we are often told to watch for them, there hasn't been a tornado in D.C. for almost 30 years.

Almost as inexorable as the arrival of extreme August weather was the announcement of a new lawsuit against TikTok and its related companies, filed by the U.S. Department of Justice on behalf of the Federal Trade Commission.

The new complaint is notable for the allegations it makes under the Children's Online Privacy Protection Act — but also for the allegations the DOJ has chosen not to make. As if appreciating a classical landscape painting, let's examine the negative space first.

Politico reports that when the FTC originally sent this TikTok complaint to the DOJ for litigation, it included allegations of deception related to the company's cross-border data practices. Despite considerable skepticism from politicians and regulators, TikTok continues to maintain that the employees of its related companies in China do not have access to the personal data of Americans. Reportedly, the missing claim would have alleged that these statements are deceptive.

The DOJ may have chosen to streamline the complaint for several reasons. For one, it is already engaged in litigation with TikTok related to its foreign ownership after the company brought a constitutional challenge against the divestiture law passed earlier this year. The DOJ may wish to avoid muddying the waters with overlapping lawsuits. Or, like any good litigator, the DOJ may have narrowed the scope for strategic reasons, given the expense and difficulty of winning a consumer protection case in court.

In fact, there are no general consumer protection claims in the complaint. Rather than arguing for deception under Section 5 of the FTC Act, the complaint focuses on alleged violations of the COPPA, the COPPA Rule and TikTok’s existing consent agreement with the FTC, which was based on children’s privacy missteps made by its predecessor in the U.S. market, Musical.ly.

There are many privacy lessons in the TikTok complaint. None are particularly surprising, but the existence of this lawsuit against one of the most popular apps on the market today shows that it’s time for a review.

A 'Kids Mode' must not collect or share personal information without parental consent

Like many apps that target all ages, TikTok offers a plain vanilla version of its app to those users who do not pass the age gate. However, the lawsuit alleges that TikTok for Younger Users, also known as Kids Mode, is not quite vanilla enough to satisfy the COPPA. It alleges TikTok collected "several varieties of personal information from children with Kids Mode accounts" without parental consent and used this data in ways the COPPA Rule prohibits.

In addition to the birthday, username and password necessary to create the Kids Mode account, the complaint alleges the company collected IP addresses and unique device identifiers as well as "dozens of other types of information concerning child users with Kids Mode accounts — including app activity data, device information, mobile carrier information, and app information — which they combine with persistent identifiers and use to amass profiles on children."

Persistent identifiers can be collected from children without parental consent, but only if used for the sole purpose of providing support for internal operations, which the complaint alleges was not the case here. In its proposed updates to the COPPA Rule, the FTC plans to keep this exception as is, allowing for activities such as ad attribution and some personalization, but not behavioral profiling or amassing profiles.

A particularly damning allegation relates to the sharing of persistent identifiers with third parties for retargeting purposes, used to encourage "less active users" to return to the platform via advertising on other platforms, even for those in Kids Mode. Though the complaint alleges this practice stopped in mid-2020, it does not look good at a time when policymakers are heavily scrutinizing screentime. In its proposed COPPA Rule updates, the FTC would explicitly exclude engagement-maximizing practices from the persistent identifier exception.

Age gates should be one-and-done, and not avoidable through OAuth integrations

TikTok presents a simple birthdate prompt to sort users into those older and younger than 13. The complaint takes issue with the fact that “until at least late 2020” this age gate allowed users to try again, entering an older birthdate even if they had previously indicated they were under 13 and had been directed into the Kids Mode account creation flow.

It also alleges that TikTok enabled users to bypass the age gate by creating accounts with third-party logins, including Instagram and Google, which the company then allegedly treated as "age unknown" accounts regardless of any age status known to the third-party app. In 2022, TikTok closed this latter "loophole."
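The "one-and-done" principle behind these allegations can be sketched as an age gate that persists an under-13 determination per device, so that neither a retried birthdate nor a third-party login with no age signal can override it. This is a hypothetical illustration, not TikTok's actual implementation; all names and the per-device store are assumptions.

```python
from datetime import date

# Hypothetical per-device record of prior under-13 determinations.
UNDER_13_FLAGS: dict[str, bool] = {}

def passes_age_gate(device_id: str, birthdate: date, today: date) -> bool:
    """Return True if this sign-up may proceed to full account creation.

    Once a device has reported an under-13 birthdate, that determination
    sticks: later attempts with an older birthdate are still routed to
    the restricted (Kids Mode) experience.
    """
    if UNDER_13_FLAGS.get(device_id):
        return False  # prior under-13 determination is one-and-done
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    if age < 13:
        UNDER_13_FLAGS[device_id] = True
        return False
    return True

def passes_oauth_signup(device_id: str) -> bool:
    """A third-party login carries no verified age, so it must not be
    allowed to bypass a prior under-13 determination for the device."""
    return not UNDER_13_FLAGS.get(device_id)
```

The design point is simply that the under-13 flag, once set, is consulted on every sign-up path, including social logins, rather than only on the birthdate form.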

Contact information collected from kids must be deleted when not necessary

The complaint alleges that TikTok enabled younger users to submit in-app problem reports to the company that included a field for an email address. Without notifying parents, the company allegedly retained this additional contact information from kids after processing the problem reports. As the FTC's Xbox settlement shows, the COPPA Rule requires such information to be deleted once it is no longer reasonably necessary to fulfill the purpose for which it was collected.

Account deletion processes must be simple and effective

The complaint alleges that, "as recently as 2023," TikTok maintained an "unreasonable" process for finding information on deleting accounts and for requesting account deletion. Focusing on parents who discover their children have created an unauthorized TikTok account and wish to request deletion, the complaint states that it was necessary for them to "scroll through multiple webpages to find and click on a series of links and menu options that gave no clear indication they apply to such a request. Parents then had to explain in a text box that they are a parent who wanted their child’s account and data to be deleted." At other times, parents were told to send an email, but the company allegedly "failed to respond in a timely manner to these requests, or simply failed to respond to them at all."

In addition, the deletion requests were only honored when staff determined the account holder was underage based on "objective indicators" like an explicit admission of age or if parents completed a form and signed an affidavit affirming they were the parents of the account holder.

In fairness, this process seems designed to reduce fraudulent account deletion requests in a situation where parental relationships are not tracked. But the complaint maintains the company's policies and practices "subverted parents' efforts to delete their children's accounts" and resulted in the retention of children’s personal information after the company was informed of the account holder's age.

The process to delete should be simple. Once a company has actual knowledge that a user is a child, COPPA compliance bells should be ringing and processes should kick into place to achieve compliance. Deletion must also be effective and complete. Numerous additional allegations in the complaint focus on ineffective deletion practices, including retained app activity logs, insufficient documentation of systems where other data was retained, and photos and sound recordings from known children appearing in other users' videos, even after account deletion.
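One way to make deletion "effective and complete" in the sense described above is to treat it as a cascade across a documented inventory of every store that may hold the child's data: accounts, activity logs, retained media and so on. The sketch below is hypothetical; the store names and structure are assumptions for illustration.

```python
def delete_everywhere(user_id: str, stores: dict[str, dict]) -> list[str]:
    """Delete a user's records from every documented data store.

    Maintaining the `stores` inventory is itself a compliance aid:
    deletion can only be complete if every system holding personal
    data is known and enumerated. Returns the names of the stores
    that actually held data, for audit logging.
    """
    touched = []
    for name, store in stores.items():
        if user_id in store:
            del store[user_id]
            touched.append(name)
    return touched
```

A real system would also need to reach backups, third-party processors and derived data (such as the other-users'-videos problem the complaint describes), which a simple keyed delete cannot cover.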

It is equally important to prevent future sign-ups from known child users. The complaint alleges that TikTok until recently did not have processes to prevent child users from re-creating accounts "with the same device, persistent identifiers, and email address or phone number as before."
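The re-signup gap can be closed the same way: when an account is deleted because the holder is known to be a child, the associated device identifiers and contact details can be recorded and checked on every new sign-up. Again, the names below are hypothetical, not a description of TikTok's systems.

```python
# Hypothetical record of identifiers tied to deleted child accounts.
DELETED_CHILD_IDENTIFIERS: set[str] = set()

def record_child_deletion(device_id: str, persistent_id: str, contact: str) -> None:
    """On deleting a known child's account, remember its identifiers
    so they cannot simply be reused to create a fresh account."""
    DELETED_CHILD_IDENTIFIERS.update({device_id, persistent_id, contact.lower()})

def signup_allowed(device_id: str, persistent_id: str, contact: str) -> bool:
    """Block a new sign-up that reuses any identifier from a deleted
    child account; such attempts belong in the restricted flow."""
    return DELETED_CHILD_IDENTIFIERS.isdisjoint(
        {device_id, persistent_id, contact.lower()}
    )
```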

Monitoring for child users should be a priority, especially in social sharing apps

The complaint describes the automated and human-led processes for identifying and removing children who claim to be adults from TikTok. Most surprising among these allegations is the claim that the processes used by human content moderators to flag accounts for additional review "did not work." Until October 2022, flags from human moderators who had reviewed individual videos on the platform allegedly "were not actually referred to the team authorized to delete the associated account."

Though this is only one of many processes the company uses to identify underage users, it is an important reminder to always check up on stated policies and practices to ensure they are fully implemented. In high-risk situations like the retention of underage users' information, persistent errors can become big compliance problems.

Please send feedback, updates and your kids' favorite dance move to cobun@iapp.org.

Cobun Zweifel-Keegan, CIPP/US, CIPM, is the managing director in Washington, D.C., for the IAPP.