Editor's note: The IAPP is policy neutral. We publish contributed opinion and analysis pieces to enable our members to hear a broad spectrum of views in our domains.
The U.S. Federal Trade Commission last announced the settlement of a privacy matter back in January of this year. That unusually long dry spell was broken this week with three new settlements, all of which involve children's privacy.
Since taking the helm, Chairman Andrew Ferguson has made two things clear on the subject of privacy enforcement. First, protecting children and teens will be the top priority of the agency under his leadership. And second, clear legislative guidelines, as established under the Children’s Online Privacy Protection Act, the TAKE IT DOWN Act, or even a future comprehensive federal privacy law, will be prioritized above general Section 5 consumer protection actions.
I wrote about these and other FTC priorities at the beginning of the summer.
Adults are only kids grown up, anyway
The biggest of the new settlements, in dollars if not necessarily impact, resolves a complaint brought against Disney. The company agreed to pay USD10 million to resolve FTC allegations related to its use of YouTube as a distribution platform and advertising vehicle.
The narrow scope of this case is unusual. Rather than focusing on Disney's content distribution practices generally, which one might infer are copacetic, this is a case about the company's day-to-day use of one video platform. The allegations are easy to sum up: some portion of the videos Disney uploaded to its various YouTube channels were not marked with the platform's "Made for Kids" designation. Because those videos were instead subject to the platform's general privacy and safety settings, YouTube collected additional personal data from their viewers, delivered targeted ads to them, and provided them with other features, like comments and auto-recommendations, that are turned off in the Made for Kids environment. As FTC staff put it in a business blog, "If you're uploading child-directed videos to adult-designated channels, you might have a COPPA problem (or two)."
Of course, YouTube itself is already subject to enhanced FTC oversight flowing from a settlement in 2019. There, the FTC and the New York Attorney General's office required YouTube to create the COPPA governance structure that led to the scrutiny of Disney in this new matter. Since content creators are in a position to determine which videos are directed at children, YouTube provides a mechanism for them to label which videos and channels are subject to COPPA-compliant kids' restrictions and which are not. Within this structure, the FTC does not require YouTube to operate a platform-level age-gating mechanism.
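For creators who manage uploads programmatically, that per-video designation is also exposed through the YouTube Data API. The snippet below is a minimal, illustrative sketch, not anything drawn from the Disney order: it assumes the google-api-python-client library and OAuth credentials for the channel owner, and the video ID is a placeholder. Setting the flag is, of course, only the last step; deciding which content is child-directed is the compliance question at the heart of this case.

```python
# Illustrative sketch only: self-declaring a single video as "Made for Kids"
# via the YouTube Data API v3. Assumes OAuth credentials authorized for the
# channel owner; the video ID is a placeholder.
from googleapiclient.discovery import build

def mark_video_made_for_kids(credentials, video_id: str) -> dict:
    youtube = build("youtube", "v3", credentials=credentials)
    # The self-declared flag tells YouTube to apply its kids' restrictions
    # to the video (no personalized ads, comments disabled, limited features).
    return youtube.videos().update(
        part="status",
        body={
            "id": video_id,
            "status": {"selfDeclaredMadeForKids": True},
        },
    ).execute()
```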
At the time of the YouTube settlement, then-Chairman Joe Simons warned content creators that the FTC would be closely scrutinizing whether they met the COPPA compliance obligations that flowed from this governance structure.
"For those who create child-directed content to upload to YouTube," Simons concluded in a press conference, "the message from today's case is that the FTC considers these videos and channels to be websites or online services directed to children under the Rule and thus strictly liable for compliance with COPPA. Once our order has been effective for a period of time, the Commission will conduct a sweep of the YouTube platform to determine whether child-directed content is being properly designated as such."
Six years later, the agency is showing this was more than an empty threat.
Meanwhile, the global conversation around age assurance has shifted significantly.
The FTC appears to have taken note. The Disney press release quotes Ferguson placing his thumb on the scale toward platform-level age assurance. In his view, the settlement "makes room for the future of protecting kids online — age assurance technology." Although Disney must now implement a program to review all videos before posting them to YouTube and determine whether they should be marked as Made for Kids, the mandatory program is conditioned, remarkably, on the actions of another company. In the event that "the YouTube Platform implements measures to determine the age, age range, or age category of all YouTube users" — that is, if the company takes steps beyond those required under its own consent decree — Disney will no longer need to implement the video review program. As legal requirements for such centralized mechanisms continue to spread, this condition is more likely than not to come about.
Mind your Ps and Qs and SDKs too
A second case this week is even more cut and dried. An interactive robot toy company called Apitor settled allegations that it allowed a third party to collect geolocation data from children, via a software development kit embedded in the toy's companion app, without obtaining the verifiable parental consent required under COPPA.
The SDK was developed by a Chinese company known as Jiguang. According to the complaint, the third-party company's privacy policy "gives Jiguang broad latitude to use the data as it sees fit, including for advertising and for sharing the data with additional third parties."
Remedies in the case included a USD500,000 civil penalty, which is suspended under the agreement, as well as deletion of the ill-gotten data and an ongoing data minimization obligation.
The FTC's business blog zeroes in on the lessons for app developers — whether or not the apps in question are connected to robot toys. An app's COPPA compliance is only as good as that of its SDK vendors.
Knowing your vendors is even more important these days in the shadow of the Protecting Americans' Data from Foreign Adversaries Act, which is enforced by the FTC and could conceivably be used against a third party like Jiguang if it qualifies as a data broker under the law. The IAPP recently published an article on distinguishing compliance obligations under PADFAA from the closely related DOJ data security rule, which could similarly be triggered through unwary third-party integrations.
Intimate imagery demands high privacy standards
In a joint action with the Utah attorney general’s office, the FTC has proposed a settlement with Aylo, the largest global owner and operator of adult entertainment websites. Though the case focuses on a series of unsettling allegations about the company's policies and practices related to removing child sexual abuse material, it also includes allegations about deceptive claims around the handling of models’ personal information.
The ordered governance programs should serve as a guidepost for any companies that handle intimate imagery: the privacy of these materials, especially when accompanied by identity documents, is of the utmost importance.
Though these three cases share the theme of children’s privacy and safety, they differ in almost every other way. As it turns out, youth privacy issues are not a narrow niche.
Please send feedback, updates and Walt Disney quotes to cobun@iapp.org.
Cobun Zweifel-Keegan, CIPP/US, CIPM, is the managing director, Washington, D.C., for the IAPP.
This article originally appeared in The Daily Dashboard and U.S. Privacy Digest, free weekly IAPP newsletters. Subscriptions to this and other IAPP newsletters can be found here.