Implementing kids’ privacy protections around the world
The PerfectPetPal case study
This resource provides a case study on a fictional pet simulator mobile app.
Published: August 2024
Disclaimer: Before you start searching the app stores to download PerfectPetPal, you should know PPP is entirely fictional. The authors prepared this document for a training workshop on children's privacy that took place 14 Nov. 2023 at the IAPP EU Data Protection Congress 2023. After presenting an overview of privacy laws and enforcement matters from jurisdictions around the world, the authors used a case-study approach with workshop participants to apply the different legal frameworks to the fictional pet simulator mobile app. The training concluded with a mini workshop on data protection impact assessments that tied back to the hypothetical pet simulator.
The case study published here does not contain or constitute legal advice and should not be relied on for such purposes. The legal points in this document are high-level, nonexhaustive examples intended only to stimulate discussion. Any similarity between the hypothetical facts in this document and real-life apps, companies or features is coincidental. Neither the IAPP nor the authors of this document accept any liability for any errors or omissions in this document.
Everybody loves pets, right? PerfectPetPal is a recently launched pet simulator mobile app that allows users, known as owners, to create and interact with virtual pets and other owners in new and exciting ways. PPP is wildly popular with people of all ages. The app has more than 100 million monthly active users worldwide, with thousands of new downloads daily. The app is getting lots of buzz — and attention — from journalists, privacy advocates and possibly data privacy regulators. PPP has hired you to assess their privacy practices and provide them with a plan of action.
Here are some additional facts to help you get started on your examination. You may need to conduct a DPIA or a privacy impact assessment.
1. PerfectPetPal is a pet simulator app that allows users to create and interact with virtual pets and other users.
2. The app has over 100 million monthly active users from all over the world.
3. PPP has hired you to assess their privacy practices and provide them with a plan of action.
PPP is incredibly easy to join. All users need to do to become the owner of a PerfectPet is download the app from the Apple or Google storefront, and then sign up with their full names, home addresses and cell phone numbers. Then they can design a PerfectPet right away! PPP displays users' profiles publicly to help owners make friends. They can also use PPP's "Look Alike" feature and take the PetPoll up front to help them create a uniquely tailored PerfectPet in minutes. PPP is free, although owners can add payment information to their accounts to purchase petcoins, which can be used to acquire items, accessories, services and experiences for PerfectPets.
Privacy notice and disclosures
- PPP does not knowingly collect data from users under the age of 13 or the age of digital consent in particular jurisdictions.
- PPP does not sell owners' data, although it may use or share information for legitimate business purposes.
- PPP presents a pop-up upon first use of its special features and asks for consent to collect, use and share data related to the special feature.
- PPP has a privacy notice in its mobile app, which users can access via a link in the settings menu.
- PPP's privacy notice explains it collects and uses users' personal information to provide its services.
- PPP displays a seal on the front page of its mobile app announcing it is "data law compliant."
PPP's mobile app has a privacy notice that owners can access via a link in the settings menu in a section called "About this App." This section contains another link to a menu labeled "Statement," where users can find PPP's 39-page privacy notice. It informs users that privacy is important and encourages them to read it carefully.
The notice explains the app collects and uses users' personal information to provide its services. It also states the product is intended only for adults, and it does not knowingly collect data from users under the age of 13 or the age of digital consent in particular jurisdictions. It explains PPP does not sell owners' data, although it may sometimes use or share information for legitimate business purposes.
The second to last page of the policy states, "By using the PPP app and its features, you accept the terms of this privacy policy and agree to the data collection, use and sharing described herein." In addition to the overall privacy policy, PPP presents a pop-up upon first use of special features like the "Look Alike" tool, "PetPoll" or "PetTales" and asks users to consent to the app's collection, use and sharing of data related to the special feature. It offers three choices: "Yes," "Ask me later" or "I'm not sure."
PPP says it takes its users' privacy seriously. On the front page of its mobile app, it displays a seal announcing it is "data law compliant." The seal has ribbons that say "privacy," "trust" and "security."
PPP allows owners to create and care for a wide array of virtual pets, including feeding them, grooming them and customizing their appearances. There are some cool tools available.
Look Alike tool
One of the hottest new features PPP offers is a "Look Alike" tool that allows owners to customize the eyes, hair, nose, mouth and ears of their PerfectPet by uploading their own photos to PetSim — selfies preferred! After all, it is a scientific fact that people and their pets often look alike!
PPP also offers a PetPoll that allows users — at sign up or afterward — to answer questions about their likes and dislikes to make sure they are compatible with their PerfectPets. A sample question might be: What is your favorite academic subject at school?
Prambles
The app enables owners to take their PerfectPets on virtual and real-life walks, known as prambles. You can even do this with your virtual cheetah or snake! When users pramble with their PerfectPets in real life, the app helpfully alerts them to pet-related locations and shopping experiences it thinks users will enjoy. Sometimes PPP sends location alerts or directions and other times it sends owners coupons for discounts on pet-related items through their mobile devices. The pramble feature can help owners and their PerfectPets return to their favorite shops or other locations like dog parks over and over again.
Pleagues and plats
Owners and their virtual pets can connect with each other through special interest groups known as pleagues. Some are species based, while others are mix and match. Best friends forever cats and dogs, anyone? Within pleagues, owners can share PerfectPet and real-life pet photos, tips on pet care and accessory recommendations, creating human-to-human and pet-to-pet bonds. And, when owners or their PerfectPets need a break from all the excitement, they can relax in special plats, quiet areas designed to allow owner/PerfectPet pairs to interact one-on-one with other owner/PerfectPet pairs. There are even owner-only plats for those moments when owners just want to interact with another human. Owners don't have to be in the same pleague to relax together.
Petcoin
The app allows owners to buy items for their PerfectPets with a special virtual currency known as petcoin. Many essential items for PerfectPets, such as accessories and experiences, only cost a few petcoins. The more petcoins an owner obtains, though, the more they can provide their precious PerfectPet with the better things in life. After all, doesn't every PerfectPet deserve a crystal-studded satin collar or their own pet mansion — virtual versions, of course?
PetTales
One of the app's newest standout features is an interactive storytelling feature known as PetTales. To enjoy PetTales, owners share information about their friends, families, jobs, hometowns, dreams, wishes and even their real-life pets with a special PetBot that creates personally tailored stories for owners. The PetTales feature automatically uploads the stories to the owner's pleagues so users can share these fantastic tales. Best of all, the PetBot remembers all the information owners have shared — and keeps adding more — to provide owners with new stories and adventures featuring their favorite pets, people and places.
"PPP is wildly popular with people of all ages. It has over 100 million monthly active users from all over the world and sees thousands of new downloads every day."
Discussion – Know your audience
With millions of fans of all ages from all over the world, it's important to determine the target audience for the online service, including whether PPP is "directed to children" or "likely to be accessed by a child," two legal standards that can help determine whether and when to apply special privacy protections.
These facts also raise questions about appropriate age-assurance methods and parental-consent mechanisms. In the U.S., the Children's Online Privacy Protection Act and its implementing COPPA Rule, enforced by the Federal Trade Commission, provide the standard for determining when an online service is directed to children under the age of 13 or is a "mixed audience" product, a subset of the directed to children standard.
Review the FTC's Epic Games/Fortnite and Microsoft/Xbox Live enforcement actions to see how this plays out. State omnibus consumer privacy laws and children's privacy laws like those in California, Connecticut, Florida, Louisiana, Ohio, Texas and Utah — some of which have been enjoined on constitutional grounds — should also be part of your analysis. Some provide privacy protections, such as prohibitions on the sale or sharing of the information of a child under the age of 16 without express affirmative consent. In Canada, Quebec's Act Respecting the Protection of Personal Information in the Private Sector prohibits companies from collecting information concerning a minor under 14 years of age without parental consent "unless collecting the information is clearly for the minor's benefit."
EU member states and the U.K. have also issued regulatory guidance that applies to children's personal data online. Of note, the U.K. Information Commissioner's Office promulgated the Age Appropriate Design Code, also known as the Children's Code, which requires a best interests of the child approach to safeguarding children's privacy by "information society services likely to be accessed by children." Under the code, children are defined as individuals under the age of 18.
Similarly, Ireland's Data Protection Commission's Fundamentals for a Child-Oriented Approach to Data Processing requires companies to adhere to the best interests of the child if their services are "directed at, intended for, or likely to be accessed by children." Again, children are defined as individuals under the age of 18. In its final TikTok decision, the DPC stressed that companies cannot avoid their obligations toward users below the age of 13 simply by stating a minimum user age of 13 when children are likely to be users of the service in question.
In addition, other European privacy authorities have released guidance on children's data protection. These include the Netherlands' Code for Children's Rights, France's eight recommendations to enhance the protection of children online, Norway's guidance on consent for minors and Sweden's guidance on children's online rights. Separately, in its Work Programme 2023/2024, the European Data Protection Board signaled it expects to publish guidelines on children's data over the course of the next year. The EDPB's guidelines on transparency also emphasize the importance of knowing your audience.
"Numerous facts above provide details on PPP's privacy practices, or lack thereof. Review the facts closely to prepare any required or optional PIAs and help provide PPP with a plan of action to address the concerns it is facing from journalists, privacy advocates and data privacy regulators."
Discussion – DPIAs
In the EU, Canada and various U.S. states, businesses must conduct a DPIA or a PIA if their practices involve the collection, use or disclosure of children's data, which is often considered high risk or sensitive. Canada's federal, provincial and territorial privacy regulators and ombudsmen, for example, passed a resolution on "putting best interests of young people at the forefront of privacy and access to personal information." The resolution not only requires PIAs for projects involving the data of young people to identify and minimize digital privacy risks, but also instructs organizations to adapt their traditional PIA process to think specifically about the perspectives and experiences of young people as individuals and as a group before collecting, using or disclosing their information. It also states organizations should conduct an intersectional analysis to consider the privacy risks to vulnerable young people. When a company conducts multiple types of data-processing operations, it may need to conduct a series of DPIAs.
Parental consent and data minimization
"All users need to do to become the owner of a PerfectPet is download the app from the Apple or Google storefront, and then sign up with their full names, home addresses and cell phone numbers."
Discussion – Parental consent and data minimization
Although PPP has designed an easy sign-up process, from a privacy-law perspective it might be too easy for children to join the service. Under U.S. law, if you determine PPP is targeted at children under the age of 13, or if PPP has actual knowledge of such children using the service, collecting their information before obtaining verifiable parental consent would violate COPPA and its implementing rule, unless an exception applies, as well as various U.S. state-level privacy laws.
Similarly, Quebec law prohibits collecting personal information of children under 14 without parental consent, unless collecting the information is clearly for the child's benefit. See the Know your audience section for legal citations. PPP can obtain COPPA-required verifiable parental consent using one of the enumerated mechanisms set out in Section 312.5 of the Rule. Under Section 312.3(b), a company must obtain verifiable parental consent, taking available technology into consideration, before any collection, use or disclosure of a child's information. Any method to obtain verifiable parental consent must be reasonably calculated in light of available technology to ensure the person providing consent is the child's parent.
From an EU/U.K. perspective, if PPP is relying on consent as the lawful basis for processing children's personal data under the EU General Data Protection Regulation, then it must obtain consent from the holder of parental responsibility. This is the child's parent or guardian if the child is under the age of so-called digital consent in the relevant EU member state under Article 8. Article 8(1) provides that EU member states may adopt ages of digital consent anywhere between 13 and 16 years of age. When the child is under the age of digital consent, Article 8(2) requires companies to make reasonable efforts to verify consent has been given or authorized by the parent or guardian, taking available technology into consideration.
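The Article 8 logic described above can be sketched as a simple decision rule. This is an illustrative sketch only, not legal advice: the member-state ages shown are examples that must be verified against current national law, and a real implementation would require robust age assurance rather than a self-declared age.

```python
# Illustrative sketch of the GDPR Article 8 decision rule, assuming
# consent is the lawful basis for processing. Ages shown are examples
# only; each EU member state may set its own age of digital consent
# between 13 and 16 under Article 8(1).
AGE_OF_DIGITAL_CONSENT = {
    "IE": 16,  # Ireland
    "FR": 15,  # France
    "DE": 16,  # Germany
    "ES": 14,  # Spain
    "DK": 13,  # Denmark
}
DEFAULT_AGE = 16  # Article 8(1) default where no lower national age is set


def parental_consent_required(user_age: int, member_state: str) -> bool:
    """True if consent must be given or authorized by the holder of
    parental responsibility (Article 8(2))."""
    threshold = AGE_OF_DIGITAL_CONSENT.get(member_state, DEFAULT_AGE)
    return user_age < threshold
```

The sketch makes the practical point in the surrounding text concrete: the controller must first know which country's threshold applies before it can know whose parent or guardian must consent.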
In addition to the consent issues, PPP's collection of extensive information, such as a user's home address and cell phone number, at sign up may violate data-minimization requirements under most privacy laws. Article 5(1)(c) of the GDPR, for example, requires personal data to be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed. This requires companies to justify why the personal data they collect is necessary for their processing purposes, in line with the accountability principle under GDPR Article 5(2).
Similarly, the COPPA Rule places restrictions on collecting more information than necessary to permit a child to take part in online activities. Section 312.7 prohibits companies from conditioning a child's participation in a game, the offering of a prize or another activity on the child's disclosure of more personal information than is reasonably necessary to participate in such activity. Many U.S. state laws also contain data-minimization principles.
Privacy defaults and direct communications
"PPP displays users' profiles publicly to help owners make friends."
"Owners and their virtual pets can connect with each other through special interest groups known as pleagues."
Discussion – Privacy defaults and direct communications
Displaying children's information publicly raises a range of online privacy and safety concerns.
Several of the laws and enforcement matters discussed in the Know your audience section are relevant here. For example, the U.K. ICO Children's Code mandates "high" privacy default settings for products likely to be accessed by children. Under the code, a high default setting would be designing your service so that children's personal data is visible or accessible to other users of the service only if they change their settings to allow this. PPP could face liability for allowing children's personal data to be publicly visible by default. For example, in the FTC's Epic Games settlement, the agency found the company violated the "unfairness" prong of Section 5 of the FTC Act by publicly broadcasting players' display names while putting children and teens in direct, real-time contact with others through on-by-default lines of voice and text communication.
The FTC alleged these default settings, along with Epic's role in matching children and teens with strangers to play Fortnite, led to children and teens being threatened, harassed and exposed to dangerous and psychologically traumatizing issues such as suicide. The FTC required Epic to pay a USD275 million monetary penalty and adopt strong privacy default settings for children and teens, ensuring voice and text communications are turned off by default.
Having features such as "making friends" or online chat on by default not only poses risks of online harm to kids and teens but may also contravene data minimization requirements under EU privacy laws. Ireland's DPC ruled in both its TikTok and Instagram decisions that the default account setting for user accounts under 18 must be private, not public by default. In particular, the DPC TikTok decision found that making child user accounts public by default contravened the principle of data minimization because it allowed a child's social media content to be visible to anyone on or off the TikTok platform. By default, it allowed any registered TikTok user, both adults and children, to comment on a child's videos and interact with a child user. This increased visibility posed a severe possible risk that dangerous individuals would seek to communicate with children directly. The DPC imposed a 345 million euro fine and ordered corrective measures as a remedy.
Similarly, PPP's pleague feature, which enables private communications, raises related online privacy and safety concerns.
Transparency of privacy notices and disclosures
"PPP's mobile app has a privacy notice that owners can access via a link in the settings menu in a section called 'About this App.' This section contains another link to a menu labeled 'Statement,' where users can find PPP's 39-page privacy notice. … The notice explains the app collects and uses users' personal information to provide its services. It also states the product is intended only for adults, and it does not knowingly collect data from users under the age of 13 or the age of digital consent in particular jurisdictions."
Discussion – Transparency of privacy notices and disclosures
PPP's 39-page privacy notice likely violates requirements in most jurisdictions' laws that privacy notices be in plain, easy to understand, nontechnical language and that material information about a company's privacy practices be presented prominently and not buried or hidden in a privacy notice. See, for example, the FTC's enforcement action against Sears for burying information about user tracking on the 75th line in a scroll-down box for a lengthy user license agreement that was only available to consumers at the end of a multistep registration process.
It is also inconsistent with requirements in the ICO Children's Code and DPC's Fundamentals and Instagram decisions. There, the DPC found the use of hyperlinked privacy notices for children as young as 13 did not meet the transparency requirements set out in GDPR Article 12(1). The EDPB Guidelines on Transparency specifically require controllers that either target children or are or should be aware that children are utilizing their goods or services to ensure "the vocabulary, tone and style of the language used is appropriate to and resonates with children so that the child addressee of the information recognises that the message/ information is being directed at them."
PPP's notice is also inconsistent with regulatory guidance from Canada's Office of the Privacy Commissioner, which advises organizations to explain privacy information clearly to young people. It notes, "Organizations should write terms and conditions in plain language. They should also look for fun ways to explain things!"
Privacy authorities in Hong Kong and Singapore similarly recommend companies develop child-friendly privacy notices and just-in-time disclosures that let a child know why the service is asking for information and what it will be used for.
In its guidance Collection and Use of Personal Data through the Internet – Points to Note for Data Users Targeting at Children, Hong Kong's Office of the Privacy Commissioner for Personal Data explains children may lack capacity to understand complex legalese and recommends user-friendly means to present the written privacy notice, including using graphics and animation.
In addition to the transparency issues, the language in PPP's privacy notice about information collection from children under the age of 13 is not sufficient to keep the company from being subject to children's privacy laws. As discussed in the Know your audience section, many privacy laws and regulatory guidance documents provide that companies are liable for protecting child users on their services if they have actual knowledge of underage users, such as in the current COPPA approach; if it is likely children are accessing the service, such as in the ICO Children's Code, DPC Fundamentals approach and similar approaches adopted by laws and regulations from other jurisdictions; if the service is predominantly accessed by children, such as in Florida's law; or some other factual threshold.
From a GDPR perspective, the function of Article 8 is to ensure parental consent is obtained for the collection and processing of a child's personal data when they are using the service, if consent is the relevant legal basis for processing the data. This means the controller must first be able to identify children whose personal data will be processed and who are under the age of digital consent in the relevant country, and then identify users who must have consent given by their parent or guardian.
Data disclosures to third parties
"PPP does not sell owners' data, although it may use or share information for legitimate business purposes."
Discussion – Data disclosures to third parties
This raises issues relating to the disclosure of data to third parties. Many privacy laws impose strict requirements on disclosing an individual's personal information to other parties regardless of age and some include child-specific protections.
The COPPA Rule prohibits the disclosure of a child's personal information without verifiable parental consent. Indeed, the FTC clarified in its COPPA FAQs, "If you are going to disclose children's personal information to third parties, or allow children to make it publicly available (e.g., through a social networking service, online forums, or personal profiles) then you must use a method that is reasonably calculated, in light of available technology, to ensure that the person providing consent is the child's parent."
Many state laws go further, requiring specific consent for advertising-related disclosures. For example, Section 1798.120 of the California Consumer Privacy Act requires affirmative, opt-in consent to sell or share personal information of individuals under the age of 16 to third parties. For children under the age of 13, that opt-in consent must come from their parent or guardian. Children who are at least 13 years old but under the age of 16 can provide opt-in consent themselves.
Companies that willfully disregard the consumer's age shall be deemed to have had actual knowledge of the consumer's age. Note that some U.S. privacy laws, including the CCPA and the laws of Colorado and Connecticut, define the term sell broadly to mean disclosures for any valuable consideration, not just disclosures of personal information in exchange for money. Canadian privacy regulators also expect organizations to limit the disclosure of young people's personal information, according to Principle 5 of the October 2023 Joint Resolution.
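The CCPA age tiers described above can be expressed as a small decision sketch. This is illustrative only and not legal advice; the return labels are simplified stand-ins for the affirmative-authorization requirements of Section 1798.120.

```python
# Illustrative sketch of the CCPA Section 1798.120 consent tiers for
# selling or sharing a consumer's personal information. The string
# labels are hypothetical shorthand, not statutory terms.
def ccpa_sale_consent_needed_from(consumer_age: int) -> str:
    if consumer_age < 13:
        # Under 13: a parent or guardian must affirmatively opt in.
        return "parent_or_guardian_opt_in"
    if consumer_age < 16:
        # 13 through 15: the consumer may opt in themselves.
        return "consumer_opt_in"
    # 16 and older: the general CCPA opt-out model applies.
    return "opt_out_model"
```

Note that, per the willful-disregard provision discussed above, a business cannot escape these tiers simply by declining to learn a consumer's age.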
From an EU/U.K. perspective, an appropriate lawful basis under Article 6(1) of the GDPR must be identified to lawfully process personal data in this manner.
The principle of data minimization under GDPR Article 5(1)(c) also requires an organization to collect only the minimum information required to achieve its purpose. In practice, the ICO Children's Code and DPC Fundamentals have interpreted this principle as requiring organizations to minimize the amount of data collected from children in the first instance, as well as throughout their interaction with a service, and/or minimize the subsequent use and sharing of the data.
Privacy consents and dark patterns
"PPP presents a pop-up upon first use of special features like the 'Look Alike' tool, 'PetPoll' or 'PetTales' and asks users to consent to the app's collection, use and sharing of data related to the special feature. It offers three choices: 'Yes,' 'Ask me later' or 'I'm not sure.'"
Discussion – Privacy consents and dark patterns
The design of PPP's pop-ups implicates the closely intertwined issues of consent requirements and dark patterns, as well as nudging. Regulators in numerous jurisdictions have expanded on the conditions under which consent will be considered affirmative and freely given, making it clear companies cannot obtain consent required by privacy laws through dark patterns. These are user interface designs that trick or manipulate users into making privacy choices they would not otherwise make and that may cause harm. Of particular note are Ireland's DPC decision that found TikTok used dark patterns to nudge young users into choosing more privacy-intrusive options and features, the FTC's report on Bringing Dark Patterns to Light and the EDPB's Opinion on Deceptive Design Patterns. Additionally, Section 7004(a)(2) of the CCPA Regulations requires "symmetry in choice" for privacy consent, meaning the path for a consumer to exercise a more privacy-protective option should not be longer or more burdensome than the path to exercise a less privacy-protective option.
In the trio of DPC Meta decisions published in January 2023, the EDPB instructed the DPC, following the conclusion of the Article 65 process, to insert similar findings on fairness. In particular, the EDPB considered that the principle of fairness addresses autonomy, individual expectations, power asymmetries, deception, and ethical and truthful requirements. The DPC Fundamentals also state privacy-preserving behaviors should be encouraged for child users, rather than nudging techniques that encourage or incentivize children to provide unnecessary information or to engage in privacy-disrupting actions. The ICO Children's Code similarly says nudging techniques should not be used to encourage children to provide unnecessary personal data or turn off privacy protections; instead, pro-privacy nudges should be deployed where appropriate.
Certification, compliance and safe harbor programs
"On the front page of its mobile app, (PPP) displays a seal announcing it is 'data law compliant.' The seal has ribbons that say 'privacy,' 'trust' and 'security.'"
Discussion – Certification, compliance and safe harbor programs
Many privacy laws, such as COPPA and the GDPR, authorize compliance and certification programs, sometimes known as trustmarks, seals or safe harbors that allow companies to demonstrate compliance with privacy laws and, in some cases, provide them with a legal presumption of compliance. In the area of children's privacy specifically, the FTC established a Safe Harbor program pursuant to the COPPA Rule, enabling industry groups or others to submit self-regulatory guidelines that implement "the same or greater protections for children" as those contained in the COPPA Rule for commission approval.
The FTC also brought actions against numerous companies for misrepresenting their compliance with such programs, for example with the former EU-U.S. Privacy Shield framework and the Asia-Pacific Economic Cooperation Cross-Border Privacy Rules System, and sued at least one provider of privacy certifications for businesses for misrepresentations about its recertification process. Based on previous FTC case law and business guidance, PPP's certification mark could violate the FTC Act.
Biometric data
"One of the hottest new features PPP offers is a 'Look Alike' tool that allows owners to customize the eyes, hair, nose, mouth and ears of their PerfectPet by uploading their own photos to PetSim — selfies preferred!"
Discussion – Biometric data
PPP's Look Alike feature may use children's biometric data to generate a customized PerfectPet. The COPPA Rule includes photographs, videos or audio files that contain a child's image or voice in its definition of personal information. Many other U.S. privacy laws prohibit the collection and use of biometric data that can be used to identify a unique individual without their consent. These include Illinois's Biometric Information Privacy Act and other omnibus and specialized consumer privacy laws in the U.S., such as Washington state's My Health My Data Act. Laws and regulators are increasingly likely to define biometrics broadly, even including raw images that are not processed to identify an individual, and require they be subject to the highest privacy safeguards.
Biometric data is defined in GDPR Article 4(14) as "personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data," i.e., personal data that uniquely identifies an individual. Biometric data is a special category of personal data under Article 9(1) of the GDPR. The processing of biometric data is prohibited unless an exemption under Article 9(2), which includes reliance on consent, can be relied upon to process such data and there is a lawful basis under Article 6(1).
Some lawmakers and regulators have recently expanded the definition of biometric data to include information relating to the physical, biological or behavioral characteristics of an identifiable person that can be extracted from images, photographs, and video or audio recordings. See, for example, the FTC's recent policy statement, which cross-references COPPA and Washington state's MHMDA. The ICO Children's Code confirms the processing of biometric data requires completion of a DPIA, while the DPC Fundamentals state organizations should avoid the collection and processing of children's biometric data.
"When users pramble with their PerfectPets in real life, (PPP) helpfully alerts them to pet-related locations and shopping experiences it thinks users will enjoy."
Discussion – Location data
The pramble feature requires precise geolocation information to provide its services. Location data, especially a child's exact location, raises significant privacy and online safety concerns. For example, the FTC's COPPA enforcement action against Amazon alleged the company misled consumers about its retention practices for the geolocation data collected through its Alexa app and failed to honor parents' requests to delete this sensitive data. Consumer privacy laws in the U.S. also generally treat precise geolocation data as sensitive data subject to heightened consent and risk assessment requirements. An amendment to Connecticut's privacy law, scheduled to take effect in 2024, prohibits collecting precise geolocation data from minors without consent, from the minor or, for minors under 13 years old, from a parent or legal guardian, unless the geolocation data is reasonably necessary to provide the online service, product or feature, and then only if the controller provides a signal to the minor for the duration of the collection indicating it is occurring.
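As a rough sketch of how PPP's engineers might encode a Connecticut-style rule in the pramble feature, the helper below gates precise geolocation collection on age-appropriate consent, a necessity check and an active on-screen signal. All names and the decision logic are hypothetical simplifications for discussion, not a statement of what the law requires:

```python
from dataclasses import dataclass

@dataclass
class GeoRequest:
    user_age: int                 # self-declared or verified age
    minor_consented: bool         # consent given by the minor (13-17)
    guardian_consented: bool      # consent given by a parent/legal guardian
    necessary_for_service: bool   # reasonably necessary to provide the feature
    signal_displayed: bool        # on-screen indicator shown while collecting

def may_collect_precise_geolocation(req: GeoRequest) -> bool:
    """Hypothetical Connecticut-style gate: minors need consent (guardian
    consent if under 13), collection must be reasonably necessary, and a
    signal must be displayed for the duration of the collection."""
    if req.user_age >= 18:
        return True  # adults: other legal bases still apply, out of scope here
    if not req.necessary_for_service or not req.signal_displayed:
        return False
    if req.user_age < 13:
        return req.guardian_consented
    return req.minor_consented or req.guardian_consented
```

For example, a 12-year-old's own consent is never sufficient under this sketch; the guardian must consent, and the collection fails outright the moment the on-screen signal stops displaying.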
From an EU/U.K. perspective, the processing of location data must comply with various data processing principles, including data minimization and transparency. For example, the DPC Fundamentals state location settings must be switched off by default and child users should be made aware that their location is available to all users. Further, the accuracy of location data should be significantly reduced except where greater precision is necessary.
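The "reduced accuracy" point can be sketched in a few lines. The snippet below, a minimal illustration assuming simple decimal-degree rounding (one decimal place is roughly 11 km of precision at the equator; function and constant names are invented), also reflects the off-by-default setting:

```python
# Location sharing is switched off by default for child users (DPC Fundamentals).
DEFAULT_LOCATION_SHARING = False

def coarsen_location(lat: float, lon: float, decimals: int = 1) -> tuple[float, float]:
    """Reduce location precision by rounding decimal degrees.
    One decimal place is ~11 km at the equator; two is ~1.1 km."""
    return round(lat, decimals), round(lon, decimals)

print(coarsen_location(53.34981, -6.26031))  # → (53.3, -6.3)
```

Real systems would likely use snapping to a geohash or H3 cell rather than naive rounding, but the principle is the same: store and share only as much precision as the feature genuinely needs.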
PPP's location-based advertisements may also bring U.S. state privacy laws that govern targeted advertising into play. Virginia's Consumer Data Protection Act, like many of the newer state privacy laws, defines targeted advertising as "displaying advertisements to a consumer where the advertisement is selected based on personal data obtained from a consumer's activities over time and across nonaffiliated websites or online applications to predict such consumer's preferences or interests."
In some cases, such as in the DPC Fundamentals, targeted advertising at children is specifically prohibited. This is also the case under the EU Digital Services Act, insofar as the DSA may apply to the practices in question.
PPP "allows owners to buy items for their PerfectPets with a special virtual currency known as PetCoin."
Discussion – Unauthorized purchases
This raises issues around online purchases, including via virtual currencies, that are not inherently privacy issues but have arisen in enforcement actions dealing with children online. For example, the FTC's administrative complaint in the Epic Games matter alleged the company used dark patterns to trick young players into making unintended in-game purchases and violated the FTC Act by allowing children to purchase in-game content with V-Bucks without requiring any parental or card-holder action or consent. For more on dark patterns, see the Privacy consents and dark patterns section. Some laws, such as Texas's Securing Children Online through Parental Empowerment Act, will also prohibit in-app purchases by minors without parental consent unless an exception applies. The European Commission's Consumer Protection Cooperation Network has also tackled dark patterns in various commercial contexts, including through an enforcement sweep of 399 retail websites and apps that found nearly 40% of online shopping websites rely on manipulative practices to exploit consumers' vulnerabilities or trick them. The CPC Network also teamed up with the EDPB to endorse five key principles of fair advertising to children based on EU consumer and data protection laws that cover many of these tactics.
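The failure alleged in the Epic Games matter can be understood as a missing confirmation step in the purchase flow. The hypothetical gate below (all names invented, not Epic's or any law's actual logic) requires an affirmative card-holder action on every virtual-currency purchase, plus parental consent when the buyer is a minor, in the spirit of the SCOPE Act:

```python
def authorize_purchase(is_minor: bool, holder_confirmed: bool,
                       guardian_consented: bool) -> bool:
    """Hypothetical purchase gate: no silent one-click charges, and
    minors additionally need parental consent before buying."""
    if not holder_confirmed:   # require an affirmative card-holder action
        return False
    if is_minor and not guardian_consented:
        return False
    return True
```

The design point is that consent is collected per transaction rather than inferred from a stored payment method, which is precisely the pattern the FTC faulted.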
"To enjoy PetTales, owners share information about their friends, families, jobs, hometowns, dreams, wishes and even their real-life pets with a special PetBot that creates personally tailored stories for owners. The PetTales feature automatically uploads the stories to the owner's pleagues so users can share these fantastic tales with each other."
Discussion – Generative AI
The PetTales feature, which incorporates AI into the PPP simulator through a "PetBot," raises numerous novel issues, from data scraping to dataset bias to data retention to transparency and disclosures. It also raises questions about children's cognitive ability to distinguish between the AI PetBot and real-life friends. A now-resolved action from Italy's DPA, the Garante, against Replika AI discusses issues involving minors and generative AI companions. It could be its own case study!
Many of the privacy principles discussed above will be relevant, as will new laws and frameworks for AI governance, including but not limited to the U.S. Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence and the EU AI Act, as well as soft law principles from the Organisation for Economic Co-operation and Development and other multilateral organizations. In addition, the ICO recently announced it had issued a preliminary enforcement notice to Snap and Snap Group over a potential failure to properly assess the privacy risks posed by Snap's generative AI chatbot, My AI. The ICO provisionally found the risk assessment Snap conducted before launching My AI did not adequately assess the data protection risks posed by the generative AI technology, particularly to children, given that it is an emerging technology and the processing involved the personal data of children ages 13-17.
It is clear the concerns raised by journalists and privacy advocates about PerfectPetPal are real, and the company's privacy notices and practices need some — well, a lot — of work. Now you need to convince PerfectPetPal they need to develop and implement a plan of action or face disgruntled users and displeased regulators. In doing so, you will need to touch on everything from privacy notices to consent to location data, biometrics and beyond. You will need to navigate a thicket of laws from multiple jurisdictions that are constantly evolving. IAPP resources can help you stay up to date.
Additional children's privacy resources