Child safety once meant making sure children looked both ways before crossing the road and didn't talk to strangers. But in today's digital world, children have moved from the playground to online platforms, scrolling and swiping through digital spaces long before they can read or understand the risks and responsibilities that come with being online.

In India, where, as of December 2022, more than 700 million people age 2 and above were active internet users, the question of how to protect children online has become an urgent one.

India's Digital Personal Data Protection Act, 2023, is a meaningful step toward a legal framework that recognizes children's digital vulnerabilities. While the DPDPA addresses children's privacy primarily through Section 9, related provisions across the act and its associated rules reinforce this focus, introducing an anticipatory, design-aware model of regulation.

Enforcement is a critical part of any regulatory framework, and the DPDPA establishes the Data Protection Board of India to oversee compliance and address grievances. Once constituted, the board will have the power to investigate violations, issue directions and impose financial penalties.

Defining and protecting children

Section 9 of the DPDPA prohibits behavioral tracking and targeted advertising directed at children. It also requires age verification and adds a layer of protection in the form of verifiable parental consent for processing children's personal data. Further, because verifiable parental consent is required, organizations cannot in effect rely on the legitimate-use grounds under Section 7 when the data principal is a minor.

The DPDPA defines a child as anyone under age 18. India's age threshold stands higher than the EU General Data Protection Regulation's flexible 13 to 16 range and the U.S. Children's Online Privacy Protection Act's focus on children under 13. 

Across these jurisdictions, age operates mainly as the legal trigger for parental consent. The practical burden for platforms now lies not in the numerical differences but in adapting onboarding, parental consent and age assurance systems so they function across different regulatory environments with distinct verification expectations and child-specific design requirements.

The DPDPA's child-specific obligations

The Digital Personal Data Protection Rules, released in November 2025, operationalized several child-specific obligations under the DPDPA. The rules require data fiduciaries to obtain verifiable parental consent through specific mechanisms: they can rely on identity and age information they already hold for existing registered users, on details voluntarily provided by the parent or guardian or on verification conducted by an authorized third-party entity.

This verification process may involve issuing a virtual token mapped to the parent's credentials to confirm their identity and consent. Such mechanisms could reduce the exposure of parental data during verification.
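For teams translating this into systems work, the sketch below illustrates one way such a token exchange might be structured: an authorized verifier holds the parent's identity details and hands the platform only an opaque, short-lived token plus a yes/no confirmation. The class names, expiry window and overall flow are illustrative assumptions, not the mechanism prescribed by the DPDP Rules.

```python
# Hypothetical sketch of a token-based parental consent check.
# Names, fields and flow are illustrative assumptions only, not the
# mechanism prescribed by the DPDP Rules or any authorized entity.
import secrets
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class ConsentToken:
    value: str
    expires_at: datetime
    parent_verified: bool  # set by the verifying entity, not the platform

class ConsentVerifier:
    """Stands in for an authorized verification entity that holds the
    parent's identity details and exposes only an opaque token."""

    def __init__(self):
        self._tokens: dict[str, ConsentToken] = {}

    def issue_token(self, parent_identity_checked: bool) -> str:
        # The parent's credentials stay with the verifier; the platform
        # receives only this random, short-lived token.
        value = secrets.token_urlsafe(32)
        self._tokens[value] = ConsentToken(
            value=value,
            expires_at=datetime.now(timezone.utc) + timedelta(minutes=15),
            parent_verified=parent_identity_checked,
        )
        return value

    def confirm(self, token_value: str) -> bool:
        # The platform learns only whether consent was verified in time.
        token = self._tokens.get(token_value)
        if token is None or datetime.now(timezone.utc) > token.expires_at:
            return False
        return token.parent_verified

# Platform-side usage: record consent only if the token checks out.
verifier = ConsentVerifier()
token = verifier.issue_token(parent_identity_checked=True)
if verifier.confirm(token):
    print("Verifiable parental consent recorded; proceed with processing.")
else:
    print("Consent not verified; do not process the child's data.")
```

The design choice worth noting is that the platform never handles the parent's underlying identity documents, which is how a token model could reduce parental data exposure in practice.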

With the rules identifying pathways for obtaining parental consent, many providers may find that existing flows can be adapted. However, India's token-based verification model, combined with a presumptive ban on behavioral tracking and targeted advertising that allows only limited statutory exemptions, will require substantive adjustments to design, onboarding and data flow architecture.

The DPDPA rules provide purpose-based exemptions from the consent and tracking prohibitions. Under Rule 12 and Schedule 4, health care establishments, educational institutions and child daycare centers can process children's data for health, safety and educational purposes without the full consent apparatus in place. Specific purposes, such as location tracking for safety and health-related processing, are carved out. Therefore, while the primary framework appears restrictive, these exceptions, for now, give broad sectoral access.

Large platforms may also be classified as significant data fiduciaries, a designation that carries additional obligations such as maintaining processing records, conducting data protection impact assessments, undergoing periodic audits and appointing a data protection officer. These requirements are intended to ensure a higher standard of accountability and risk management across all data processing activities, including those involving children's data.

These obligations signal a regulatory expectation that companies will move from reactive compliance toward an anticipatory model, where risks to children need to be identified and mitigated before any harm can occur. 

The global shift in protecting children online

With its Age Appropriate Design Code, the U.K. has one of the most influential child-centric frameworks. The AADC mandates high-privacy defaults, prohibits detrimental nudges, restricts profiling and requires providers to conduct DPIAs that assess risks to children related to age assurance and other design decisions. The code takes a risk-based approach and does not prescribe a single method for parental consent verification or age assurance. Guidance from the U.K. Information Commissioner's Office identifies a range of acceptable approaches, from self-declaration to third-party verification services, with the expectation that checks be proportionate to the risks.

In Europe, GDPR enforcement has emphasized children's data protections. Data protection authorities across member states have imposed substantial fines and corrective measures on organizations for failures such as inadequate age verification, unlawful profiling of minors and insufficient parental consent mechanisms. Under the GDPR, platforms must make reasonable efforts to verify that consent is genuinely provided by a parent or guardian, taking into account the technologies available and the level of risk involved in the processing. Regulators apply a proportionality standard rather than mandating a single method. 

France's data protection authority, the Commission nationale de l'informatique et des libertés, and Ireland's Data Protection Commission emphasize that verification methods must be proportionate, privacy-preserving and minimally intrusive, especially in child-facing services. Verification approaches vary but may include confirmation via offline steps, parental email, phone-based authentication or third-party verification where appropriate. This reflects the EU's dual approach of combining substantive rights with procedural obligations, including requirements for platforms to conduct risk assessments, implement safeguards and ensure ongoing compliance.

Additionally, the EU Digital Services Act, fully applicable since February 2024, has introduced systemic duties for very large online platforms to identify and mitigate risks to children, reinforcing the EU's layered model of protection.

In the U.S., the Children's Online Privacy Protection Act primarily applies to commercial websites and online services directed at children under 13, as well as to those that knowingly collect personal information from this age group. COPPA also requires verifiable parental consent before collecting, using or disclosing children's data; the U.S. Federal Trade Commission specifies that acceptable methods include payment verification, government ID checks and signed consent forms.

Further, COPPA imposes obligations for data security, retention limits and privacy policies. Enforcement has resulted in significant settlements, including multimillion-dollar penalties for unauthorized collection of children's data.

Additionally, the proposed Kids Online Safety Act, currently under congressional review, would broaden the scope of covered platforms beyond COPPA, extending protections to all minors under 17 and applying to any platform reasonably likely to be used by minors, not only those directed at children. KOSA would also impose a duty of care on online platforms used by minors, requiring them to prevent and lessen specific harms, including threats to life and safety and exploitation.

A common thread runs through these frameworks: the idea that child protection is shifting from reactive compliance to anticipatory design, where platforms must identify and mitigate risks before harm occurs. India's framework forms part of this broader global shift. Together, these regulations seek to create a digital shield, making online spaces relatively safer for children to explore.

Komal J. Thacker, CIPP/US, is a lawyer licensed in New York and India, advising on legal and regulatory matters.