Editor's note: The IAPP is policy neutral. We publish contributed opinion and analysis pieces to enable our members to hear a broad spectrum of views in our domains.
In the increasingly scrutinized world of digital design, "dark patterns" have moved from being a user experience problem to a regulatory concern.
From impossible-to-decline consent pop-ups to checkout flows that sneak in additional charges, dark patterns are an enforcement focus globally. India's Central Consumer Protection Authority is the latest to step into the arena.
In early June, the CCPA advised e-commerce platforms to comply with the 2023 Guidelines for Prevention and Regulation of Dark Patterns and gave them three months to conduct self-audits to identify and eliminate any dark patterns on their platforms. The authority noted platforms are "advised to refrain from deploying deceptive design interfaces that mislead consumers or manipulate their decision-making."
The move brings India in line with regulatory sentiments from the European Union and the United States, where design deception is increasingly treated as a legal violation, not just a design flaw. But while the intent is clear, the approach so far raises critical questions about enforceability, regulatory coordination and alignment with India's broader data protection goals.
Defining the problem
The advisory lists 13 specific types of dark patterns companies should avoid, including practices like "basket sneaking," adding extra items to a consumer's cart without asking; "confirm shaming," guilting a user into saying yes; and "false urgency," pretending something is about to sell out when it's not.
Following the self-audit, platforms are directed to submit declarations of compliance within the same three-month window.
This framing aligns with global definitions. In its 2023 guidelines, the European Data Protection Board explicitly linked deceptive design patterns to violations of the EU General Data Protection Regulation, especially where consent or transparency is compromised. The CCPA advisory similarly recognizes that manipulation of the user experience is a systemic issue deserving of regulatory scrutiny.
Yet, there's a critical distinction: The CCPA's move is an advisory, not an enforceable regulation. There is no mention of penalties for non-compliance, nor any mechanism for auditing or investigation.
This makes India's approach, at least for now, more suggestion than mandate.
Enforcement actions: Lessons from the EU and U.S.
Internationally, regulators have gone beyond advisories. In the EU, the Court of Justice's Planet49 judgment established that pre-checked consent boxes do not constitute valid consent under the GDPR. The EDPB has since published in-depth guidelines on dark patterns; national data protection authorities, like France's Commission nationale de l'informatique et des libertés and Sweden's Integritetsskyddsmyndigheten, have begun issuing compliance warnings over manipulative cookie banners. Additionally, the non-profit NOYB has filed multiple complaints targeting such interfaces.
Most notably, Ireland's Data Protection Commission fined TikTok 345 million euros in 2023 for using dark patterns to steer minors toward public account settings, breaching the GDPR's requirements for fairness and transparency. The case is significant because it was one of the first major decisions in which a regulator explicitly labeled user experience practices as dark patterns in violation of legal standards.
In the U.S., the Federal Trade Commission has invoked its long-standing consumer protection authority to pursue dark pattern use. In FTC v. Amazon, the commission alleged Amazon used deceptive design to make it easy to enroll in Prime subscriptions but difficult to cancel, and the complaint explicitly described these practices as dark patterns.
California's enforcement under the California Consumer Privacy Act, as amended by the California Privacy Rights Act, also reflects this trend with its Sephora settlement requiring both monetary penalties and compliance improvements related to consent and tracking.
These actions show regulators are now willing to treat interface manipulation as a form of legal non-compliance. More importantly, they demonstrate the power of enforcement to drive platform behavior.
The gap between law and practice
India's Digital Personal Data Protection Act provides a legal foundation for data protection, including requirements for consent to be free, informed, specific and unambiguous. On paper, these principles directly conflict with the dark pattern behaviors the CCPA advisory seeks to curb.
But the connection between the DPDPA and the CCPA advisory remains weak. The DPDPA has not yet been operationalized through detailed rules, and India's Data Protection Board is still being formed. Moreover, there is no structured coordination mechanism between the consumer protection authority and data protection enforcement bodies. This regulatory fragmentation risks creating overlapping but ineffective oversight.
Without a binding rulemaking process or clear sanctions, the CCPA advisory risks being aspirational. The challenge is not just recognition of the problem, but enforcement and accountability. Most digital platforms will not voluntarily abandon high-conversion interfaces without a compelling compliance incentive, and a request for self-audits provides no such incentive.
Why enforcement is crucial
Experience from the EU and U.S. shows guidance alone does not change industry behavior. Even after the GDPR entered into force, many companies continued using misleading consent banners until the CNIL and other regulators began issuing fines. Self-audit regimes often suffer from a lack of follow-through and weak internal incentives to change.
Moreover, in India's context, there is an urgent need to build consumer trust in digital systems. Dark patterns not only degrade user experience but also erode trust in data practices, product recommendations and consent frameworks. Enforcement isn't just about punishment — it's about ensuring users feel safe navigating the digital economy, especially as the DPDPA is set to be implemented soon.
The way toward a stronger framework
To move from principle to practice, India could consider the following steps:
- Codify the 13 identified dark patterns into enforceable consumer protection rules, with penalties for violations.
- Establish joint oversight mechanisms between the Data Protection Board and CCPA to address overlapping concerns.
- Provide regulators with the authority and budget to conduct audits and investigations.
- Mandate standardized user experience elements for consent and privacy interactions — similar to efforts underway in the EU.
Ultimately, India's approach to digital trust and fairness will be shaped not only by its legislation but also by how seriously it treats implementation.
Conclusion
The CCPA advisory is a welcome signal of intent. It places India in the company of other jurisdictions grappling with how to protect consumers from interface-level manipulation. But intent alone does not change behavior. Without enforceable rules, coordinated oversight and meaningful consequences, dark patterns will continue to shape user decisions in ways that undermine autonomy and fairness.
India has the legal foundations in place, thanks to the DPDPA and the Consumer Protection Act. What it needs now is follow-through.
As global regulators converge on dark patterns as a shared threat to digital fairness, India's next moves will determine whether its digital economy can truly balance innovation with integrity.
Shravan Kalluri is a legal consultant and researcher.