Everyone agrees that dark patterns are real. Manipulative design is all around us. But identifying practices that rise to the level of illegally deceptive user interface design has remained a matter of some debate.

A new staff report from the U.S. Federal Trade Commission aims to shed light on the contours of dark patterns to help businesses avoid running afoul of the law — and consumer goodwill — in the future. The report is based on a public workshop from April 2021 and includes helpful guidance throughout, along with examples of prior cases where the FTC has dinged companies for manipulating consumers. Most helpfully, Appendix A of the report presents a fascinating and comprehensive taxonomy of digital dark patterns. It is worth noting that the phrase “dark patterns” has been problematized for perpetuating the idea that “dark” means “bad.” But no viable alternative to the phrase has emerged, as explored in this thoughtful blog by Caroline Sinders.

Unfortunately, dark patterns in the privacy context remain relatively commonplace. The FTC focuses on this in Section IV of the report — “Design Elements that Obscure or Subvert Privacy Choices.” Because dark patterns are features of user interfaces, it is natural that, in the privacy context, they generally interfere with the principle of choice.

Even as we move to an era of data privacy less focused on consent, respecting consumer preferences remains core to good privacy practice. In digital systems, consumers have only the privacy choices a company offers, and those choices are meaningful only when they are based on accurate information. When dark patterns subvert preferences by manipulating the appearance or reality of choice, they harm not only individual consumers but the entire idea of meaningful choice in the marketplace.

To avoid the ire of the FTC, not to mention your users, it is important to present choices in clear and unbiased ways. A well-structured up-front notice should ensure consumers are able to understand the risks and benefits of their choices. Rely on clarity and empowerment rather than obfuscation and manipulation. As the FTC puts it, “Consumers should not have to navigate through multiple screens to find privacy settings or have to look for settings buried in a privacy policy or in a company’s terms of service: they should be presented at a time and in a context in which the consumer is making a decision about their data. Any toggle options presented to the consumer should not be ambiguous or confusing, and one option should not be more prominent than another.”
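
For readers who build these interfaces, here is one way the “equal prominence” guidance might translate into code: a minimal TypeScript sketch in which both options share identical styling, neither is preselected, and the prompt appears at the point of decision. The function and type names here are my own illustration, not anything prescribed by the FTC.

```typescript
// A sketch of a symmetric consent prompt. "renderConsentPrompt" and
// "ConsentDecision" are hypothetical names for illustration only.

type ConsentDecision = "accepted" | "declined";

function renderConsentPrompt(
  container: HTMLElement,
  onDecision: (decision: ConsentDecision) => void
): void {
  const prompt = document.createElement("div");

  // Plain-language explanation shown at the moment of the data decision,
  // not buried in a privacy policy or terms of service.
  const explanation = document.createElement("p");
  explanation.textContent =
    "We would like to use your activity data to personalize content. " +
    "You can change this choice at any time in Settings.";
  prompt.appendChild(explanation);

  // One shared style for both buttons: same size, color and typography,
  // so neither option is more prominent than the other.
  const sharedStyle = "padding: 8px 16px; margin: 4px; font: inherit;";

  for (const [label, decision] of [
    ["Accept", "accepted"],
    ["Decline", "declined"],
  ] as const) {
    const button = document.createElement("button");
    button.textContent = label;
    button.setAttribute("style", sharedStyle); // identical prominence
    button.addEventListener("click", () => {
      container.removeChild(prompt);
      onDecision(decision);
    });
    prompt.appendChild(button);
  }

  container.appendChild(prompt);
}
```

Rendering both options from a single loop with one shared style is a small structural guardrail: the code cannot quietly give “Accept” a bigger button or a brighter color than “Decline.”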

One general tip from the FTC for UI designers: Walk a mile in the user’s shoes. This echoes a common approach known as human-centered design. “Businesses should take a moment to assess their user interfaces from a consumer’s perspective and consider whether another option might increase the likelihood that a consumer’s choice will be respected and implemented.”

Here’s what else I’m thinking about:

  • There is “very good progress in the discussions on safe data flows between the EU and the U.S.” according to a tweet from EU Commissioner for Justice Didier Reynders, in which he thanked U.S. Commerce Secretary Gina Raimondo for her leadership and the teams on both sides for their hard work. The tweet included a photo of the two principals meeting today and some cause for hope on the future of EU-U.S. data flows: “We should be able to move soon to next steps.”
  • Surveillance reform via public comment. The Privacy and Civil Liberties Oversight Board is seeking comments on Section 702 of the Foreign Intelligence Surveillance Act. Comments can address questions PCLOB should explore and “recommendations it should consider making, in connection with its oversight project to examine the surveillance program.” The Section 702 authority comes up for renewal in just over a year.
  • Kids’ codes are here to stay. The D.C. policy community is belatedly turning its attention to California’s Age-Appropriate Design Code Act, which was signed into law by Gov. Gavin Newsom, D-Calif., yesterday. Coming into effect in July 2024, the law will apply to any online service, product or feature that is “likely to be accessed” by youth under 18 and is operated by a business covered under the California Consumer Privacy Act. For some thoughtful commentary on the purposes and privacy impacts of the legislation, check out posts by Amelia Vance and Jennifer King.
  • Privacy issues in the metaverse were front and center at a panel discussion hosted by the Information Technology and Innovation Foundation and the XR Association, available to view here. Panelists discussed how privacy best practices interface with the worlds of augmented reality and virtual reality, including some of the unique considerations of these systems, such as eye-tracking and object identification. There is much still to learn about how to incorporate privacy-by-design into these new technologies.

Upcoming happenings

  • Sept. 20 at 3 p.m. EDT, IAPP’s D.C. KnowledgeNet will host a virtual panel on Consent and Preference Management, in coordination with the Connecticut and Phoenix KnowledgeNets.
  • Sept. 21 at 7 p.m. EDT, the Electronic Privacy Information Center (EPIC) will host its annual Champions of Freedom Awards ceremony at District Winery.
  • Sept. 28 at 11 a.m. EDT, the Georgetown Institute for Technology Law and Policy will host a virtual symposium on AI Classification Frameworks and AI Accidents, part of its AI Governance Virtual Symposium Series.

Please send feedback, updates and dark thoughts to cobun@iapp.org.