Remember the mail-order compact disc clubs popular in the '90s, where a one-time order automatically turned into a monthly subscription when all you wanted was to listen to a few favorite songs?
Techniques for manipulating or swaying consumers in a certain direction are nothing new. Known today as “dark patterns,” such practices continue to advance in the digital age and are catching the attention of companies, regulators and consumers.
Free trials that automatically turn into paid memberships, programs that force users to navigate multiple steps to cancel, designs that influence users to make certain choices or extra items added to an online shopping cart at checkout are just some examples of dark patterns. Websites or mobile applications may use these practices to manipulate individuals in ways that are profitable for the business but lead consumers to do something they didn’t intend.
“We’re all so addicted to our technology in a way that we weren’t even 10 years ago,” said TV advertising sales and technology company Ampersand’s Chief Privacy Officer, General Counsel Noga Rosenthal, CIPP/E, CIPP/US. “We’ve become more dependent on our technology for dating, for music, everything now is on our devices, even books, news … and companies are getting more sophisticated.”
London-based user experience specialist Harry Brignull coined the term dark patterns in 2010. He was invited to speak at a conference and wasn’t sure the topic would generate enough content for a 20-minute talk. As he dug into the idea, he thought, “I’ve got enough here to make a library.”
And Brignull said that within the last three years, dark patterns have become “a real topic for regulators, enforcers and people who write laws.”
Regulations prioritize valid consumer consent
While dark patterns are not defined within the EU General Data Protection Regulation, the law does prioritize consent, defined as “any freely given, specific, informed and unambiguous indication of the data subject’s wishes ..." The Age Appropriate Design Code published by the U.K. Information Commissioner’s Office prohibits the use of “nudge techniques,” otherwise known as dark patterns, to lead or encourage children to provide unnecessary personal data or weaken their privacy protections.
In the U.S., state legislation is targeting the use of dark patterns. Both the newly enacted Colorado Privacy Act and the California Privacy Rights Act state that user consent obtained through the use of dark patterns is not valid. Similar language was proposed within the Washington Privacy Act 2021, which failed this past April.
Federal lawmakers are also watching closely. The proposed Deceptive Experiences to Online Users Reduction Act and the Setting an American Framework to Ensure Data Access, Transparency, and Accountability Act both include dark patterns provisions. And this spring, the U.S. Federal Trade Commission convened researchers, legal experts, consumer advocates and industry professionals to examine digital dark patterns, their effect on consumers and prevalence in the marketplace, laws and regulations, and what additional efforts may be needed.
Andrew Dale, CIPP/E, CIPP/US, general counsel and chief privacy officer of direct mail platform Alyce, said the FTC’s focus on dark patterns is a good starting point toward addressing consumer impact. The next step is for businesses, particularly larger platforms handling large amounts of consumer data, to evaluate closely how they communicate privacy notices and terms to users.
While current regulations seek to tackle the use of dark patterns, Dale noted there is no federal law directly addressing manipulative practices online or even a concrete definition of the term.
“At this point we are reacting on what feels like a dark pattern and what does not feel like a dark pattern, and I don’t think that’s going to work in the long-term because technology will move lightyears faster than regulation, and so to get a better understanding of what consumers are comfortable with and not comfortable with is a good start, but I don’t think we’re there yet,” he said. “I don’t think we have the framework in which to work.”
Reaching consumers in transparent, accessible ways
More companies are getting creative in the ways they reach consumers when it comes to their privacy, moving beyond long, text-heavy policies, Dale said.
“We’re seeing chat box, video and expandable and collapsible privacy policies. More focused, more clear, more disclosures,” he said.
By putting themselves into consumers’ shoes and prioritizing their expectations as a user, Dale said companies can “mitigate the potential for dark patterns — building yourself into the privacy-by-design framework and building in the expectations of the consumer into that same framework.”
Toyota Motor North America Managing Counsel, Data Privacy and Cybersecurity Jim Simatacolos, CIPP/C, CIPP/E, CIPP/US, said the vehicle manufacturer is guided by what the company calls the “Toyota Way,” a pillar of which is “respect for people.” As part of that, the company strives to reach consumers in a transparent and accessible way, a challenge when it comes to communicating the collection and use of consumer data, particularly as vehicles become increasingly connected.
“We want to do really cool things for research and development and the quality and safety sides of things. We want to make great products that people want to buy that last a long time. That requires data,” Simatacolos said. “At the same time, we want to make sure we are doing it in a way that’s secure and is very transparent to consumers.”
Newly purchased vehicles come equipped with a sticker on the upper center console informing consumers that their vehicle’s data transmission is on, with information on how to deactivate it (by pressing the nearby SOS button) and pointing consumers to the company’s privacy notice website. There, users will find information on what data is collected, how it is used, how long it is stored, who it may be shared with, how it is protected and more. The company also follows up with consumers via email with additional information on connected services.
“What we are trying to do is find one more way to let consumers know, ‘Hey, your car has telematics and is transmitting data. Here’s generally why, here’s how you can find out more in a way you can understand and here’s how you can turn it off,’” Simatacolos said. “We are continually working on this website because as technologies improve there are new ways and things we need to describe. There are also better ways over time to continually improve communication with consumers.”
This summer, Toyota Motor North America also introduced a Data Privacy Portal to its Toyota and Lexus applications, accessible for connected vehicles starting from model year 2013. Consumers can customize privacy and data sharing settings for vehicles with connected services and see what data they may be sharing.
“We’re just trying to give users more options and trying to be as transparent as we can,” Simatacolos said. “How can we on their journey of ownership let them know about not only the features that we have, but if they use these features, what use of their data does that entail? We are always looking for new and better ways to communicate with consumers.”
Preventing dark patterns
Ampersand’s Rosenthal said a company’s ethical obligation to consumers is one of the biggest questions facing the industry. She pointed to a recent experience with an app that offered incentives for physical activity. She read about it in a newspaper article, downloaded the app and registered, and only later learned it collects users’ shopping data.
Rosenthal agreed with Dale that privacy by design is a key component. She also said privacy professionals should be a part of product build-outs, working with technology, engineering, sales and other teams, lending advice, education and guidance along the way “so that we not only catch it, but that hopefully we prevent this from ever happening.”
Electronic Frontier Foundation Designer Shirin Mori said designers should also be a part of the conversation, particularly in the development of legislative proposals as policymakers may not "have a good understanding of what the day-to-day is like for designers on product teams" or how a company operates.
Mori shared the following best practices for avoiding dark patterns: Ensure the steps a user takes to opt out are comparable to steps needed to opt in; avoid confusing and manipulative language, in particular double negatives; provide explicit notice for how data will be used; and weigh user choices equally and visually “so the user understands what they are looking at in terms of where the yes is and where the no is.”
"Someone should not feel guilted into choosing the more privacy-preserving option, or opting out of the more privacy-preserving option," she said.
Mori is also on the team of designers, researchers, legal experts, policy specialists and advocates who this year formed the Dark Patterns Tip Line, where consumers can report instances, with images and screenshots encouraged, in which they’ve felt shamed or tricked, been denied choice or lost privacy, for example. Mori said the group plans to release a report on the kinds of dark patterns submitted and any trends observed.
“Sharing a dark pattern you spotted in the wild helps us learn more about interfaces and designs that impact real people. Your tips will help policymakers and enforcers hold companies accountable for their dishonest and harmful practices,” the website reads.
Time will tell how current and future regulations will target the use of dark patterns and how enforcement actions will unfold. As Brignull said, the goal is "a safer internet where people’s wishes are respected and consent is truly given instead of implied and faked.”
“That’s not a huge ask, is it?”