
The Privacy Advisor | A digital world without dark patterns: ‘That’s not a huge ask, is it?’


Remember the mail-order compact discs popular in the '90s that automatically turned into a monthly subscription, when all you wanted to do was listen to a few favorite songs?

Ways to manipulate or sway consumers in a certain direction are nothing new; they’ve been around for years. Known today as “dark patterns,” such practices continue to advance in the digital age and are catching the attention of companies, regulators and consumers.

Free trials that automatically turn into paid memberships, cancellation flows that force users through multiple steps, designs that nudge users toward certain choices, and extra items added to an online shopping cart at checkout are just some examples of dark patterns. Websites and mobile applications may use these practices to steer individuals in ways that may be profitable but lead consumers to do something they didn’t intend.

“We’re all so addicted to our technology in a way that we weren’t even 10 years ago,” said Noga Rosenthal, CIPP/E, CIPP/US, chief privacy officer and general counsel of TV advertising sales and technology company Ampersand. “We’ve become more dependent on our technology for dating, for music, everything now is on our devices, even books, news … and companies are getting more sophisticated.”

London-based user experience specialist Harry Brignull coined the term dark patterns in 2010. He was invited to speak at a conference and wasn’t sure the topic would generate enough content for a 20-minute talk. As he dug into the idea, he thought, “I’ve got enough here to make a library.”

And within the last three years, Brignull said dark patterns have become “a real topic for regulators, enforcers and people who write laws.”

Regulations prioritize valid consumer consent

While dark patterns are not defined within the EU General Data Protection Regulation, the law does prioritize consent, defined as “any freely given, specific, informed and unambiguous indication of the data subject’s wishes ..." The Age Appropriate Design Code published by the U.K. Information Commissioner’s Office prohibits the use of “nudge techniques,” otherwise known as dark patterns, to lead or encourage children to provide unnecessary personal data or weaken their privacy protections.

In the U.S., state legislation is targeting the use of dark patterns. Both the newly enacted Colorado Privacy Act and the California Privacy Rights Act state that user consent obtained through the use of dark patterns is not valid. Similar language was proposed within the Washington Privacy Act 2021, which failed this past April.

Federal lawmakers are also watching closely. The proposed Deceptive Experiences to Online Users Reduction Act and the Setting an American Framework to Ensure Data Access, Transparency, and Accountability Act both include dark patterns provisions. And this spring, the U.S. Federal Trade Commission convened researchers, legal experts, consumer advocates and industry professionals to examine digital dark patterns, their effect on consumers and prevalence in the marketplace, laws and regulations, and what additional efforts may be needed.

Andrew Dale, CIPP/E, CIPP/US, general counsel and chief privacy officer of direct mail platform Alyce, said the FTC’s focus on dark patterns is a good starting point for addressing consumer impact. The next step is for businesses, particularly larger platforms with exposure to large amounts of data and consumers, to closely evaluate how they communicate privacy notices and terms to users.

While current regulations seek to tackle the use of dark patterns, Dale noted there is no federal law directly addressing manipulative practices online or even a concrete definition of the term.

“At this point we are reacting on what feels like a dark pattern and what does not feel like a dark pattern, and I don’t think that’s going to work in the long-term because technology will move lightyears faster than regulation, and so to get a better understanding of what consumers are comfortable with and not comfortable with is a good start, but I don’t think we’re there yet,” he said. “I don’t think we have the framework in which to work.”

Reaching consumers in transparent, accessible ways

More companies are getting creative in the ways they reach consumers when it comes to their privacy, moving beyond long, text-heavy policies, Dale said.

“We’re seeing chat box, video and expandable and collapsible privacy policies. More focused, more clear, more disclosures,” he said.

By putting themselves into consumers’ shoes and prioritizing their expectations as a user, Dale said companies can “mitigate the potential for dark patterns — building yourself into the privacy-by-design framework and building in the expectations of the consumer into that same framework.”

Toyota Motor North America Managing Counsel, Data Privacy and Cybersecurity Jim Simatacolos, CIPP/C, CIPP/E, CIPP/US, said the vehicle manufacturer is guided by what the company calls the “Toyota Way,” a pillar of which is “respect for people.” As part of that, the company strives to reach consumers in a transparent and accessible way, a challenge when it comes to communicating the collection and use of consumer data, particularly as vehicles become increasingly connected.

“We want to do really cool things for research and development and the quality and safety sides of things. We want to make great products that people want to buy that last a long time. That requires data,” Simatacolos said. “At the same time, we want to make sure we are doing it in a way that’s secure and is very transparent to consumers.”

Newly purchased vehicles come equipped with a sticker on the upper center console informing consumers that their vehicle’s data transmission is on, with information on how to deactivate it (by pressing the nearby SOS button) and pointing consumers to the company’s privacy notice website. There, users will find information on what data is collected, how it is used, how long it is stored, who it may be shared with, how it is protected and more. The company also follows up with consumers via email with additional information on connected services.

“What we are trying to do is find one more way to let consumers know, ‘Hey, your car has telematics and is transmitting data. Here’s generally why, here’s how you can find out more in a way you can understand and here’s how you can turn it off,’” Simatacolos said. “We are continually working on this website because as technologies improve there are new ways and things we need to describe. There are also better ways over time to continually improve communication with consumers.”

This summer, Toyota Motor North America also introduced a Data Privacy Portal to its Toyota and Lexus applications, accessible for connected vehicles starting from model year 2013. Consumers can customize privacy and data sharing settings for vehicles with connected services and see what data they may be sharing.

“We’re just trying to give users more options and trying to be as transparent as we can,” Simatacolos said. “How can we on their journey of ownership let them know about not only the features that we have, but if they use these features, what use of their data does that entail? We are always looking for new and better ways to communicate with consumers.”

Preventing dark patterns

Ampersand’s Rosenthal said a company’s ethical obligation to consumers is one of the biggest questions facing the industry. She pointed to a recent experience with an app that offered incentives for physical activity. She read about it in a newspaper article, downloaded and registered, and later learned it collects users’ shopping data.

“They gave me notice, a link to the privacy policy, and my job as a consumer was to read it. But this is so far away from my expectation of what was happening to my data that maybe they did have an obligation to say here are the top things we do with your data, whatever it is,” Rosenthal said. “As a company is that really my obligation? I’m almost babying the user and maybe that’s not my job. But in the same vein, when do I step so far away from what the user’s expectation is that I’m almost tricking them?”

Rosenthal agreed with Dale that privacy by design is a key component. She also said privacy professionals should be part of product build-outs, working with technology, engineering, sales and other teams, lending advice, education and guidance along the way “so that we not only catch it, but that hopefully we prevent this from ever happening.”

Electronic Frontier Foundation Designer Shirin Mori said designers should also be a part of the conversation, particularly in the development of legislative proposals as policymakers may not "have a good understanding of what the day-to-day is like for designers on product teams" or how a company operates.

Mori shared the following best practices for avoiding dark patterns: Ensure the steps a user takes to opt out are comparable to steps needed to opt in; avoid confusing and manipulative language, in particular double negatives; provide explicit notice for how data will be used; and weigh user choices equally and visually “so the user understands what they are looking at in terms of where the yes is and where the no is.”

"Someone should not feel guilted into choosing the more privacy-preserving option, or opting out of the more privacy-preserving option," she said.

Mori is also on the team of designers, researchers, legal experts, policy specialists and advocates who this year formed the Dark Patterns Tip Line where consumers can report instances — with images and screenshots encouraged — when they’ve felt shamed or tricked, were denied choice or lost privacy, for example. Mori said the group plans to release a report on the kinds of dark patterns submitted and any trends observed.

“Sharing a dark pattern you spotted in the wild helps us learn more about interfaces and designs that impact real people. Your tips will help policymakers and enforcers hold companies accountable for their dishonest and harmful practices,” the website reads.

Time will tell how current and future regulations will target the use of dark patterns and how enforcement actions will unfold. As Brignull said, the goal is "a safer internet where people’s wishes are respected and consent is truly given instead of implied and faked.”

“That’s not a huge ask, is it?”

Photo by Samuel Scalzo on Unsplash



1 Comment


  • R. Jason Cronk • Aug 25, 2021
    Rosenthal's take is exactly what's wrong. She said "They gave me notice... my job as a consumer was to read it." No, that's not the consumer's job. A consumer is not employed to read privacy notices. They aren't getting paid, nor is the time investment worth it.
    
    She further said "As a company is that really my obligation? I’m almost babying the user and maybe that’s not my job." Literally, as CPO, that is your job, or rather your job to ensure that somebody at the company is explaining risks in a "concise, transparent, intelligible and easily accessible form" (Article 12 GDPR). The average American reads at a 10th grade level. That means if you write at a 10th grade level, you're excluding half of American consumers from understanding what you're trying to convey. It's not about babying them but providing them actionable information consistent with their level of understanding and experience.
    
    I teach in my courses that three of the core justifications for a company designing with privacy in mind have to do with asymmetries between the company and the individual. Two of them, information and time asymmetry (what Hartzog calls the bandwidth problem), have to do with organizations being in a better position to deal with the privacy risks they create and being in a better position to explain those risks to individuals. This ties back to the point that the consumer's "job" isn't to read privacy notices. I'd be willing to give Rosenthal the benefit of the doubt that the quotes might have been taken slightly out of context and she meant to say, essentially, that providing notice is not the answer, but designing the app to preclude the need for notice is. Ultimately, as the article properly implies, the referenced physical activity app should not be tracking shopping habits.