Privacy and data protection regulators and lawmakers are increasingly focusing their attention on how so-called “dark patterns” in technical design affect user choice, privacy, data protection and commerce.
U.S. Federal Trade Commissioner Rohit Chopra recently defined dark patterns as “design features used to deceive, steer, or manipulate users into behavior that is profitable for an online service, but often harmful to users or contrary to their intent.” Woodrow Hartzog, professor of law and computer science at Northeastern University, argues in his book “Privacy’s Blueprint” that the way user interfaces are designed plays a key role in eroding users’ privacy. He writes that privacy law needs to account for how large a role design plays in privacy, noting privacy laws should “be careful to broach design in a way that is flexible and not unduly constraining” while also setting “boundaries and goals for technological design.”
Lawmakers and regulators are starting to address the use of dark patterns directly, focusing on how user interfaces are designed and their impact on consumer consent.
US FTC
Recently, the U.S. Federal Trade Commission signaled it is giving serious attention to businesses’ use of dark patterns. It issued a complaint against Age of Learning over its use of dark patterns in its ABC Mouse service. The FTC alleged ABC Mouse made cancellation of recurring subscription fees difficult for tens of thousands of customers despite promising “Easy Cancellation.”
Chopra issued a statement accompanying the complaint that referred to ABC Mouse as a “roach motel,” a term coined by Harry Brignull (also the first to use the phrase “dark pattern”), “where it is easy to get in, but almost impossible to escape.” Chopra listed examples of common roach motel tactics, stating “these digital traps can come in the form of forced continuity programs that make it difficult to cancel charges, trick questions to frustrate user choice, or through free trials that automatically convert into paid memberships.”
Chopra described how ABC Mouse deliberately designed its service to be hard to cancel. The cancellation link was “buried” in the website, and “families who tried to cancel through the website were forced to click through a labyrinth of pages urging them not to cancel.” The FTC complaint alleged ABC Mouse used a negative option feature, which exists when, in “an offer or agreement to sell or provide any goods or services, [there is] a provision under which the consumer’s silence or failure to take an affirmative action to reject goods or services or to cancel the agreement is interpreted by the seller as acceptance of the offer.”
This matter is now pending court approval of a proposed settlement. Terms of the stipulated final order include (1) prohibiting ABC Mouse from misrepresenting negative options; (2) requiring certain disclosures relating to negative options; (3) requiring express informed consent for a negative option; (4) requiring a simple mechanism for consumers to cancel negative option features; (5) imposing a $10 million fine; and (6) requiring ABC Mouse to maintain, for 20 years, all records necessary to demonstrate compliance with the order.
Chopra further stated, “the agency must deploy these tools to go after large firms that make millions, or even billions, through tricking and trapping users through dark patterns.” He went on to conclude the FTC “needs to methodically use all of our tools to shine a light on unlawful digital dark patterns, and we need to contain the spread of this popular, profitable, and problematic business practice.” This seems to signal the FTC will be closely monitoring companies’ dark pattern use.
Federal legislation
Several recent bills introduced in the U.S. Senate have included provisions designed to curb the use of dark patterns. The Deceptive Experiences to Online Users Reduction Act, introduced April 9, 2019, by Sens. Mark Warner, D-Va., and Deb Fischer, R-Neb., was aimed at using the powers of the FTC to limit the ability of online providers to use dark patterns to influence consumer behavior online.
The DETOUR Act stated in Section 3: “It shall be unlawful for any large online operator ... to design, modify, or manipulate a user interface with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data.”
This bill also would have allowed for the creation of an association of large online operators (online services with more than 100,000,000 users in a 30-day period) that could create rules and regulations for its members, as long as they were in line with the DETOUR Act and FTC policies.
More recently, Sen. Roger Wicker, R-Miss., introduced the Setting an American Framework to Ensure Data Access, Transparency, and Accountability Act, which incorporates the DETOUR Act into Section 206. The SAFE DATA Act prohibits manipulating a user interface to compel compulsive usage, including auto-play, on sites directed at users under the age of 13.
CCPA and CPRA
The California Consumer Privacy Act gives Californians control over their data and regulates what businesses can do with that data. One of the key tenets of the CCPA is the need for transparency and meaningful consumer consent.
On Oct. 12, 2020, the California attorney general announced a new round of proposed modifications to the CCPA. Included in these proposed regulations is a new provision (Section 999.315(h)) limiting the number of steps it takes for a consumer to opt out of the sale of personal information to no more than the number of steps it takes for the consumer to opt in to the sale. Additionally, the proposed regulations would prohibit a business from using confusing language when the consumer tries to opt out (999.315(h)(2)). The regulations also prevent the business from making the consumer read or listen to a list of reasons not to opt out while trying to opt out (999.315(h)(3)).
On Oct. 28, 2020, a group of seven leading advertising and marketing trade associations submitted comments on the proposed regulations, specifically highlighting proposed Section 999.315(h)(3), the provision that prevents businesses from making the consumer read or listen to a list of reasons not to opt out. They claimed that “as a result of the proposed modifications, consumers’ receipt of factual, critical information about the nature of the ad-supported Internet would be unduly hindered, thereby undermining a consumer’s ability to make an informed decision.” The comments further argued the proposed regulations would run afoul of the First and 14th Amendments and requested the proposed regulation be changed.
In addition to the proposed regulations, the California Privacy Rights Act ballot initiative (Proposition 24) addresses dark patterns directly, defining them as user interfaces designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making or choice, and providing that agreement obtained through the use of dark patterns does not constitute consent.
GDPR
Under the EU General Data Protection Regulation, valid consent must be freely given, specific, informed and unambiguous. France’s data protection authority, the CNIL, recently released a report discussing the importance of user interface design to user empowerment. Echoing Hartzog, the CNIL stressed that design is critical to help protect privacy. The report also discussed how consent gathered using dark patterns does not qualify as valid, freely given consent, stating, “the fact of using and abusing a strategy to divert attention or dark patterns can lead to invalidating consent.”
In late 2019, researchers published a paper finding that, out of a sample of 1,000 German websites, 57.4% used a dark pattern known as “nudging” to push users toward giving consent. This finding was consistent with prior work showing that the design of choices on a website heavily influences people’s decisions when navigating it.
And, in January 2020, researchers conducted a study of dark patterns post-GDPR, focusing on consent management platforms (CMPs), and found the widespread use of dark patterns “illustrates the extent to which illegal practices prevail, with vendors of CMPs turning a blind eye to — or worse, incentivizing — clearly illegal configurations of their systems.” They also found “enforcement in this area is sorely lacking” and provided a tool data protection authorities could use to help enforce the requirement that consent be freely given.
Much like the proposed U.S. legislation and CPRA, the discussion surrounding the GDPR and dark patterns will likely continue to focus on how dark patterns influence a user’s consent.
Conclusion
As companies design customer and user interfaces and craft their consent requests, they should carefully consider the attention being given to the issue of dark patterns by regulators, lawmakers and researchers.