“Measure and symmetry are beauty and virtue all the world over.” — Socrates, in Plato’s Philebus.

Whether or not we accept symmetry as a fundamental measure by which to judge what is good and right in the world, we cannot deny its importance in art and science alike.

Symmetry is a central theme of artistic expression. Artists use symmetry to create balance and stability in their works, and disrupt it to evoke tension.

Whatever the underlying reason, humans have a universal aesthetic preference for symmetry.

As it turns out, regulators also have a strong preference for symmetry. It is a clear theme in the enforcement advisory on dark patterns released yesterday by the California Privacy Protection Agency’s enforcement division.

The agency’s warning does not provide any new guidance — it is arguably not allowed to — but it does serve to remind businesses of the obligations under California law to meet a high standard of consent and, in doing so, to avoid relying on design features that nudge consumers away from privacy-protective choices.

Although the literature and regulatory guidance on dark patterns are more expansive, the CPPA is naturally concerned with dark patterns only in relation to privacy choices. Specifically, the enforcement advisory focuses on legally mandated touchpoints with consumers, including “methods for submitting (California Consumer Privacy Act) requests and obtaining consumer consent.”

Themes of symmetry are sprinkled throughout the advisory, including in a list of questions businesses should consider asking themselves when implementing privacy design choices: “Is the consumer’s path to saying ‘no’ longer than the path to saying ‘yes’? Does the user interface make it more difficult to say ‘no’ rather than ‘yes’ to the requested use of personal information?”

Symmetry in the architecture of choice is an idea worth expanding on, not least because it provides a more satisfying and visceral framing than the often vague and controversial idea of “dark patterns.”

California’s implementing regulations expound on the concept of symmetry in choice: “The path for a consumer to exercise a more privacy-protective option shall not be longer or more difficult or time-consuming than the path to exercise a less privacy-protective option because that would impair or interfere with the consumer’s ability to make a choice.”

Somewhat oddly, given that California law does not require opt-in consent for the use of tracking technologies, the CPPA illustrates its enforcement advisory with examples of cookie consent boxes. See pages 3 and 4.

The examples immediately sparked vociferous debate among privacy professionals on LinkedIn. The general consensus seemed to be that none of the CPPA’s example consent boxes fully achieved the stated goals of symmetry and clarity. Perhaps that was the point.

Yet it is impossible to fully assess the adequacy of a choice mechanism without understanding the underlying architecture. Default settings matter. Each of the pictured choice mechanisms could exist on a website where cookies were placed before the user took action just as easily as it could exist on a website where nothing happens until a user chooses one way or another.

In an opt-out scenario, true symmetry cannot exist until the point where consumers decide to take action. At that point, nudges or extra clicks will likely be judged just as harshly by regulators as they are in the opt-in scenario. Until that point, the default will always be frictionless in comparison with every other possible setting.

This brings to mind the evolving literature on the role consent should play in privacy law. Consumer choice has always existed on a continuum, with opt-out mechanisms on one end and freely given, specific, informed, and unambiguous consent on the other. At the extreme end of the ambiguity scale are forced-choice mechanisms — forks in the road where users are required to make a decision one way or the other before accessing a system.

In a recent provocation titled “Voting for Consent,” Georgetown professors Paul Ohm and Meg Leta Jones wrote about forced-choice consent as a healthy contributor to digital governance that truly reflects consumer choices. They compare the consequential Apple iOS forced choice between “Ask App Not to Track” and “Allow” to the function of direct democracy, asking whether academics should embrace such consent mechanisms as a means by which firms can earn the privacy votes of users after earning their trust.

Perhaps, in an imperfect world, user consent, however flawed, is the best of all possible evils. “Unlike other areas of life, we shouldn’t keep consent around because it’s the only morally justifiable way to arrange data relationships,” they wrote. “We should keep it around because it’s a very powerful and simple tool when everything else breaks.”

Jones builds on this idea in her new book, “The Character of Consent,” which uses a historical-philosophy lens in the style of Ian Hacking to explore the “computer characters” we imagine when we design systems of choice and the laws that mandate them. The book asks who it is we expect to be consenting — data subjects, users, or “privacy consumers” — and shows how the rise of each of these three legal concepts has contributed to our modern world of ubiquitous pop-ups and the “phony performance of privacy.”

Through this lens, Jones raises questions about the computer character we should consider designing for in the future. Should we design choice architectures that treat individuals as well-informed citizens of the digital world or as simple, persuadable lemmings?

Whether we are moving toward building a world with more choice mechanisms or fewer, when we do offer consent, we should do it the right way. To that end, embracing symmetry may be one way of prioritizing true user autonomy.

Please send feedback, updates and symmetrical choice mechanisms to cobun@iapp.org.

Cobun Zweifel-Keegan, CIPP/US, CIPM, is the managing director in Washington, D.C., for the IAPP.