Happy April Fools’ Day, U.S. Digest readers! I’m thrilled to pass another milestone on our way to spring, but I’m not one for pranks, so you can be sure there’s no joking around in this letter.
This week, two pre-rulemaking informational sessions with the California Privacy Protection Agency Board highlighted topics we could see becoming important as the agency works to finalize California Privacy Rights Act regulations. Experts, officials and academics covered a wide range of topics — including the data flow of personal information, dark patterns, and automated decision-making — intended to help the board, agency staff and public gain insight that could be relevant in the rulemaking process.
CPPA Executive Director Ashkan Soltani was among speakers discussing the types of data flows consumers might encounter as they move through their daily lives, how their information can be shared and what it can be used for, and how individuals can be identified on the internet. Additional speakers Stanford Institute for Human-Centered Artificial Intelligence Privacy and Data Policy Fellow Jennifer King and University of Chicago Sidley Austin Professor of Law Lior Strahilevitz outlined the range of manipulative dark patterns online and their impact on users.
In her review, King told the board it needs to consider manipulative practices intended to influence users’ decisions, as the current scope of the CPRA is framed “tightly around consent” and there’s “an opportunity to rethink consent standards.”
“Within the privacy space there’s ways to identify things outside of consent where we see personal data being used to influence your decisions,” King said. “This is something the agency needs to consider … and you need a way to connect with the public in order to receive complaints, suggestions, or reports of dark patterns.”
Safiya Noble, professor and co-founder of the University of California, Los Angeles Center for Critical Internet Inquiry, who presented an overview of data processing and automated decision-making, said California “is heading in the right direction in creating a robust privacy act that protects consumers’ rights” and told the board it has “a chance to do something groundbreaking.”
But, she said, she wanted to “disavow” the board of any belief that technology and tech platforms are “neutral.”
“They simply are not. They are human-made and they reflect our society, therefore the rules you are writing cannot be neutral either,” Noble said. She offered specific questions the board might consider, such as how to define the meaningful information companies must share about their technology, how “deep” California rules should go in supporting “people in moving toward a more just and equitable world,” and how automated decision-making systems can be used to cultivate equity and justice.
In public comment, NetChoice Policy Counsel Jennifer Huddleston asked that the board be “cautious of overly expansive legislation that would damage technology” and “carefully consider the impacts on beneficial uses as well as attempts to address any concerns,” while Electronic Privacy Information Center Counsel Ben Winters urged the board to adopt a “broad rights-enhancing definition” of automated decision-making technologies and “make it as easy as possible for individuals to opt out of such systems.”
Coming up next in the board’s pre-rulemaking process are stakeholder sessions, Chair Jennifer Urban said, adding that the board welcomes input grounded in stakeholders’ experience and expertise. Combined with this week’s sessions, she said, “These perspectives will be very helpful in understanding the backdrop of our potential regulations.”
Then, formal rulemaking proceedings are expected in the next few months, according to Soltani, with the process expected to reach completion later in the year.
We’ll continue following along and keeping you informed.