“Privacy means a lot of things. But it is essentially about dignity, respect, and equity. These are all things we want regardless of our identity.”
I was struck by these closing remarks from Under Armour Head of Global Privacy Ami Rodrigues, CIPP/E, CIPP/US, CIPM, FIP, PLS, during a panel last week on inclusive privacy design at the IAPP Privacy. Security. Risk. conference. The panel highlighted the importance of enshrining design processes that consider privacy from the perspectives of “the least among us,” as LGBT Technology Partnership and Institute Deputy Director and General Counsel Carlos Gutierrez put it. His remarks focused on the need to individually consider the most marginalized communities, especially people who don’t otherwise have a seat at the table in the design process. My takeaway from the panel was a reminder that we need to listen to users of all types and evolve our practices to meet the particular privacy risks that may fall on individuals because of their characteristics, abilities, lifestyle or identity.
We are entering an era of integrated privacy processes that go a step deeper than “privacy by design.” Rather than designing products and services for the “average” consumer, we should be asking our teams to consider the unique burdens carried by marginalized and minoritized communities and other vulnerable groups. Not in general terms, but by considering individual categories of users and engaging with them to better understand their needs. Specifically, this means considering the ways different types of users can be harmed by design choices, by other users misusing features or harming one another, and by governments misusing or improperly leveraging data.
The shift toward inclusive privacy is apparent in laws like California’s Age-Appropriate Design Code, which requires organizations to design for young users in ways that mitigate the harms they face when using online services. As we explored previously, the AADC includes rules that mandate protective default settings, data minimization and expanded impact assessments. Duties of care and loyalty are also included, explicitly placing the onus on the organization to consider the best interests of users, not on average, but in all their complexity. (Well, at least in this context, limited to what is appropriate for their age.) Many of these ideas are also echoed in proposed federal legislation like the Kids Online Safety Act, which has drawn bipartisan support in the Senate.
There are also echoes of these themes in the White House’s Blueprint for an AI Bill of Rights. Though it focuses on “automated systems,” particularly algorithms that support decision-making processes, the blueprint incorporates language calling for inclusive design principles and pre-consideration of the best interests of users: “Systems should not employ user experience and design decisions that obfuscate user choice or burden users with defaults that are privacy invasive.” Unusual for an executive branch blueprint, the document also takes pains to go from principle to practice in the section on data privacy.
Time will tell whether regulators continue to encourage privacy teams to build demographic-specific considerations into their design processes. In the meantime, it is clear that the more privacy professionals can bring these ideas to bear, the more users will feel respected and heard. As Rodrigues put it, “Privacy pros hold a special role in organizations to advocate for users.”
Here's what else I’m thinking about:
- Analyses are trickling in about the White House Executive Order on Enhancing Safeguards for United States Signals Intelligence Activities. IAPP Vice President and Chief Knowledge Officer Caitlin Fennessy, CIPP/US, provided a cogent overview along with details on what comes next. For deeper scrutiny of the contours of the new redress mechanisms, the IAPP also published a reaction from prominent scholars Théodore Christakis, Kenneth Propp and Peter Swire, CIPP/US. The Hogan Lovells blog has a helpful breakdown too.
- The U.S. Federal Trade Commission’s extension of the comment period for its Advance Notice of Proposed Rulemaking caused an audible collective sigh of relief from the policy community in D.C. Privacy professionals should consider adding their organization’s perspectives to the mix as the FTC works to separate the digital wheat from the commercial surveillance chaff. For inspiration before the Nov. 21 deadline, check out some of the early submitters on Regulations.gov, including TechFreedom and the Chamber of Commerce.
- The FTC workshop on protecting kids from stealth advertising in digital media also surfaced privacy issues: when the lines between organic content and advertising are blurred, kids and teens may be more likely to share personal information with third parties.
- Where is recently departed FTC Commissioner Noah Phillips? Naturally, he joined Cravath, Swaine & Moore, a law firm with a robust antitrust practice, which recently signed a lease to expand its presence in D.C.
- There’s a new “action tank” in town, the Council for Responsible Social Media. Started by the nonpartisan group Issue One, the group is focusing on mental health, divisiveness and national security. The Washington Post reports the council could rally around existing proposed legislation like the Kids Online Safety Act.
- Google’s update to its ad preferences system could shake things up. This week the company began rolling out a new dashboard called My Ad Center, which will replace all prior ad choices interfaces. It’s a notable enhancement to user choice, with a lot of granular controls. It may be a good example of purpose limitation too, as users can now limit ad activity based on browsing history, for example, without stopping collection of the data for other purposes. Also notable is the inclusion of a “sensitive” tab that lets users opt out of certain ads related to common addictions (alcohol, gambling) as well as “pregnancy and parenting” ads.
Under scrutiny
- Ambiguities in the Children's Online Privacy Protection Act are explored in an article by M. Sean Royall in the law journal Antitrust, which calls for the FTC to issue clear guidance rather than relying on the unofficial COPPA FAQs.
- “Luxury surveillance” is the subject of an Atlantic article that problematizes the ubiquity of tracking devices, with widespread adoption fueled by companies like Amazon.
- Life in the metaverse is profiled in the way only Kashmir Hill can in her report on spending a day in Meta’s Horizon Worlds to find out who was there and “whether the rest of us would ever want to join them.”
Upcoming happenings
- Oct. 27 at noon EDT, the IAPP D.C. KnowledgeNet will host a discussion on Youth Privacy: Lessons from gaming and beyond (virtual).
Please send feedback, updates and blueprints to cobun@iapp.org.