
A view from DC: Is your privacy notice stuck in the '90s?


When the California attorney general's office announced a privacy settlement with DoorDash this week, the press release raised eyebrows for referencing an old California law most privacy professionals don't think much about these days. The California Online Privacy Protection Act of 2003, known as CalOPPA, was a cutting-edge law for its time that has largely been eclipsed by modern data privacy rules — namely the California Consumer Privacy Act.

So, why is this decades-old statute part of this 2024 enforcement action?

With the continuous commotion raised by new state privacy laws across the U.S., it is easy to forget how far data privacy standards have come. There was a time, not so long ago, when even the basic practice of posting a privacy notice on a website was voluntary — and a lot less complicated.

For example, in 1999, Google's first public privacy policy ran to just 609 words, an order of magnitude shorter and less complex than the current version, and even shorter than this column. Many of the changes across the company's 46 versions of that policy reflect the maturation of external privacy policies as a way of providing users with basic notice about a company's practices, as well as the ever more complex world of personal data, online advertising and digital services.

But a large percentage of the content of Google's privacy notice, like DoorDash's and every other public policy out there, is driven by explicit regulatory requirements. Early in their history, external-facing privacy policies became a tool for communicating with privacy oversight bodies. Arguably, regulators became the primary audience for these increasingly formalized privacy policies, while separate practices matured around layered notices and privacy portals designed to give individuals a clearer understanding of their privacy rights and choices.

CalOPPA is notable because it was the primary legislative vehicle that drove global adoption, at least on the English-speaking internet, of detailed public privacy policies. Why? Simply put, because the law suddenly required any company operating a commercial website that collects personal information from California residents to post a public privacy policy with many of the baseline disclosures about the collection, use and sharing of personal data that we are familiar with today. Failure to do so is a direct violation of CalOPPA, as is a posted notice that does not accurately reflect the company's actual privacy practices.

Requiring detailed posted privacy notices was a game changer, not because it led to more informed consumers, but because it provided a hook by which regulators could enforce privacy promises via general consumer protection laws like the Federal Trade Commission Act.

Section 5 of the FTC Act prohibits unfair or deceptive acts or practices. One of the ways the U.S. Federal Trade Commission can prove a website operator has engaged in a deceptive act or practice is by showing that the operator's privacy notice was false, misleading or incomplete. At a basic level, if a website operator claims in its privacy notice that it does not sell or share personal information with third parties, but in fact does so, the FTC can sue the operator for deceiving consumers and violating Section 5. Simple misrepresentations, and even omissions, are also potential violations of the deception prong of Section 5. All 50 states have similar consumer protection laws, known collectively as UDAP statutes.

This legal path undergirded the first two decades of privacy enforcement in the U.S. As privacy practices matured, the FTC also relied on other types of deception, and increasingly on theories of unfairness.

Even as California modernized its privacy law, CalOPPA remained in effect. When the California attorney general investigated DoorDash's privacy practices, including its alleged sharing of personal data with a marketing cooperative, CalOPPA was the relevant hook for basic omissions from the company's privacy notice.

The major takeaway of the DoorDash case under the CCPA is that disclosing personal data to a marketing cooperative, even in exchange for services rather than money, counts as a "sale" that must be disclosed, along with an opportunity for consumers to opt out.

Under CalOPPA, the takeaway is even more basic. If your company shares personal data with third parties, even as part of a collective arrangement, this fact must be disclosed in your privacy notice. As the attorney general alleges, during the relevant time period, the company's privacy notice never explained "that other businesses — like marketing co-op members — could contact DoorDash customers with advertisements for their businesses."

The case serves as a good reminder of privacy fundamentals, including the basic lesson to "think like a regulator" when writing and reviewing your privacy notice. Material omissions are not just a problem under laws like CalOPPA that require specific disclosures. The FTC has brought numerous privacy cases based on omissions, including the Goldenshores flashlight app case.

In the artificial intelligence age, as consumer disclosures catch up with new regulatory expectations, it is important to think even harder about potential gaps between practices and promises. For example, some of Google's most recent updates to its public privacy policy, which the company helpfully archives so changes can be tracked over time, relate to the company's use of personal data for AI systems.

Last month, the FTC issued an explicit warning to AI developers about misrepresentations and omissions related to personal and confidential data. Pointing to its Everalbum case, the FTC took pains to mention that "what a company fails to disclose to customers may be just as significant as what it promises. The FTC can and does bring actions against companies that omit material facts that would affect whether customers buy a particular product — for example, how a company collects and uses data from customers." As consumers become increasingly concerned about the use of their data for training models, and other possible AI harms, updates to disclosures will be needed.

Writing detailed privacy disclosures may be an old idea, but spotting omissions is an exercise in constant vigilance.

Here's what else I'm thinking about:

  • Signs point to… unfair surveillance advertising. With a major consent order against the security software firm Avast this week, the FTC is continuing to build its case that some commercial activities related to consumers' web browsing behavior are unfair under Section 5 of the FTC Act. Not only should this put individual firms on notice, especially when remedies include data disgorgement and USD 16.5 million in consumer redress payments, but the FTC's expected notice of proposed rulemaking on commercial surveillance and data security will likely argue that similar harmful practices are widespread in the market and should therefore be banned outright under its forthcoming trade regulation rule. Relevant to the Avast order are alleged failings related to notice, choice, data retention, and properly aggregating and de-identifying tracking data before selling it to third parties; a minimal sketch of what that last practice can look like follows this list.
  • What changes to kids' privacy protections are coming? In addition to the hubbub around federal bills like the Kids Online Safety Act, the FTC is revising its rule implementing the Children's Online Privacy Protection Act. IAPP Westin Research Fellow Andrew Folks, CIPP/US, unpacked the big takeaways in the proposed updates to COPPA. Relatedly, I sat down with two kids' privacy experts on a recent LinkedIn Live on the same subject.
  • A new report on AI governance best practices. The Centre for Information Policy Leadership released a report based on qualitative interviews with 20 companies, mapping their AI governance programs to CIPL's own Accountability Framework. The report highlights emerging trends in AI governance, which could serve as helpful benchmarking for other organizations seeking to launch robust programs.
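On the technical side, the Avast order's reference to aggregation and de-identification maps to practices many data teams implement directly. The following is a minimal sketch, in Python, of one common approach: strip direct identifiers, coarsen quasi-identifiers such as URLs and timestamps, and release only counts for groups large enough to resist re-identification. Every detail here (the field names, the hour-level truncation, the k=20 threshold) is an illustrative assumption, not a standard drawn from the Avast order or any statute.

    # Illustrative sketch only: one way a firm might aggregate and de-identify
    # clickstream records before sharing them. Field names and the k=20
    # threshold are hypothetical choices, not regulatory requirements.
    from collections import Counter
    from datetime import datetime

    K_ANON_THRESHOLD = 20  # hypothetical minimum group size before release

    def generalize(record: dict) -> tuple:
        """Drop direct identifiers and coarsen quasi-identifiers."""
        ts = datetime.fromisoformat(record["timestamp"])
        return (
            record["domain"],            # keep site-level domain, drop full URL path
            ts.strftime("%Y-%m-%d %H"),  # truncate timestamp to the hour
            record["country"],           # keep coarse geography, drop IP address
        )

    def aggregate_for_release(records: list[dict]) -> list[dict]:
        """Release only aggregate counts for groups large enough to resist re-identification."""
        counts = Counter(generalize(r) for r in records)
        return [
            {"domain": d, "hour": h, "country": c, "visits": n}
            for (d, h, c), n in counts.items()
            if n >= K_ANON_THRESHOLD  # suppress small, re-identifiable groups
        ]

A real de-identification program would layer governance, contractual limits and re-identification testing on top of anything like this; the point of the sketch is simply that "aggregate and de-identify" is a concrete engineering obligation, not boilerplate.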

Upcoming happenings:

  • 25 Feb.: The deadline to submit speaking proposals for the IAPP Privacy. Security. Risk. 2024 conference in Los Angeles, scheduled for 23-24 Sept.
  • 27 Feb., 17:30 ET: The Future of Privacy Forum hosts its 14th annual Privacy Papers for Policymakers event (U.S. Capitol Visitor Center).
  • 29 Feb., 9:00 ET: The Data Transfer Initiative hosts a summit on Empowerment through Portability (National Union Building).
  • 6 March, 11:00 ET: George Washington University's Digital Trade and Data Governance Hub hosts a book talk titled Our Next Reality: How the AI-powered Metaverse Will Reshape the World (virtual).

Please send feedback, updates and material omissions to cobun@iapp.org.



