Editor's note: The IAPP is policy neutral. We publish contributed opinion and analysis pieces to enable our members to hear a broad spectrum of views in our domains.

On 8 Jan., the California Privacy Protection Agency announced a settlement with a mailing list targeting company called Rickenbacher Data, which operates under the brand name DataMasters. It was one of two new orders against data brokers that failed to register under California's Delete Act, as CalPrivacy continues its investigative focus on the data industry. Historically, privacy enforcers have come down hard on the sources of sensitive personal data, but enforcing new laws like the Delete Act has helped to spread accountability to those who buy and resell it.

As a failure-to-register case, the legal violations are unremarkable, though the substantive outcome will no doubt warm the hearts of privacy advocates. In addition to a USD45,000 fine, the company decided to entirely carve out Californians' data from the lists of consumers it buys and sells rather than comply with requirements to register as a data broker and process deletion requests. 

Though pulling out of California could signal a remarkable industry trend, the tea leaves in this case are even more interesting than the tea.


In the press release and the order, CalPrivacy signals serious discomfort with DataMasters' business practices. Specifically, it calls out the fact that the company "bought and resold the names, addresses, phone numbers, and email addresses of millions of people with Alzheimer's disease, drug addiction, bladder incontinence, and other health conditions for targeted advertising." The order, which provides a detailed play-by-play of the enforcer's interactions with the company, also includes screenshots of some of these targeting lists, showing, for example, that the company claimed to have a list of nearly half a million consumers' mailing addresses under the category "Alzheimer's."

This type of scrutiny highlights an unsettling duality in the modern practice of data privacy.

First, as a matter of the lived experience of privacy, it is obvious that the existence of a health condition is a piece of private information whose sensitivity depends on the nature of the condition. For much the same reason that targeting lists built around certain conditions sit poorly with regulators, everyday consumers likely find the same behavior distasteful at best — and potentially highly invasive. The same commercial practice that feels acceptable when it is used to target products to those suffering from acne, for example, suddenly feels different when it is used to target people with a rare cancer.

On the other hand, as a matter of black-letter consumer privacy law, there is little to no distinction between various types of health information. In California, all "personal information collected and analyzed concerning a consumer's health" is sensitive, meaning businesses must provide consumers with the ability to limit the use and disclosure of this information. In other states with consumer privacy laws, opt-in consent is required to collect and share health information. Such rules apply in the same manner whether the health information in question relates to acne, amyloidosis or, indeed, Alzheimer's disease.

This reality is one of the great ironies of the emergence of comprehensive consumer privacy laws. Though they provide uniform minimum protections, they sometimes collapse the messy contextual nature of privacy harms into protections that may leave consumers and regulators alike feeling… icky. 

This is why focusing only on basic compliance with these laws is an easy way to miss the forest for the trees.

In fact, longstanding consumer protection principles and industry best practices provide a far more nuanced perspective on the guardrails around sensitive health data.

We see this, for starters, in the U.S. Federal Trade Commission's enforcement of the FTC Act. The agency's focus during the first half of this decade on the sources of health data in the data ecosystem made clear that context matters when it comes to whether a practice unfairly harms consumers' health privacy. What may be a standard advertising business model for a cosmetics website becomes an enforceable violation for a mental health website.

Admittedly, it is possible that such behavior violates consumer privacy laws too. In its Healthline settlement last year, the California attorney general alleged violations of the purpose limitation principle under the California Consumer Privacy Act. The behavior at issue — sharing article titles suggesting a consumer may have already been diagnosed with a specific medical condition in order to target advertising at that consumer — was almost exactly what the FTC found unfair in the BetterHelp matter.

For even more nuanced guidance on where to draw the lines around ethical marketing in the health context, privacy professionals should review longstanding industry guidelines. For almost as long as there has been targeted advertising, the industry has recognized the importance of treating health data with care — and subjecting data revealing certain more sensitive health conditions to higher standards than the rest.

The model that emerged, which has never been captured in a privacy law, used a multifactor test to measure the sensitivity of health-related data for advertising purposes. As far back as its 2015 Code of Conduct, for example, the Network Advertising Initiative required its member companies to obtain opt-in consent to use sensitive health data. The self-regulatory body laid out the following factors to help its members navigate the contextual nuance of health privacy:

  • The seriousness of the condition. 
  • How narrowly the condition is defined. 
  • Its prevalence. 
  • Whether it is something that an average person would consider to be particularly private in nature. 
  • Whether it is treated by over-the-counter or prescription medications.
  • Whether it can be treated by modifications in lifestyle as opposed to medical intervention.

The guidance continues: "Under this analysis, sensitive health segments, which require opt-in consent under the Code, include, but are not limited to, categories such as: drug addiction, all sexually transmitted diseases (such as AIDS, HIV, HPV), all types of mental health conditions (such as generalized anxiety disorder, schizophrenia, Alzheimer’s, depression, anorexia/bulimia), pregnancy termination, as well as cancer." The industry group released even more guidance in 2020 about how to handle health audience segments.

Such enforceable industry guidance serves as a major signal to regulators about what counts as fair and reasonable privacy practices under consumer protection laws. As privacy professionals embrace the increasingly compliance-driven era of our field, we would do well to remember the more nuanced lessons of earlier times.

Please send feedback, updates and acne ads to cobun@iapp.org. 

Cobun Zweifel-Keegan, CIPP/US, CIPM, is the managing director, Washington, D.C., for the IAPP.

This article originally appeared in The Daily Dashboard and U.S. Privacy Digest, free weekly IAPP newsletters. Subscriptions to this and other IAPP newsletters can be found here.