It can seem daunting these days to keep up with new expectations about the types of personal data that should be subject to heightened safeguards and enhanced consumer choices.

Whether driven by new compliance obligations or evolving enforcement standards, the scope of data types considered "sensitive" in U.S. privacy practice is only expanding. But new technologies also drive this trend.

Artificial intelligence systems challenge some of our basic notions of what counts as "personal data," shifting the dividing line between data that can identify an individual and all other nonpersonal data. The same is true for "sensitive data," as the power of algorithm-driven inferences expands the capacity to transform murky data lakes into intimate sensitive data.

As this act of turning water into wine becomes more commonplace, policymakers and enforcers are keeping pace with their own new forms of alchemy.

Just this week, the U.S. Federal Trade Commission reiterated one of its recent enforcement trends in the area of sensitive health-related data. In a settlement with the telehealth addiction service Monument, the agency again drew attention to the recently expanded understanding of the scope of health data — and the expanded understanding of "revealing" such data to third parties.

As in the BetterHelp matter, the FTC's complaint against Monument highlights the sharing of user information in a context that allowed third parties to draw potentially sensitive and medically relevant conclusions about users of the service.

One determining factor in these cases is the sensitive nature of each company's narrowly tailored services: BetterHelp provides only mental health counseling, and Monument provides only addiction treatment. That narrow focus heightens the privacy risks posed by third-party web tracking technologies, because any indicator that confirms an identifiable individual's use of such a service may, by itself, be considered a revelation of sensitive data.

Of course, it's not just telehealth services that are on notice. Last year, guidance from the Office for Civil Rights in the U.S. Department of Health and Human Services clarified that tracking technologies must meet the requirements of the Health Insurance Portability and Accountability Act when deployed on the websites of covered entities, even when users are not signed into a service. Regulated entities have had more than a year to accommodate this new guidance, and those outside of HIPAA's coverage need only look at recent FTC cases to see the parallels under general consumer protection law.

It is past time for any website that collects or infers health-related data, broadly defined, to double-check the types of information third-party tracking technologies may reveal to other entities. But rather than dwell on the details of these enforcement and guidance actions, I would propose that we take a step back. Think, instead, about gossip.

I propose a simple test for privacy pros navigating these trends: Think about the third parties with whom you share data as your social partners, perhaps even your neighbors in a small provincial town. If you were to tell your neighbor the fact you share through a web beacon, would they find it gossip-worthy? Is it notable?

For example, "Did you know that John signed up for alcohol addiction counseling?" When spoken aloud, this seems to obviously meet the gut check for a notable piece of gossip. When converted to dry data elements — as alleged in the Monument case, a hashed email address and a custom field labeled "Paid: Weekly Therapy" — it may not at first glance seem as notable.

But the FTC is again reminding us that the gossip-worthy conclusions reachable by the recipient of the data are what matter. Revealing a health-related fact about an identifiable individual goes beyond the reasonable expectations consumers have about how their sensitive information will be handled.

Gossip may have a negative connotation, but it is a powerful force for creating social cohesion and fostering cooperative group dynamics. As social animals, humans evolved a strong desire to know and understand the reputations of our peers. Scientists often point to the parallels between grooming in nonhuman primates and gossip among humans. For millennia, we have used gossip as a mechanism to gauge the trustworthiness of those within and without our social groups. Some, like Robin Dunbar, even posit that gossip was a driving force in shaping the evolution of language itself.

But while gossip may have clear evolutionary advantages for small, organized social groups, those benefits begin to crumble once the gossip network is the global internet. In our globalized and interconnected society, the potential harms of gossiping about customers to our third-party partners outweigh any social benefits we might otherwise measure.

This rubric applies equally in other contexts. Sensitive data, on the whole, is gossip-worthy. I know my aunt would comment on our neighbors' visits to sensitive locations, their health conditions, their financial status, or even their video viewing habits.

Beyond this rule of thumb, the act of gossiping itself may even be covered by some privacy laws. In a recent case, Endemol Shine Finland Oy, the European Court of Justice appears to have determined that the GDPR can apply to orally shared information in certain circumstances. Be careful who can hear your whispers!

As an initial gut check in assessing whether sharing certain information fits reasonable norms, I hope we can begin to use the gossip test. Whether in the local pub or on a web platform, it is good practice to consider whether a piece of information is gossip-worthy before allowing it to be shared. If there is potentially enough information on the receiving end to reveal a gossip-worthy fact, we should think critically about whether our consumers are aware of and have proper control over such sharing.

Or, said more simply, if your aunt would find it notable, it's best to keep it to yourself.

Upcoming happenings:

  • 17 April, 10:00 ET: The Innovation, Data, and Commerce Subcommittee of the House Committee on Energy and Commerce will host a hearing on "Legislative Solutions to Protect Kids Online and Ensure Americans' Data Privacy Rights."

Please send feedback, updates and gossip to cobun@iapp.org.