
A view from DC: Updating the map of location privacy safeguards


""

For years, many scholars, advocates and privacy professionals have argued geolocation data is a sensitive category of personal information, worthy of the same enhanced protections as other sensitive data types. Most notably, the idea was given legal significance in the context of government access by the U.S. Supreme Court in its 2018 Carpenter decision.

It has taken some time for data privacy laws to catch up. The EU General Data Protection Regulation famously omits precise geolocation data from the list of special categories of personal data. But every U.S. state with a comprehensive consumer privacy law, with the notable exception of Colorado, now treats some version of precise geolocation data as sensitive. In general, this means the collection and use of this data type is subject to opt-in consent requirements and other safeguards.

The most recent public draft of the American Data Privacy and Protection Act would have included even more rigorous limits on the collection and use of location data.

As the law catches up, we are seeing the maturation of a different theory of location sensitivity: the idea that location data can reveal information about intimate aspects of our lives. The logic here is unavoidable.

Even if we may disagree on the inherent sensitivity of general location data signals, it is apparent that monitoring a person's physical location has the potential to reveal other sensitive information about them. This idea is captured in Colorado's implementing regulations, which incorporate into the definition of sensitive data any inference revealing covered sensitive categories — religion, health status, sex life, etc. This explicitly includes inferences derived from visits to certain types of locations.

In a new consent order against X-Mode, and its successor Outlogic, the U.S. Federal Trade Commission is finally making clear that a person's digital breadcrumbs, when they reveal visits to sensitive locations, are also subject to the highest reasonable safeguards under U.S. consumer protection law.

In an accompanying statement, Chair Lina Khan, joined by her fellow commissioners, begins by quoting Carpenter before laying out the facts of the FTC's inquiry:

"X-Mode is a data broker that tracks people's location data through its own apps, through software development kits (SDKs) installed on third-party apps, and through buying data from aggregators. As noted in the FTC's complaint, X-Mode sells this raw location data with persistent identifiers that can be used to connect specific individuals to specific locations. It also sells access to 'audience segments,' or groups of people that likely share characteristics based on their demographics, their interests, or the locations they visit."

As a result, the X-Mode case presents many of the same lessons that might have been learned from the FTC's earlier action against Kochava, if that case had not descended into protracted litigation.

The operational lessons should not be ignored, especially because of the serious ramifications for stepping out of bounds. The FTC's consent order in this case includes deletion of improperly collected sensitive location data, model disgorgement of any product derived from this data, and a future prohibition on the use, sale or disclosure of sensitive location data. Moving forward, X-Mode has agreed to secure consumer consent in order to collect, use or disclose any location data.

The lessons of this case are numerous.

Companies that make use of location datasets should ensure that they are properly scrubbed of sensitive locations, or else subject them to the highest standards for notice and control. At the federal level, sensitive locations likely include those that the FTC lists in its X-Mode complaint: "medical facilities, places of religious worship, places that may be used to infer an LGBTQ+ identification, domestic abuse shelters, and welfare and homeless shelters." But nuances in state law definitions could mean additional types of locations should be treated in this manner.
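
To make the scrubbing step concrete, here is a minimal, hypothetical Python sketch of one way such a filter could work: drop any location ping that falls within a buffer around a list of known sensitive points of interest. The coordinates, buffer radius and record layout are invented for illustration; a real pipeline would rely on a vetted point-of-interest dataset and the location types relevant under applicable law.

```python
import math

# Hypothetical sensitive points of interest (lat, lon) -- e.g., clinics,
# places of worship, shelters. In practice, use a vetted POI dataset.
SENSITIVE_POIS = [
    (38.9072, -77.0369),
    (38.8951, -77.0364),
]

BUFFER_METERS = 200  # assumed exclusion radius around each sensitive location


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def scrub(records):
    """Drop any ping that falls within the buffer of a sensitive POI."""
    return [
        rec for rec in records
        if all(
            haversine_m(rec["lat"], rec["lon"], lat, lon) > BUFFER_METERS
            for lat, lon in SENSITIVE_POIS
        )
    ]


pings = [
    {"device_id": "device-001", "lat": 38.9073, "lon": -77.0370},  # near a sensitive POI
    {"device_id": "device-001", "lat": 38.9500, "lon": -77.0800},  # elsewhere
]
print(scrub(pings))  # only the second ping survives
```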

The X-Mode case also presents another strong reminder that revealing data about unique device visits to medical facilities is not appropriate for marketing purposes.

The FTC alleges that X-Mode entered into an agreement with another company to deliver a list of devices that had visited cardiology, endocrinology or gastroenterology offices within a specific geographic area, followed by a visit to a pharmacy. This practice alone led to its own unfairness charge: "unfair categorization of consumers based on sensitive characteristics for marketing purposes."

There is important precedent in this case, too, about the identifiability of data. If re-identifying an individual from supposedly deidentified data is practical, the data is not deidentified. Unique device identifiers, such as mobile advertising IDs (MAIDs), are too easily connected with offline profiles. And even raw location data, when tracked over time, is too easily cross-referenced with public records, such as property ownership.
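
To illustrate why this is not an abstract risk, consider a small hypothetical Python sketch (all identifiers and records invented): from a handful of overnight pings tied to one persistent advertising ID, the device's likely home coordinate falls out almost for free, and that coordinate can then be joined against a public property-ownership record.

```python
from collections import Counter

# Raw "deidentified" pings: a persistent ad ID plus rounded lat/lon and hour of day.
pings = [
    {"maid": "ad-id-001", "lat": 38.905, "lon": -77.036, "hour": 1},
    {"maid": "ad-id-001", "lat": 38.905, "lon": -77.036, "hour": 2},
    {"maid": "ad-id-001", "lat": 38.905, "lon": -77.036, "hour": 23},
    {"maid": "ad-id-001", "lat": 38.898, "lon": -77.028, "hour": 14},  # daytime ping
]

# Hypothetical public property-ownership records keyed by rounded coordinates.
property_records = {
    (38.905, -77.036): "J. Doe, 123 Example St.",
}


def likely_home(device_pings):
    """The most common overnight (10 p.m.-6 a.m.) location is a strong home signal."""
    overnight = [
        (p["lat"], p["lon"]) for p in device_pings if p["hour"] >= 22 or p["hour"] < 6
    ]
    return Counter(overnight).most_common(1)[0][0] if overnight else None


home = likely_home(pings)
print(property_records.get(home))  # -> "J. Doe, 123 Example St."
```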

The takeaways don't stop there. X-Mode includes some reminders about privacy basics, plus a few new details about expectations around acquiring and reselling personal data:

  • Respect opt-out signals. App and SDK developers that receive unique device IDs with opt-out signals attached should not share that data for targeted advertising purposes (a minimal sketch of this kind of filter follows this list).
  • Failing to disclose a purpose of processing data is a deceptive act, and it may even bring an unfairness charge from the FTC. In particular, do not forget to disclose the fact that data may be sold to government contractors for national security purposes, if applicable.
  • Due diligence is important when collecting data from third parties. Contractual safeguards alone are not enough! Monitoring third-party consent mechanisms is a great first step, but this should be coupled with corrective action when issues are identified. Audit the process by which your suppliers obtain consent and cease using location data that was not obtained with appropriate consent. If you don't, you could get hit with a "means and instrumentalities" charge for facilitating unlawful data collection.
  • When selling or sharing data, build robust procedures to mitigate misuse, and cut off access to datasets when problems are identified. Always require third parties to "employ reasonable and appropriate data security measures commensurate with the sensitivity" of the data.
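
On the first point, here is a minimal, hypothetical Python sketch of the kind of gate an SDK developer might apply before passing records downstream for ad targeting; the field names and flag semantics are assumptions for illustration, not anything prescribed by the order.

```python
# Hypothetical incoming SDK records: each carries the device's ad-tracking opt-out flag
# (e.g., a "limit ad tracking" or "opt out of ads personalization" signal).
records = [
    {"maid": "device-001", "lat": 40.71, "lon": -74.00, "opt_out": False},
    {"maid": "device-002", "lat": 40.72, "lon": -74.01, "opt_out": True},
]


def shareable_for_ads(batch):
    """Only records without an opt-out signal may flow to ad-targeting partners."""
    return [r for r in batch if not r["opt_out"]]


print(shareable_for_ads(records))  # device-002 is withheld
```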

Here's what else I'm thinking about:

  • New Jersey joined the ranks of U.S. states with comprehensive consumer privacy laws, kicking off another busy year for privacy bills in the U.S. The timing comes just as IAPP publishes its annual report on the U.S. state privacy landscape. You won't find New Jersey in there, but you will find the seven other states that more than doubled the map in 2023. You can keep up with newly proposed comprehensive bills on the U.S. State Privacy Legislation Tracker, updated weekly.
  • Another important warning from the FTC to those who build AI models as a service: "uphold your privacy and confidentiality commitments." Warnings like this signal future enforcement direction, and we are likely to continue to see the common law of artificial intelligence governance built out through FTC inquiries. Don't ignore the footnotes! The FTC reminds us that it views its authority as extending to prevent harm to "consumers, small businesses, and enterprise users" alike.
  • An automated fraud detection system used in 42 states for public-benefits decisions is unreliable, according to an EPIC complaint filed with the FTC. The complaint singles out Thomson Reuters' Fraud Detect system and alleges a general lack of proper AI governance processes, including improperly commingled datasets, inaccurate results, a lack of transparency for government clients and a lack of control mechanisms for consumers.
  • The new Center for Law and Technology at GW Law combines efforts from the school's IP, privacy and technology programs, as explained in a new IAPP profile. There are plans to launch a journal and an academic fellowship program.
  • Speaking of GW, Daniel Solove and Woodrow Hartzog published a draft essay eloquently challenging the idea of individual consumer empowerment as the solution to privacy harms, illustrated through a Kafka lens.

Upcoming happenings:

  • 18 Jan., 11:00 ET: The FTC hosts an open meeting, which will include a presentation about the proposed changes to the Children's Online Privacy Protection Act Rule (virtual).
  • 25 Jan., 10:30 ET: The Federal Communications Commission hosts its monthly open meeting (hybrid).
  • 25 Jan., 16:00 ET: R Street hosts a panel on The Future of Privacy and Security in 2024 (Rayburn 2123).

Please send feedback, updates and breadcrumbs to cobun@iapp.org.

