The story of the U.S. Federal Trade Commission and data broker Kochava continues to unfold before our eyes.
When last we left our intrepid privacy enforcers, the FTC was attempting a daring last-ditch legal maneuver, working to change the mind of a U.S. District Court judge in the frigid foothills of Boise, Idaho. The judge, B. Lynn Winmill, had granted a motion to dismiss the Kochava case, seemingly dashing the agency's hopes of reforming geolocation practices in the data broker industry.
But Winmill left a small path to victory: an opportunity for the FTC to amend its complaint against Kochava and plead sufficient allegations to state a plausible claim of a violation of Section 5 of the FTC Act.
For the full history of the case, which began two summers ago, you can review my prior columns: "The Kochava Gambit," "Die Another Day" and "Never Say Never Again." The last of these described the FTC's amended complaint, which alleged that Kochava's sale of marketing profiles that include location data creates a "multiplying invasiveness" in consumers' lives that they cannot readily avoid.
Since that November update, the landscape of location privacy enforcement has shifted. The FTC successfully settled two other geolocation-focused matters against X-Mode Social (now known as Outlogic) and InMarket Media.
In denying Kochava's motion to dismiss the FTC's amended complaint, the court focuses on two separate theories of how Kochava's alleged practices are likely to inflict substantial privacy injury.
Both theories relate to the sharing of geolocation data about sensitive locations tied to unique identifiers, such as mobile advertising IDs (MAIDs), that can allow individuals to be identified. This practice creates a significant risk that consumers will suffer secondary harms, such as "stigma, discrimination, physical violence, and emotional distress."
Importantly, the court agrees that this practice also potentially represents a substantial injury to consumers in and of itself by invading their privacy. The theory that privacy intrusion is its own substantial injury, without the need for additional harms, would be a game changer in the consumer protection realm if it continues to hold water later in this case and in other courts.
At this stage, the court is only agreeing with the FTC insofar as the agency has made plausible claims that the alleged practices could meet the requirements for an unfair practice under the FTC Act. The case will now continue along the usual path of litigation, proceeding toward the discovery phase, in which the parties can obtain evidence from each other. This is an important win for the FTC: if the case proceeds through discovery, the agency will be able to supplement its factual allegations based on an examination of Kochava's internal records and sworn statements. Until this point, the FTC has had to rely on its initial investigation and public information about Kochava's business model.
Though much of the case remains to be settled, the takeaways from the court's analysis already closely track other recent FTC enforcement actions. This spate of enforcement scrutiny increases the heat on any company that shares location data, but it's important to note that not all location data business models are created equal.
The specific lesson is as follows: Sharing identifiable geolocation data that reveals visits to sensitive locations is an unfair practice. The modifiers in this sentence matter, so let's look at each in turn.
First, if data could reveal visits to sensitive locations, it should not be shared in an identifiable format. Under current policy trends, an identifiable format includes geolocation data combined with any unique identifier. Because device IDs can be matched to individual consumers using widely available commercial products, a device ID alone is a unique identifier, just like a name or phone number would be.
Second, if identifiable geolocation data is shared, it must not include potentially sensitive locations such as medical and reproductive health clinics, places of religious worship and domestic abuse shelters. If you are curious about what this means in practice, the FTC's settlement order with X-Mode presents a roadmap for establishing a "sensitive location data program": empowering a chief privacy officer to implement robust policies and procedures that identify and remove sensitive locations from datasets, and continuously auditing and documenting the accuracy of the list of sensitive locations, among other measures.
Further, companies that share location data bear some responsibility for what happens to that data later. In its court filings, Kochava argues that any secondary harms would be caused not directly by the company, but by other third parties. In its decision, the court reiterates that Section 5 requires only a significant risk of concrete harm. If a company creates that risk, it could be violating the FTC Act even if it does not actually inflict the ultimate harm. Consistent with this reasoning, the X-Mode order requires a series of "contractual prohibitions" on any recipients of location data, along with assessing recipients' compliance with these terms and terminating relationships with noncompliant recipients.
Though the Kochava saga has already had many installments, there is a long way to go before the FTC concludes the matter through the judicial process. In the meantime, companies are best advised to learn from the agency's other location data cases. Notably, the InMarket Media order goes even further than the X-Mode order. Based on allegations that the company unfairly collected location data without proper notice to users across hundreds of mobile apps, it bans InMarket outright from selling any location data.
This is a stark reminder that the remedies for alleged failures to safeguard location data may go well beyond the recommended best practices. It is better for privacy teams to be empowered to act proactively than to find themselves fenced in by a restrictive consent order.
Here's what else I'm thinking about:
- The Connecticut Attorney General submitted a report reflecting on its first six months of enforcing the Connecticut Data Privacy Act. The report is well worth the read for privacy pros who want to better understand the initial steps an attorney general must take to begin enforcing a new privacy law, including its work to engage with businesses and an analysis of consumer complaints received under the law. It also provides insights into the attorney general's initial enforcement priorities, including privacy policy deficiencies, sensitive data, teen data and data brokers. The report concludes with an interesting section recommending legislative changes to Connecticut's law.
- The U.S. House may be moving toward a vote on extending Section 702 surveillance authority as early as next week. Politico reports that Speaker Mike Johnson, R-La., is considering bringing a bill to the floor, which notably could include some changes to the program. Before Section 702 was set to expire last year, Congress extended the program by a few months.
- The National Institute of Standards and Technology established a new AI Safety Institute focused on research, standards setting, testing and international cooperation on AI safety. The agency announced Elizabeth Kelly, a top White House adviser who worked on the president's AI executive order, will lead the institute. Elham Tabassi, currently NIST's chief AI adviser, will serve as the institute's chief technology officer.
- Could Taylor Swift fans create pressure for stronger federal privacy rules? As Taylor Swift's private jet travel plans continue to be shared online, her lawyers are seeking to stop the spread of this information. The situation is a pointed reminder of the lack of an expectation of privacy in public data, whether under consumer protection law, privacy torts or a constitutional analysis.
Upcoming happenings:
- 15 Feb., 11:00 ET: The IAPP hosts a LinkedIn Live conversation, "Pay or OK: Practical Considerations for AdTech & Beyond."
- 22 Feb., 11:00 ET: The Centre for Information Policy Leadership hosts a webinar, "Best Practices in Accountable Governance of AI: New Research about the Experience of Leading Practitioners."
- 25 Feb.: The deadline to submit speaking proposals for the IAPP Privacy. Security. Risk. 2024 conference in Los Angeles, 23-24 Sept.
Please send feedback, updates and Idaho spuds to cobun@iapp.org.