Editor's note: The IAPP is policy neutral. We publish contributed opinion and analysis pieces to enable our members to hear a broad spectrum of views in our domains.

A heated debate has reportedly been raging this summer between the U.S. and U.K. governments over access to encrypted data. As my colleague, IAPP Research and Insights Director Joe Jones, recently posted, the debate has reflected one of “tech policy’s most intractable conundrums.” Though it is only one of many ongoing conundrums, the U.K. government appears to have relented this week.

The story began in February when the U.K. Home Office issued a Technical Capability Notice under its Investigatory Powers Act, demanding what most commentators characterized as a "back door" to access end-to-end encrypted user data on Apple's cloud servers. Apple responded by going to the mattresses, it is fair to say: it dropped U.K. users from its advanced encryption service for cloud storage, stoking a massive public outcry from encryption advocates across civil society, industry and government.

In a social media post this week, U.S. Director of National Intelligence Tulsi Gabbard showed just how far this conversation has gone, claiming that the highest levels of U.S. government had been in discussion with U.K. counterparts over recent months. Gabbard also announced that “the U.K. has agreed to drop its mandate for Apple to provide a 'back door' that would have enabled access to the protected encrypted data of American citizens and encroached on our civil liberties.”

The details of any agreement remain murky, and there has yet to be an official statement from the U.K. government beyond vague affirmations about continuing to protect its citizens. Would a deal here mean a carve-out of Americans’ data from the order, or a full reversal?

Many other questions remain unresolved — but this is only one of many tech policy stalemates shaping up between the U.S. and its allies. In short, it is not just the weather in the U.K. that has reached all-time highs this summer. Trans-Atlantic relations over digital trade, online safety and data protection are under a heat advisory and show no real sign of a cooldown any time soon.

Protecting U.S. consumers from… the Europeans

Meanwhile, an unlikely fighter stepped into the geopolitical ring this week: U.S. Federal Trade Commission Chair Andrew Ferguson.

In a press release, the FTC announced Ferguson had sent letters to “more than a dozen prominent technology companies reminding them of their obligations to protect the privacy and data security of American consumers despite pressure from foreign governments to weaken such protections.” Recipient companies offer a variety of digital services, ranging from cloud computing and data security to social media platforms and messaging.

It was a remarkable move from an agency that is generally focused on domestic issues, though the FTC has certainly played a role in trans-Atlantic privacy discussions in the past, including as a key stakeholder in EU-U.S. and U.K.-U.S. discussions on cross-border data transfers — another conversation that is likely to heat up in the near term.

Unlike the press release, which focuses first on privacy and security before mentioning the potential censorship of Americans almost as an aside, Ferguson’s sample letter spends the vast majority of its three pages describing how “foreign governments present emerging and ongoing threats to the free exchange of ideas,” including the possibility of direct censorship of Americans.

Ferguson’s letter weaves this threat of foreign censorship into a tapestry of surveillance, data security and privacy concerns, reminding companies of the FTC’s prior work on the latter two issues while also offering novel legal theories about the use of Section 5 as a tool to combat online censorship. Though perhaps the individual letters provide concrete examples of specific sectoral concerns, the sample letter instead moves seamlessly among a wide variety of general concerns, from algorithmic ranking and de-platforming to surveillance and weakened data security standards for private communications.

Though recognizing that foreign laws in these areas would largely impact those countries’ domestic markets, Ferguson warns that companies often apply “uniform policies across jurisdictions” to simplify compliance.

The potentially meddlesome foreign actions in question include, in order of appearance, the EU Digital Services Act, the U.K. Online Safety Act, and — you guessed it — the U.K.’s backdoor encryption efforts. To each company, Ferguson writes, “As you grapple with how your company will comply with these misguided international regulatory requirements, I write to remind you that your company has independent obligations to American consumers.”

Unfair censorship

These obligations include the clear precedent from the FTC on deceptive promises related to data security and privacy practices. Beyond this, Ferguson warns that “certain circumstances” could require “reasonable security measures” to include end-to-end encryption — the absence of which could trigger a Section 5 unfairness charge. This, too, is relatively uncontroversial given the FTC’s recently expanded use of unfairness in the privacy domain and its longstanding support for a risk-based set of reasonable security standards.

Further, Ferguson argues that tech companies have a duty to disclose to U.S. consumers any “weaker security measures” adopted due to the actions of a foreign government. Failure to do so, he writes, could be both deceptive and unfair, especially in the context of what he describes as Americans’ right to “anonymous and private speech.”

Returning to the theme of censorship that animates the letter, Ferguson provides a rough sketch of how the FTC’s authority could be used to vindicate free speech interests. Starting with deception, he highlights another duty for prominent disclosures, this time to make sure American consumers are not surprised if “censorious policies were adopted due to the actions of a foreign government.”

In recent years, most social media companies have made efforts toward clear disclosures of content-ranking rules and other platform policies that implicate speech and access to information. If indeed relevant across borders, these disclosures might soon also describe how such policies are informed by international compliance obligations.

On the concept of unfair censorship, Ferguson offers the following: “Further, it might be an unfair practice to subject American consumers to censorship by a foreign power by applying foreign legal requirements, demands, or expected demands to consumers outside of that foreign jurisdiction.”

Unfairness requires a demonstration of unavoidable and legally cognizable harm, a perennial challenge in data privacy cases. This hurdle is likely even higher in the context of restrictions on consumers’ speech or access to information on private platforms. The First Amendment protects Americans only against censorship by the U.S. government, so neither foreign governments nor social media companies can implicate those protections. And unlike in the privacy context, there is not a standalone “censorship” cause of action that could help to demonstrate common-law harms.

Nevertheless, companies are on notice: the FTC is watching how they respond to new foreign legal obligations. And Chair Ferguson is pointing to two options for avoiding FTC scrutiny. Either treat the American market separately from any new restrictions imposed by other countries or prominently disclose how those restrictions implicate privacy, security and content policies.

Please send feedback, updates and foreign censorship disclosures to cobun@iapp.org.

Cobun Zweifel-Keegan, CIPP/US, CIPM, is the managing director, Washington, D.C., for the IAPP.

This article originally appeared in The Daily Dashboard and U.S. Privacy Digest, free weekly IAPP newsletters. Subscriptions to this and other IAPP newsletters can be found here.