On 28 April, the full U.S. House of Representatives passed S. 146, the TAKE IT DOWN Act, a bill that includes both data privacy and AI governance elements. It received resounding bipartisan support and now heads to President Donald Trump's desk after it unanimously passed the Senate in February.
TAKE IT DOWN is a backronym of Tools to Address Known Exploitation by Immobilizing Technological Deepfakes On Websites and Networks. Though both titles describe goals of the bill — a limited right to removal and a focus on deepfakes respectively — neither captures its full scope.
The bill includes both criminal and civil elements. On the criminal side, it makes it illegal for a person to "knowingly publish" authentic or synthetic nonconsensual intimate images, or NCII. On the civil side, it requires covered platforms to remove flagged NCII within 48 hours of receiving a valid request.
With strong congressional support and an endorsement from first lady Melania Trump, Trump could sign off on the bill quickly.
Second time's the charm
The House is not the first chamber of Congress to address the harms that NCII poses to adults and minors this congressional term. The Senate unanimously passed the TAKE IT DOWN Act 13 Feb. after it was reintroduced by U.S. Sen. Ted Cruz, R-Texas, chair of the U.S. Senate Committee on Commerce, Science, and Transportation, along with Sen. Amy Klobuchar, D-Minn., who serves on the committee.
As the Senate version of the bill is identical to that passed by the House, it now moves to the president's desk for signature. If signed, the takedown requirements for covered platforms will go into effect one year from enactment.
Take it down — or else
The bill has two distinct aims. First, it criminalizes the publication of an authentic or computer-generated NCII and outlines separate penalties for when the image depicts an adult or a minor. The second aim is to impose new obligations on social media and online platforms to respond to individuals' requests to promptly remove unlawful NCII.
At the outset, the law defines NCII as an "intimate visual depiction," relying on the existing complex definition from 15 U.S.C. § 6851(a)(5).
In addition to authentic NCII, the bill addresses synthetic or computer-generated content, applying the term "digital forgery" to any intimate visual depiction of an identifiable individual "created through the use of software, machine learning, artificial intelligence, or any other computer generated or technological means, including by adapting, modifying, manipulating, or altering an authentic visual depiction, that, when viewed as a whole by a reasonable person, is indistinguishable from an authentic visual depiction of the individual."
Criminal liability
The bill imposes criminal penalties for the exposure of NCII.
Individuals who publish an authentic or computer-generated NCII of an adult without consent can face fines, forfeiture, restitution and up to two years' imprisonment. The publication of an authentic or computer-generated NCII of a minor under age 18 extends the maximum prison sentence to three years.
The bill excludes those images covered by existing criminal provisions applicable to depictions of children, meaning these special protections are meant to extend primarily to teens.
For NCII depicting an adult, the bill prescribes a four-part legal test requiring a reasonable expectation of privacy, the lack of voluntary public disclosure by the individual, that the depiction is not a matter of public concern, and that publication either was intended to cause harm or actually causes harm, including reputational harm. When the depiction is of a minor, none of this is required, as covered material intended to "arouse or gratify the sexual desire of any person" will qualify.
The bill also prohibits intentionally threatening to publish authentic or computer-generated NCII "for the purpose of intimidation, coercion, extortion, or to create mental distress." Threats carry the same penalties as actual publication, except that threats involving digital forgeries receive slightly shorter maximum sentences.
Sharing to one is not sharing to all
A frequent critique of privacy torts, such as the common law tort of public disclosure, is that courts limit causes of action when an individual has previously shared the information in any context. In many instances, any prior exposure of the information in "public," no matter how small the audience, can defeat someone's claim.
In the limited context of criminal liability for exposing NCII, the TAKE IT DOWN Act closes this interpretive loophole in a manner that suggests a more modern understanding of contextual integrity.
In addition to recognizing that consent to create an image is not the same as consent to publish it, the bill explains "the fact that the identifiable individual disclosed the intimate visual depiction to another individual shall not establish that the identifiable individual provided consent" for its publication.
Removal obligations for covered platforms
The second portion of the bill requires social media platforms to take down any NCII identified in a valid request from an identifiable individual or someone authorized to act on the individual's behalf. Entities subject to the removal obligations are defined by the bill as online services, platforms and applications that primarily host user-generated content, explicitly excluding email or broadband service providers.
An identifiable individual is defined as someone "(i) who appears in whole or in part in an intimate visual depiction; and (ii) whose face, likeness, or other distinguishing characteristic (including a unique birthmark or other recognizable feature) is displayed in connection with such intimate visual depiction."
Unlike the criminal provisions, the language of the removal requirements does not explicitly encompass synthetic NCII, though it is likely the drafters intend manipulated and synthetic images to be covered, so long as they meet the identifiability requirement. Though the criminal provisions carefully parse between criminal liability and penalties for authentic and digitally forged intimate visual depictions, the removal requirements only refer to an "intimate visual depiction published on the covered platform that includes a depiction of the identifiable individual."
The potential ambiguity was addressed in an amendment proposed by Rep. Debbie Dingell, D-Mich. After some discussion in the Energy and Commerce Committee markup, the amendment was not included, as it would have also changed other parts of the bill.
Under the bill's takedown provisions, covered platforms must establish a clear and accessible process for affected individuals to submit notice and valid removal requests for unlawful NCII on the platform. Such requests should be in writing and contain the affected individual's signature, a reasonable identification of the NCII, information about where the image can be found on the platform, a good faith statement that the image is nonconsensual, and the affected individual's contact information.
Once a valid request is received, platforms must remove the NCII and any known copies within a 48-hour period. Platforms that fail to reasonably comply with the removal requirements can face an enforcement action by the Federal Trade Commission under Section 5 of the FTC Act. In a rare move, the scope of the FTC's jurisdiction is expanded under the bill to include nonprofit organizations.
Covered platforms acting in good faith and on reasonable evidence are shielded from liability for removing or disabling access to content under the bill, even if it is ultimately determined that the removed content was lawful. Further, the bill contains a severability clause, which would preserve the remaining constitutional provisions should any portion of the law be challenged and found unconstitutional.
Controversial elements
During the House Energy and Commerce Committee markup, though broad bipartisan support was apparent, some members raised concerns regarding the scope of the law's applicability to online platforms and potential free speech risks.
One concern involved encrypted services, as some civil liberty and technology organizations have pointed out that the bill's definition of covered services could extend beyond traditional social media platforms and implicate private messaging apps and cloud providers.
Rep. Lori Trahan, D-Mass., addressed this concern during the markup session, stating that the language of the bill covers platforms that provide forums for user-generated content and reiterating that service providers hosting encrypted private messaging channels would likely not fall within the bill's definition of a covered platform.
Prior to the bill's markup, some civil liberty and technology organizations also raised First Amendment concerns over the legislation's proposed takedown requirements. Critics argue the bill lacks anti-abuse provisions for false takedown requests, which could lead to the suppression of lawful speech on social media platforms.
Other committee members raised concerns about the FTC's ability to enforce the law because two of its commissioners were terminated earlier this year. One member introduced an amendment that would have reinstated the two commissioners prior to the bill taking effect. None of the proposed amendments received sufficient support to change the final legislation, setting it up for swift passage.
Cobun Zweifel-Keegan, CIPP/US, CIPM, is managing director, Washington, D.C., and Kayla Bushey, CIPP/US, is a Westin Research fellow for the IAPP.