In a statement released today, the Free Speech Coalition reminded stakeholders across the adult industry that key provisions of the TAKE IT DOWN Act — legislation that “created a federal criminal prohibition on the nonconsensual publishing of intimate images (including AI-generated ‘deepfakes’) and requires covered platforms to establish a notice-and-removal process for such content within 48 hours of a valid request” — will take effect May 19, 2026.
While the ban on nonconsensual imagery went into effect immediately after the law was signed, the notice-and-removal requirements begin on that date.
As outlined in the FSC statement, the law applies to two categories of content: “authentic intimate visual depictions published without consent” and “digital forgeries.” The latter includes “AI-generated or otherwise computer-manipulated intimate images of an identifiable individual that a reasonable person would find indistinguishable from authentic depictions.”
FSC stated that “any person who knowingly publishes such content using an interactive computer service” may be held liable under the law, adding that enforcement is directed at “the individual uploader/publisher, not the platform.”
Under the notice-and-removal provisions taking effect May 19, “websites, online services, online applications, or mobile applications that serve the public and primarily provide a forum for user-generated content (including messages, videos, images, and audio)” must comply.
“Covered platforms must establish a process by which an individual (or their authorized representative) can submit a removal request,” FSC said. “The request must include a signature, identification of the content, a good faith statement that it was published without consent, and contact information.”
As for timing, once a valid request is received, platforms “must remove the content as soon as possible, but no later than 48 hours after receipt,” FSC explained. “Platforms must also make reasonable efforts to identify and remove known identical copies.”
The law also requires platforms to “post a clear, conspicuous, plain-language notice of their removal process and how to submit a request.”
“Failure to comply with the notice-and-removal obligations is treated as an unfair or deceptive act or practice under the FTC Act, enforced by the Federal Trade Commission,” FSC added.
FSC noted that the definition of “covered platform” is broad and may apply to “most sites that host user-generated content.”
“Platforms that host any user-uploaded content should assume they are covered and consult with counsel,” FSC said.
FSC also emphasized that under the law “consent to create an intimate visual depiction does not equal consent to publish it.”
Addressing what qualifies as a valid request, FSC explained that submissions must be made in writing and include the following:
- a physical or electronic signature of the requestor (or their representative)
- identification of, and information sufficient for the platform to locate, the offending content
- a statement of the requestor’s good-faith belief that the depiction was not consensual
- the requestor’s contact information
FSC also noted that the law “includes no provisions that address how platforms can or should deal with erroneous or fraudulent removal requests.”
The full statement is available on the FSC website.