FTC Issues Compliance Warning Ahead of Take It Down Act Enforcement Date

WASHINGTON — With the May 19 compliance deadline closing in, the Federal Trade Commission has begun putting major online platforms on notice over enforcement of the Take It Down Act.

The agency said warning letters were sent to a wide range of tech and social media companies, including Amazon, Alphabet, Apple, Automattic, Bumble, Discord, Match Group, Meta, Microsoft, Pinterest, Reddit, SmugMug, Snapchat, TikTok and X.

The law, approved by Congress and signed nearly a year ago, establishes a federal criminal ban on the publication of non-consensual intimate imagery, including AI-generated deepfakes shared without consent.

At the center of the measure is a strict notice-and-removal requirement: covered platforms must remove flagged content within 48 hours of receiving a valid request. That enforcement window officially opens May 19. For companies handling huge volumes of user uploads every minute, the clock is now ticking in earnest.

“We stand ready to monitor compliance, investigate violations, and enforce the Take It Down Act,” said FTC Chairman Andrew Ferguson. “Protecting the vulnerable—especially children—from this harmful abuse is a top priority for this agency and this administration.”

“Under the law, ‘covered platforms’ include various websites, apps and online services, such as social media, messaging, image or video sharing and gaming platforms,” the FTC said in its statement. The agency added that unlawful sharing of covered material could lead to federal criminal prosecution, while violations of the notice-and-removal requirements would also be treated as FTC rule violations carrying civil penalties of up to $53,088 per violation.

Attorney Corey Silverstein said adult industry companies should already be preparing for compliance and consulting legal counsel where necessary. In a recent blog post, Silverstein wrote, “For platforms that host user-generated content, creator content, private messaging, image or video uploads, live chat, AI-generated media, or adult content, this is not just a policy issue.

“It is an operational issue,” he continued. “A compliant policy is not enough if the platform cannot receive, review, track, remove, and prevent reposting of covered content within the required timeframe.”

The Free Speech Coalition raised similar concerns in late April.

Liability under the Take It Down Act applies to “any person who knowingly publishes [non-consensual] content using an interactive computer service,” the FSC said in a statement. “This targets the individual uploader/publisher, not the platform. … Platforms must post a clear, conspicuous, plain-language notice of their removal process and how to submit a request. … Failure to comply with the notice-and-removal obligations is treated as an unfair or deceptive act or practice under the FTC Act, enforced by the Federal Trade Commission.”
