WASHINGTON — The U.S. Senate Committee on Commerce, Science, and Transportation held a hearing Wednesday on potential changes to Section 230 of the Communications Decency Act, which protects online platforms — including adult websites — from liability for user-generated content.
Three bills proposing a full repeal of Section 230 are currently pending in Congress. However, those measures were not addressed during the hearing. Instead, lawmakers focused on possible reforms to the law in a session titled “Liability or Deniability? Platform Power as Section 230 Turns 30.”
The push to revisit Section 230 stems from two primary concerns.
First, lawmakers from both parties have criticized major technology companies for allegedly profiting from harmful or illegal content while avoiding responsibility. Some argue that increased liability would encourage stronger moderation. During the hearing, Sen. Marsha Blackburn said, “Big Tech has proven they are incapable of regulating or policing themselves. They will not do it.”
Second, some conservative lawmakers argue that platforms use Section 230 protections to justify restricting certain viewpoints, particularly conservative speech. Sen. Eric Schmitt cited efforts by the Biden administration to limit the reach of COVID-19 misinformation and claims about the 2020 election, describing those actions as violations of the First Amendment.
Sen. Ted Cruz, who chairs the committee, referenced both issues, stating that Congress should act “to prevent social media from harming Americans, especially children, while not incentivizing Big Tech censorship.”
Cruz did not advocate for a full repeal of Section 230.
“I’m concerned that a full repeal or sunset would lead platforms to engage in worse behavior — to engage in more censorship to protect themselves from litigation,” Cruz said. “But we should consider whether reform of Section 230 is needed.”
Sen. Brian Schatz, the ranking Democrat present at the hearing, also supported revisiting the law.
“We can work together and fix the law,” Schatz said. “This idea that we can’t touch it, otherwise internet freedom incinerates, is preposterous.”
Possible Impact on Adult
Current proposals to reform Section 230 are not specifically directed at adult platforms, but they could have implications for the industry.
Much of the hearing focused on issues involving minors, including cases where individuals encountered harmful content or online predators. Lawmakers also discussed whether algorithmic systems and AI-generated content should be covered under Section 230 protections.
Industry attorneys and advocates have raised concerns that changes to the law could lead to targeted exemptions, similar to those created under FOSTA/SESTA, which removed liability protections for platforms found to “unlawfully promote and facilitate” prostitution or sex trafficking.
Such exemptions could expose adult platforms to increased civil litigation related to user-generated content.
While many cases could ultimately be dismissed on First Amendment grounds, Section 230 currently allows defendants to avoid prolonged litigation. As Techdirt’s Mike Masnick has written, the law “provides a procedural advantage in getting vexatious, frivolous nuisance lawsuits shut down much faster than they would be otherwise.”
Without those protections, larger companies may still be able to manage legal costs, but smaller platforms could face greater challenges.
Testifying at the hearing, Stanford Law School expert Daphne Keller said eliminating Section 230 would create legal and financial burdens that disproportionately affect smaller companies.
A world without the law, she said, “would impose legal uncertainty and expense that today’s incumbent giants could survive but their smaller rivals could not.”
Keller also noted that under other regulatory systems, platforms often receive high volumes of complaints seeking removal of lawful content.
“We have a lot of data to predict what happens when platforms are held liable for the speech of their users,” Keller said. “Platforms receive huge numbers of false allegations under laws like the DMCA here or the Digital Services Act in Europe, from people demanding the removal of perfectly legal speech. Governments do this, companies do this against their competitors — and platforms have strong incentives to simply comply.”
During the hearing, Sen. Tammy Baldwin warned against indirect government pressure on platforms.
She cautioned against “informal, often coercive efforts by government officials to pressure private companies into moderating or removing content that they cannot legally censor directly.”
Keller, in written testimony, cited actions by Federal Communications Commission chair Brendan Carr, including pressure directed at ABC that temporarily affected comedian Jimmy Kimmel’s program.
Carr also contributed to Project 2025’s “Mandate for Leadership,” which calls for changes to Section 230 and argues that pornography should not be protected under the First Amendment.
“Pornography should be outlawed,” the document states. “The people who produce and distribute it should be imprisoned.”
The document has been cited as a policy framework for the current administration. Other officials associated with the administration have also expressed support for restrictions on adult content. Trump advisor Russell Vought has discussed limiting pornography through indirect regulatory approaches, while Vice President Vance has called for a ban.