The War on Porn

Iowa Attorney General Files Lawsuit Against Instagram Over Alleged Harm to Minors

DES MOINES, Iowa — The fight over what kids see online has landed, again, in a courtroom—this time with Meta Platforms facing a lawsuit from Iowa’s top prosecutor, who says the stakes aren’t abstract at all.

Attorney General Brenna Bird filed the case in Iowa District Court for Polk County, which includes Des Moines, alleging that the company’s social media platforms contribute to addiction and harm among young users in the state.

Bird, a Republican, brought the lawsuit just days after juries in California and New Mexico state courts found that similar platforms are designed in ways that foster addictive behavior.

In a statement, Bird said, “Instagram says their content is safe for kids. It’s not. And Instagram was designed to get our children addicted to it, causing harm to their mental health and physical safety.”

The lawsuit seeks financial damages for the state, citing alleged violations of consumer protection and deceptive marketing laws. It also claims that Instagram exposes teenage users to explicit material.

“Defendants allow rampant profanity, sexual content and nudity, alcohol, tobacco, and drug use and references, and mature/suggestive themes on the Instagram platform, including readily accessible hardcore pornography,” the lawsuit states.

“Defendants use human and computer moderators to police the content on Instagram, but those moderators either systematically fail or apply internal policies that allow these types of content to remain on the platform,” it adds.

The case now moves forward in state court, where questions about responsibility, design, and the reach of social media platforms are expected to take center stage.

Arizona Senate Drops Conflicting Provision From Consent Bill

PHOENIX — Lawmakers in Arizona have revised a bill targeting online adult content, removing language that had raised concerns it could make compliance effectively unworkable for many sites.

According to Free Speech Coalition Director of Public Policy Mike Stabile, the organization worked closely with the sponsor of HB 2133 to shape the amendments.

“We worked hard in Arizona to educate lawmakers as to the realities of our business and to secure amendments that reflect that,” Stabile said.

Arizona’s HB 2133, titled the “Protect Act,” is part of a broader legislative push aimed at addressing nonconsensual intimate images online, including content generated using artificial intelligence. The bill also introduces new verification and consent requirements for adult websites.

Under those provisions, adult sites would need to use “reasonable” verification methods to confirm that individuals depicted in sexual material are at least 18 years old and have given consent. They would also be required to keep records of that verification for a minimum of seven years.

Earlier language in the bill created what some described as a compliance dilemma. It defined “reasonable” verification methods as affidavits confirming age and consent, verification by an independent third party, or “any other commercially reasonable method that does not retain identifying information after the verification is complete.”

The revised Senate version removes the phrase “that does not retain identifying information after the verification is complete,” addressing concerns that the requirement conflicted with the obligation to maintain records.

Stabile said, “Our issue here was that our members are not able to delete model releases and other consent documents. That is fixed.”

In addition to that change, amendments also removed provisions that would have allowed inspections by the attorney general without a warrant and restricted the sharing of documentation with federal, state, and local law enforcement.

While affidavits and third-party verification remain listed as options, the updated bill does not require them if another commercially reasonable method is used. For many operators, affidavits can be difficult to implement because they typically require notarization, while third-party systems may introduce additional complications with record retention.

“When I spoke with the sponsor, I explained the methods that we currently use,” Stabile said. “While individuals and organizations affected by the law should review their practices and potential liability with their own counsel, the bill as amended lists out ways of satisfying the consent verification requirement. ‘Any other’ commercially reasonable method should allow just that.”

The amended bill also includes language allowing websites to demonstrate that certain material was created before the enactment of the Child Protection and Obscenity Enforcement Act of 1988, instead of requiring age and consent verification for that content. The provision is intended to address challenges in verifying older material.

Industry attorney Corey Silverstein said that while the updated language may provide more flexibility, the broader concerns remain.

“Regardless of the ‘softening’ of the language, this is all prior restraint on free speech,” Silverstein said. “The notion that this will actually protect children is absolute nonsense.”

The amended bill has been sent back to the Arizona House of Representatives for further consideration.

Trump and the Pledge: The Shot Not (Yet) Fired in the War on Porn by Stan Q. Brick

It’s hard for me to believe 2016 was only 10 years ago. In the years since, we’ve had a pandemic, a handful of Olympic Games and 1.25 Trump Administrations. We’ve also had a few wars, as well as one “short-term excursion” that bears a strong resemblance to a war, at least to my untrained eye.

In a less literal sense of the word war, we’ve also seen an escalation in the War on Porn, at least with respect to state laws in the United States and international laws and regulations globally.

Interestingly, the one force that hasn’t fired the sort of shot it certainly could squeeze off in the War on Porn has been the federal government. While various bills that would impact the adult entertainment industry are being considered in the House and Senate, what we haven’t seen is an effort on the part of the current Department of Justice to prosecute federal obscenity crimes, other than cases involving child sexual abuse material (“CSAM”).

This brings me back to 2016. That was the year then-presidential candidate Donald Trump signed the “Children’s Internet Safety Presidential Pledge,” an oath authored by the anti-porn activism group Enough is Enough.

Under the terms of the pledge, Trump promised that, if elected, he would “uphold the rule of law by aggressively enforcing existing federal laws to prevent the sexual exploitation of children online, including the federal obscenity laws,” and appoint an Attorney General “who will make the prosecution of such laws a top priority in my administration and give serious consideration to appointing a Presidential Commission to examine the harmful public health impact of internet pornography on youth, families and the American culture.”

I call out the phrase “including the federal obscenity laws” because it points to obscenity prosecutions involving content created by and for adults, not CSAM. (There are always lots of prosecutions for crimes involving CSAM, as there should be.)

During the first Trump Administration and the first 14 months of the second one, the DOJ has initiated zero obscenity prosecutions of that sort, so far as I’m aware. I can’t speak to whether Trump ever gave “serious consideration” to appointing a Presidential Commission like the one the pledge describes, but no such Commission has been established.

On the one hand, the fact that Trump hasn’t followed through on the pledge he signed is unsurprising. Like every politician, Trump is happy to make promises in one moment and forget all about them in the next. On the other hand, if the current polls are to be believed, he could use a boost – and making a move that appeals to a big segment of his support base might have more appeal than it did previously.

I’m not saying I expect the Trump Administration to suddenly start indicting American pornographers for alleged violation of federal obscenity laws. All I’m saying is it wouldn’t shock me if they did, even if only to bask in the glow of positive publicity coming from the socially conservative “Christian Right” after the press conference at which the prosecutions are announced.

I can almost hear that press conference now:

“These guys we indicted for obscenity are the worst of the worst. They made filthy, disgusting pornography at a level nobody has ever seen before, not even my super smart uncle who taught at MIT and whose great brain was almost as big as mine. And now we’ve totally obliterated their ability to make any more of their sick, demented, often foreign pornography – unless they agree to a deal, which they really should, because we have so much evidence. I’m telling you, we have so much evidence, the jury will be tired of all the evidence we have. And that’s why we have to get rid of paper ballots and mail-in voting.”

I sure hope I’m wrong. Among other things, I’m sick of hearing about his uncle. Except the whole lying about his uncle’s connection to the Unabomber thing. That was awesome.

OCC, FDIC Bar Regulators From Using ‘Reputation Risk’ in Bank Oversight

WASHINGTON — Federal banking regulators on Tuesday finalized a rule removing “reputation risk” as a factor in supervising financial institutions.

Under the new rule, the Office of the Comptroller of the Currency and the Federal Deposit Insurance Corporation are barred from “criticizing or taking adverse action against an institution on the basis of reputation risk.” The rule also bars the agencies from “requiring, instructing, or encouraging an institution to close an account, to refrain from providing an account, product, or service, or to modify or terminate any product or service” based on a customer’s political, social, cultural or religious views, constitutionally protected speech, or lawful business activity viewed as presenting reputation risk.

The action follows an Aug. 7, 2025, executive order directing financial institutions not to deny or limit services to customers engaged in lawful activities on political grounds.

After that order, the OCC released a report on debanking that identified several sectors facing account closures or service restrictions, including the adult entertainment industry, citing concerns among banks about alignment with internal standards.

In March, Federal Trade Commission Chairman Andrew Ferguson issued warnings to payment processors such as PayPal, Stripe, Visa and Mastercard regarding practices that restrict access to services based on lawful but higher-risk activities.

The impact of the rule on industries that have reported difficulties accessing banking services remains uncertain. Although the OCC report identified adult entertainment as one of the affected sectors, regulators have not provided additional detail on how the new rule will be applied in practice.

While the rule prevents the OCC and FDIC from penalizing institutions for serving customers engaged in “politically disfavored but lawful business activities perceived to present reputation risk,” it does not limit banks’ ability to make decisions based on other supervisory considerations, including “safety and soundness.” Institutions may continue to restrict services under those criteria.

The Free Speech Coalition submitted comments in support of the proposed rule and recommended expanding its scope to apply more directly to banks. Those proposals were not adopted in the final version.

“The rule removes a key driver of banking discrimination against the adult industry,” said Free Speech Coalition Executive Director Alison Boden. “Federal examiners can no longer pressure banks to close accounts or deny services to lawful businesses based on reputation risk. It’s not going to solve all of our problems, but it’s a necessary piece of securing fair banking access for our industry.”

Italian Court Ruling in Aylo Case Restricts Cross-Border Enforcement of Age Verification Rules

ROME — An Italian administrative court has ruled that the country’s recently enacted age verification rules can’t, at least for now, be enforced against sites based elsewhere in the European Union.

Last year, Italy’s communications regulator AGCOM said all platforms hosting adult content would need to implement age verification systems to keep out users under 18. The timeline was clear enough: Italy-based sites had to comply by Nov. 12, 2025, while sites operating from other EU countries were given until Feb. 1, 2026.

AGCOM also published a preliminary list of 45 providers it believed would fall under the rule. Many were among the most visited adult sites online, including the Aylo-operated platforms Pornhub, YouPorn and RedTube.

Aylo challenged the rules on several fronts, prompting the Regional Administrative Court for Lazio to pause enforcement while it took a closer look. A hearing followed on March 11.

In a ruling released Tuesday, the court sided with Aylo on procedural grounds, finding that AGCOM’s rules do not fully align with the “country of origin” principle set out in the EU’s Directive on Electronic Commerce. The principle provides that online services are generally regulated by the laws of the country where they are established.

The court found that while exceptions to that principle may be justified, certain conditions must be met. The ruling states that a country where content is accessible cannot impose additional obligations unless it first asks the provider’s home country to take action, determines that such action is insufficient, and notifies both the European Commission and the country of origin before adopting restrictive measures.

According to the court, AGCOM did not follow this procedure, meaning companies would have been required to comply immediately without their states of origin having the opportunity to adopt corrective measures.

If the ruling stands, AGCOM will need to complete those steps before enforcing Italy’s age verification law against Aylo or other platforms based in EU member states. The decision may be appealed to Italy’s Council of State.

The ruling applies only to EU-based platforms and does not affect enforcement against sites based outside the European Union.

Cross-Border Legal Framework Still Emerging

While the court upheld Aylo’s complaint based on procedural grounds, it rejected the company’s argument that protection of minors online falls exclusively under the authority of the European Commission in areas covered by the EU’s Digital Services Act.

Aylo, which is based in Cyprus, has been involved in similar legal disputes in other EU countries. In Germany, a court found that the “country of origin” principle limits the ability of individual member states to impose additional national requirements in areas already addressed by EU law.

In France, an advocate general of the EU’s Court of Justice issued a non-binding opinion advising that France can require pornographic websites based in other EU states to comply with its national age verification rules. That case remains pending.

The Italian court stated that efforts to harmonize laws protecting minors online across EU member states are still ongoing. Until that process is complete, the court said, member states may adopt national measures, provided they comply with EU law and the Directive on Electronic Commerce.

“In the absence of a harmonized and mandatory solution at the Union level, Member States may adopt national measures which — while respecting the principles of Union law — ensure a high level of protection for minors,” the ruling states.

The court also indicated that broader harmonization may depend in part on the rollout of the EU Digital Identity Wallet.

The European Commission has launched a pilot program for an age verification application designed to integrate with digital wallets and support compliance with age verification requirements under the Digital Services Act. Italy’s regulations require that age verification systems be compatible with this application, which is also being tested in Denmark, France, Greece and Spain.

Aylo has participated in the pilot program.

Wisconsin Governor Blocks Age Verification Measure

MADISON, Wis. — Gov. Tony Evers on Friday vetoed AB 105, an age verification bill that would have allowed individuals to sue adult content providers for alleged failures to verify users’ ages, with penalties of up to $10,000 per violation.

In a letter to the state Assembly, Evers said he vetoed the bill due to concerns about its impact on personal privacy, citing issues commonly raised by privacy advocates.

“While I agree that we should protect children from harmful material, this bill imposes an intrusive burden on adults who are trying to access constitutionally protected materials,” Evers writes. “The bill requires all users, regardless of age, who are attempting to access certain websites to turn over sensitive, personally identifiable information. While the bill does not allow a person who verifies an individual’s age to retain identifying information, nothing in the bill prohibits the transmission of such information to a third party such as a data broker or the government. This is a violation of personal privacy.

“Additionally, I am concerned about data security and the potential for misuse of personally identifiable information,” the statement continues. “Identifiable information could be intercepted by or transmitted to a third party and used as the basis for blackmail or identity theft. Further, although the bill includes penalties for a business entity who violates the prohibition on retention of personal information, those penalties cannot undo the harm that may occur to an individual who is the victim of actions like blackmail or identity theft as a result of a bad actor obtaining their identity.”

Earlier versions of the bill included a requirement that websites block virtual private network traffic, but that provision was removed during the amendment process.

The final version of the bill also included language stating that “sovereign immunity” could not be used as a defense for failure to implement age verification. The purpose of that provision is unclear in this context, as the term typically applies to governments rather than private entities.

As industry attorney Lawrence Walters said, “Sovereign immunity is raised as an affirmative defense by a state, not a private company.”

The provision may have been intended to address jurisdictional challenges. In February, a federal judge dismissed lawsuits against two adult websites in Kansas over alleged violations of that state’s age verification law, finding that the plaintiff had not demonstrated that the sites specifically targeted Kansas users. An amended complaint in a separate, ongoing case seeks to establish jurisdiction by arguing that the adult site SuperPorn intentionally targeted residents of the state.

Woodhull Freedom Foundation Report: Age-Verification Laws Impacting Sex Educators

WASHINGTON, D.C., April 2, 2026 — A new report from the Woodhull Freedom Foundation finds that age-verification laws aimed at restricting access to online pornography are having broader effects on sex educators and sexual health professionals. While the laws were promoted as a way to limit minors’ access to explicit content, the report indicates that many professionals say the measures are affecting access to educational resources.

Preliminary findings from a national survey conducted in March show a wide-ranging impact across the field. Among those surveyed, sex educators reported the highest level of concern. According to the data, 73% of sex educators said they are concerned that age-verification laws will affect their work, practice, or resources, while 76% said they fear the laws could be used to further restrict access to sex education and related materials. Eighteen percent of educators said the laws have already affected their work, a figure that rises to 33% among educators in states with existing mandates.

The report also found that concerns extend beyond educators. Among all sexual health professionals surveyed, 58% said they are worried that age-verification laws could be used to limit access to sex education and other resources, while 53% reported concern about the potential impact on their work or practice.

“Age-verification laws are already impacting sex education in the US,” said Ricci Joy Levy, President and CEO of the Woodhull Freedom Foundation. “Again and again, we were told this was only about keeping minors from accessing porn. Woodhull warned these vague and overly broad policies would also result in censorship of vital, non-explicit information about sex and gender, and the data bear this out. The current age-verification protocols are ripe for abuse, and educators are right to be scared.”

Since 2023, nearly half of U.S. states have enacted laws requiring age verification to access material defined as “harmful to minors.” Lawmakers have described the measures as a way to prevent minors from accessing pornographic websites. However, the report notes that the definition of “harmful to minors” is broad and has been applied in some states to restrict access to sex education and LGBTQ+ content for individuals under 18.

The Woodhull survey was conducted between March 3 and March 28, 2026, and distributed through professional networks and organizations. Respondents included professionals working in sex education, research, mental health services, relationship counseling, reproductive care, wellness, and advocacy. A total of 56 respondents completed the survey during its initial phase.

The findings represent an early assessment of how age-verification laws are affecting sexual health and education. The Woodhull Freedom Foundation stated that it plans to expand the survey in the coming months to gather additional data on how different populations and areas of practice are being impacted.

For more information on the survey, including additional data, contact Woodhull Freedom Foundation at info@woodhullfoundation.org.

FSC Releases Policy Statement on Utah’s Recently Passed SB 73

LOS ANGELES — The Free Speech Coalition issued a policy statement outlining Utah’s recently passed SB 73, which revises the state’s age verification law and introduces a new 2% excise tax on certain digital content revenue.

According to the policy statement, beginning October 1, 2026, entities subject to Utah’s age verification requirements must pay a 2% excise tax on sales of access to digital images, audio-visual works, audio works, digital books, or gaming services, including streaming and subscription-based access. The statement also notes that changes to the state’s existing age verification law will take effect May 6, 2026.

The policy statement states that under the updated law, “Any individual actually located in Utah is considered to be accessing the site from Utah, regardless of whether they are using a VPN or proxy server to mask their location.” It adds that sites may not facilitate or encourage the use of VPNs or proxy services to circumvent age verification requirements.

The Free Speech Coalition further reports that enforcement authority has been expanded. The Utah Division of Consumer Protection is authorized to investigate violations, issue citations, impose administrative fines of up to $2,500 per violation, and initiate court actions. Courts may impose additional civil penalties of up to $2,500 per violation, order disgorgement of revenue, and award damages. Violations of administrative or court orders may result in penalties of up to $5,000 per violation. The Division will coordinate enforcement efforts with the Utah Attorney General’s Office and the Internet Crimes Against Children Task Force, while private civil actions remain available.

The policy statement also describes a new safe harbor provision, stating that a commercial entity will be considered compliant if it uses an age verification method that meets standards to be established by the Utah Division of Consumer Protection.

In the policy statement, the Free Speech Coalition notes, “We realize that it is difficult, if not impossible, to accurately determine if a visitor is using a VPN and, if so, if that a person disguising their location via VPN might be from Utah.” The organization adds, “We do not have a good answer as to how to be compliant with this law, short of treating every potential visitor as if they were coming from Utah.”

The statement continues, “This, of course, is untenable. We are working with members, partners and allies to find potential solutions and will keep members abreast of any developments.”

The policy statement further clarifies that, unlike Alabama’s gross revenue tax, Utah’s tax applies only to specified digital content. It also notes that the enforcement structure is self-funding, with excise tax revenue and civil penalties directed to the Division of Consumer Protection’s enforcement activities. The Utah Legislature has appropriated $4,000,000 in initial funding for implementation.

Finally, the Free Speech Coalition states that the Division of Consumer Protection has been granted rulemaking authority to establish age verification standards and safe harbor criteria, indicating that compliance requirements may evolve after the May 6 effective date. The organization said it will monitor rulemaking activity and provide updates.

Being Opposed to Age Verification Measures Doesn’t Mean You Hate Kids by Morley Safeword

Over the years, many friends and peers who, unlike me, do not work in the adult entertainment industry have expressed their confusion to me over why the adult industry opposes things like age verification mandates for porn websites.

Who could be against preventing kids from accessing online porn, right? Isn’t it just sensible and reasonable to impose the same limitations we have in the brick-and-mortar world on the internet, when it comes to accessing pornography?

Frankly, if verifying a person’s age online were as simple and straightforward as checking a patron’s ID as they walk into an adult shop, I doubt many (if any) folks in the adult industry would be against it. The problem is, checking someone’s ID on the internet isn’t remotely the same as doing so in person – and the circumstances requiring a merchant or service provider to check IDs aren’t nearly as straightforward as they are in the physical world, either.

As state-level age verification mandates continue to proliferate across the U.S. and around the globe, it’s fair to say most legislators are firmly in favor of these measures. And while plenty of people are noting the downsides of age verification statutes, including emerging evidence that such regulation tends to drive users to darker, less safe corners of the web, there seems to be no slowing the momentum of the drive to impose age verification globally.

As governments continue to adopt these rules, it falls to those of us who must live under them to at least try to keep those governments honest, ensuring that if they must pass these laws and establish these regulations, they at least do so under a set of clear, easily understood and narrow definitions.

Spoiler alert: Staying within clear, easy-to-understand definitions typically isn’t a strong suit of governments or legislators.

In critiquing social media age restrictions, which are supported by arguments that closely track those made in favor of age verification mandates for adult sites, the Electronic Frontier Foundation notes that in the UK, the process for deciding which platforms and sites are subject to the rules is “devoid of checks or accountability mechanisms as ministers will not be required to demonstrate specific harms to young people, which essentially unravels years-long efforts by Ofcom to assess online services according to their risks.”

“And given the moment the UK is currently in, such as refusing to protect trans and LGBTQ+ communities and flaming hostile and racist discourses, it is not unlikely that we’ll see ministers start restricting content that they ideologically or morally feel opposed to, rather than because the content is harmful based, as established by evidence and assessed pursuant to established human rights principles,” adds Paige Collings for EFF.

Collings adds that we already know from actions taken in other jurisdictions, including the U.S., “that legislation seeking to protect young people typically sweeps up a slew of broadly defined topics.”

“Some block access to websites that contain some ‘sexual material harmful to minors,’ which has historically meant explicit sexual content,” Collings notes. “But some states are now defining the term more broadly so that ‘sexual material harmful to minors’ could encompass anything like sex education; others simply list a variety of vaguely defined harms. In either instance, this bill would enable ministers to target LGBTQ+ content online by pushing this behind an under-18s age gate, and this risk is especially clear given what we already know about platform content policies.”

In other words, these regulations are going to be applied in ways the sponsors of age verification legislation never copped to when crafting or debating the laws. In some cases, the people who wrote the laws may not even have foreseen the problem created by their loose approach to statutory construction.

At this point, we probably can’t stop the forward march of the age verification mandate trend. More of these laws will be written, more will be passed and, given the Supreme Court’s ruling in Free Speech Coalition v. Paxton, more will survive court scrutiny.

What we can do, as citizens of the jurisdictions covered by these age verification mandates, is make our voices heard on the more problematic aspects of these laws. We can contact our legislators, lobby for changes to the laws, lobby for better definitions and support candidates who show a willingness to listen. After all, a law surviving court scrutiny doesn’t mean we have to like it – or that we must stop telling our elected officials we don’t like it.

The bottom line is, porn continues to be popular, whether the people who would like to ban it (or effectively regulate it into a dark corner) like it or not. If people who enjoy adult entertainment are willing to speak up, we may not be able to strike a decisive blow in the War on Porn, but we can at least mitigate the collateral damage.

Ofcom Expands Probes Into FTV and Web Prime Over Age-Check Compliance

LONDON — Ofcom has issued a provisional finding that First Time Videos, the operator of FTVGirls.com and FTVMilfs.com, may be in breach of age assurance obligations set out in the Online Safety Act.

The regulator had previously launched an inquiry to determine whether the company’s platforms were using “highly effective” age checks to prevent access by minors.

After completing that review, Ofcom said it has “provisionally determined that there are reasonable grounds to believe that First Time Videos LLC has failed to comply with section 81 of the Online Safety Act,” according to a statement published on its website.

The notice of contravention sets out the basis for Ofcom’s findings and the enforcement steps under consideration. The company has been given 20 working days to respond, and the regulator said any submissions “will be carefully considered before reaching a final decision.”

Separately, Ofcom said it is widening the scope of an existing investigation into Web Prime Inc., which operates www.anysex.com, www.fapality.com, www.mylust.com, www.xcafe.com and www.yourlust.com. The inquiry, which began in September 2025, now includes both compliance with age verification rules and a potential failure to respond to a formal information notice issued in February.

The regulator’s enforcement powers under the law include fines of up to 18 million pounds or 10% of a company’s qualifying worldwide revenue, whichever is greater. It may also apply to the courts to require payment providers or advertisers to withdraw services from a platform, or direct internet service providers to block access to sites within the U.K.

In a separate matter, Ofcom said it has concluded its investigation into Duplanto Ltd., the operator of www.pornhaven.ai, after the company restricted access to users with U.K. IP addresses, “reducing the likelihood that children in the U.K. will be exposed to pornographic content present on its service.”
