The War on Porn

Australia Tells Search Engines to Blur Pornographic Images

The Australian flag

Some policies don’t land with a thud so much as a slow, unsettling echo — the kind that makes you sit up a little straighter and wonder what pushed things this far. That’s how it felt when Australia’s eSafety commissioner, Julie Inman Grant, announced that search engines across the country will soon be required to blur pornographic and violent images. The rule kicks in on December 27, and you can almost sense the mix of urgency and inevitability behind it.

“We know that a high proportion of this accidental exposure happens through search engines as the primary gateway to harmful content, and once a child sees a sexually violent video, for instance, maybe of a man aggressively choking a woman during sex, they can’t cognitively process, let alone unsee that content,” Grant said.

There’s something chilling about that last part — the idea of “can’t unsee.” It’s true for adults, too, but with kids it lands differently. Grant continued: “From 27 December, search engines have an obligation to blur image results of online pornography and extreme violence to protect children from this incidental exposure, much the same way safe search mode already operates on services like Google and Bing when enabled.”

Blurred results aren’t new. They’ve been tucked inside safe search settings for years, hidden like a fire extinguisher behind a glass panel — available, but only if someone remembers to activate it. What’s shifting now is the weight of responsibility: it’s no longer on parents to flip a switch; the platforms have to build the guardrails themselves. And all of this is happening alongside another major change hitting Australia on December 10, when the country’s social media ban for under-16s comes into force.

Earlier this month, it was estimated that around 150,000 Facebook accounts and 350,000 Instagram accounts belonging to under-16s would disappear from the digital landscape, like entire classrooms quietly going offline. Grant framed the larger moment this way: “These are important societal innovations that will provide greater protections for all Australians, not just children, who don’t wish to see ‘lawful but awful’ content.”

That phrase — “lawful but awful” — hangs in the air. It’s complicated, a little subjective, and strangely honest. And maybe that’s the real story here: a country trying to redraw the line between what the internet allows and what people can realistically live with.


U.S. OCC Releases Preliminary Report on Debanking

OCC Debanking Report

Some mornings, the news hits you like a jolt of cold water — shocking at first, then oddly clarifying. That’s how it felt when the U.S. Office of the Comptroller of the Currency (OCC) dropped a preliminary report on debanking, finally calling out what so many in the adult industry have been living with for years. It’s strange to feel victorious over a problem you never should’ve had in the first place, but here we are, holding something that looks a lot like validation.

The OCC names nine major banks — the kind everyone’s parents told them were “safe,” including Chase, Wells Fargo, and Bank of America — for potentially violating the President’s Executive Order against discriminating toward people engaged in “lawful business activities.” Reading it, I had one of those moments where you want to underline every other sentence because someone, somewhere in government, actually said the quiet part out loud. The OCC states that the adult industry, among others, was subjected to “inappropriate distinctions” when trying to access basic financial services:

“The OCC’s preliminary findings show that, between 2020 and 2023, these nine banks made inappropriate distinctions among customers in the provision of financial services on the basis of their lawful business activities by maintaining policies restricting access to banking services… For example, the OCC identified instances where at least one bank imposed restrictions on certain industry sectors because they engaged in ‘activities that, while not illegal, are contrary to [the bank’s] values.’ Sectors subjected to restricted access included oil and gas exploration, coal mining, firearms, private prisons, tobacco and e-cigarette manufacturers, adult entertainment, and digital assets.”

Seeing adult entertainment listed there — not as a punchline, not as an afterthought, but as a recognized target of discrimination — is surreal. It’s proof that the federal government isn’t just aware of the problem; it’s saying, pretty plainly, that the problem matters. That we matter. And for once, the burden shifts off the people running these businesses and onto the banks that have quietly punished them under the guise of “values.”

This marks the first time in a long time that the adult industry isn’t shouting into the void. The OCC has confirmed that we’re covered under the Executive Order. Banks now know that the old playbook — the one where they shut down accounts for “reputational risk” and shrug — might actually put them on the wrong side of federal policy.

There’s still a road ahead, of course. In the coming weeks and months, the OCC will move into the rule-making phase, and that’s where the shape of all this becomes real. We’ll learn more as they flesh out the details, and so will everyone who’s been denied a basic checking account simply for doing legal work that made some executive squeamish. But for the first time in years, there’s a crack of daylight. Maybe — just maybe — we’re watching the beginning of the end of a discrimination problem that never should’ve existed in the first place.

And honestly? It’s about time the people holding the money had to explain themselves.


Oh Good, Warning Labels Are Back Again by Stan Q. Brick

Cigarette warning label

Good news, everyone: The Nanny State is back and coming to a computer screen near you!

In fact, if you live in Washington state or Missouri, the Nanny State is coming to a computer screen very near you indeed, because it will be your own computer’s screen. Or smartphone screen, or smart watch screen, or pretty much any other screen you can connect to the internet.

As you may have read here on The War on Porn or elsewhere, both states are currently considering bills that would not only impose age-verification requirements on adult websites but also require such sites to publish warning notices about their content.

The Washington bill is the murkier of the two, stipulating that the warning labels to come are “to be developed by the department of health.” The Missouri bill, on the other hand, is quite specific indeed.

The legislation being pondered in Missouri would require sites to publish warnings stating that “Pornography is potentially biologically addictive, is proven to harm human brain development, desensitizes brain reward circuits, increases conditioned responses, and weakens brain function;” that “exposure to this content is associated with low self-esteem and body image, eating disorders, impaired brain development, and other emotional and mental illnesses;” and finally that “pornography increases the demand for prostitution, child exploitation, and child pornography.”

To say that these claims are disputed would be to put it mildly. Most of the evidence for these assertions is anecdotal, in part because they are very difficult to evaluate rigorously: doing so would require intentionally exposing a group of minors to pornography in the context of a clinical study, which is illegal.

Regardless of their basis in fact (or lack thereof), these labels are what attorneys and constitutional scholars call “compelled speech,” something that is a bit of a no-no under First Amendment jurisprudence and the appropriately named “compelled speech doctrine.”

As explained by David L. Hudson Jr., writing for the Free Speech Center at Middle Tennessee State University, the compelled speech doctrine “sets out the principle that the government cannot force an individual or group to support certain expression.”

“Thus, the First Amendment not only limits the government from punishing a person for his speech, but it also prevents the government from punishing a person for refusing to articulate, advocate, or adhere to the government’s approved messages,” Hudson adds.

The compelled speech doctrine has been invoked by the Chief Justice John G. Roberts-era Supreme Court as recently as Rumsfeld v. Forum for Academic and Institutional Rights.

“Some of this Court’s leading First Amendment precedents have established the principle that freedom of speech prohibits the government from telling people what they must say,” Roberts wrote for the Court in 2006.

When some folks hear about these labels, they doubtless ask themselves something like, “How is this any different from requiring cigarette packages to carry warning labels?” And that would be a good question, if cigarettes were a form of speech that presumptively enjoys protection under the First Amendment.

Beyond that distinction, there’s another obvious difference here. Cigarettes, unlike pornography, have been subjected to extensive clinical study, research which has confirmed that nicotine is addictive, and that tobacco (along with the myriad other substances found in cigarettes) is strongly associated with the development of lung cancer and various cardiopulmonary disorders and diseases.

In short, the analogy between pornography and cigarettes is a terrible one, scientifically and legally.

There was a time when I would very confidently assert that the Supreme Court will eventually reject these warning labels as textbook compelled speech and shoot down at least the labeling requirements in the bills pending in Washington and Missouri. But after their decision in Free Speech Coalition v. Paxton, I’m not so sure.

For those who like the contours of our First Amendment just the way they are, this uncertainty should be even more alarming than the warning labels the Nanny State wants us to start seeing on porn sites.


Washington AV Bill Adds Controversial ‘Health Warning’ Requirement

Washington House

OLYMPIA, Wash. — Every so often, a bill shows up that feels like it’s carrying baggage from an old argument — the kind you thought had already been settled. That’s the energy surrounding a new age-verification proposal in Washington state, which would require adult websites not only to verify users’ ages but also to post warning notices about alleged health risks — despite a previous federal court ruling that struck down similar requirements.

Sponsored by Republican Rep. Chris Corry and Democratic Rep. Mari Leavitt, HB 2112 — known as the “Keep Our Children Safe Act” — would require adult sites to verify the age of users accessing content from within the state. Structurally, it mirrors many of the AV laws already enacted or currently moving through legislatures across the country.

Where it veers into new territory is with an extra mandate: websites subject to the law would also need to display “information pertaining to the youth health risks associated with adult content” on their landing pages and in advertisements, alongside contact details for the Substance Abuse and Mental Health Services Administration helpline. It’s the kind of addition that shifts the bill from regulatory to ideological — and into legally risky waters.

Notably, the legislation doesn’t spell out what those warnings would actually say. Instead, it leaves the wording “to be developed by the department of health,” a placeholder that creates as many questions as it answers.

A similar approach surfaced just last week in Missouri, where lawmakers introduced their own bill calling for nearly identical warning requirements on adult websites.

Missouri’s proposal, HB 1831, explicitly demands that the warning language match what was originally written into Texas’ HB 1181 — the age-verification statute at the center of the landmark Supreme Court case Free Speech Coalition v. Paxton.

That ruling ultimately cleared the path for states nationwide to begin enforcing AV laws. But the court upheld a version of the Texas law that no longer included the so-called health warnings. Those provisions had already been rejected by the U.S. Court of Appeals for the 5th Circuit, which upheld a lower-court injunction finding the warnings amounted to unconstitutional compelled speech.

Given that precedent, it’s far from clear how either Missouri’s HB 1831 or Washington’s HB 2112 would survive legal scrutiny. Asked about this potential obstacle, Rep. Corry acknowledged he wasn’t aware of the 5th Circuit decision and said he planned to look into it.

“As we move through the process we can amend as needed,” Corry said. “I am sure we will hear from both proponents and opponents of the legislation. It is rare that a bill does not need amending after being introduced. I am committed to making sure all legislation follows both the federal and state constitution.”

Industry attorney Corey Silverstein remains “cautiously confident” that the bill as written won’t withstand a constitutional challenge.

“The health warnings are blatantly unconstitutional, and I can’t imagine a world where even the most conservative justices turn their back on this level of First Amendment violations,” Silverstein said. “One of the biggest drawbacks to the Paxton decision was always going to be overly aggressive legislators trying to push the envelope as far as they can. It emboldened many conservative lawmakers to push forward in their anti-porn crusade.

“Washington and Missouri won’t be the last two states to engage in these tactics,” he predicted. “Since Paxton, state lawmakers are basically trying to one-up each other as to how far they are willing to cross the line. My home state of Michigan is the best example of this madness, with the introduction of the Anticorruption of Public Morals Act, which aims for a total ban on online pornography in Michigan.

“The industry must continue to aggressively fight these laws,” he added. “Without resistance, things will only get worse.”

Proposed Warnings Contradict Public Health Experts

Earlier this year, a North Dakota House committee amended a resolution that originally sought to label pornography a “public health hazard,” dialing the language back and replacing it with a call for further research into whether that designation was even appropriate.

The original version of the resolution claimed pornography was a “critical public health issue,” arguing that adult material “perpetuate[s] the demand for sex trafficking, prostitution, child pornography, and sexual abuse images,” and asserting that it had been “linked to detrimental health effects,” including purported impacts on brain development, emotional health, intimate relationships, and the creation of sexual addiction.

At the time, state Rep. Cynthia Schreiber-Beck publicly pushed back on those claims, noting that pornography “does not fulfill the public health field’s definition of a public health crisis.”

“After doing some research… this doesn’t really fly,” Schreiber-Beck told members of the House Education Committee.

Between 2016 and 2020, several states adopted similar “public health crisis” resolutions around pornography. But health and medical authorities have consistently rejected that framing, including in a study published by the American Public Health Association.

The study’s authors concluded: “The movement to declare pornography a public health crisis is rooted in an ideology that is antithetical to many core values of public health promotion, and is a political stunt, not reflective of best available evidence.”

Sometimes laws don’t reveal their real purpose until you scan the fine print.


EU Regulators Set to Expand Age-Verification Crackdown to Smaller Adult Sites

France flag

DUBLIN — There’s a quiet shift happening behind the scenes — the kind that doesn’t make noise until suddenly it does. Ireland’s media regulator is signaling that enforcement of age-verification rules is about to widen its reach, stretching past the big, familiar adult platforms and into the long tail of smaller sites across Europe.

Digital Services Media Commissioner John Evans laid it out plainly for members of Ireland’s lower house of Parliament, warning that targeting only the giants won’t actually solve the problem. “If you come down hard on a few platforms, users, including minors, will simply move to smaller ones. So we and many other digital services coordinators are mapping below-threshold pornographic service providers and will tackle those at a national level.”

Ireland’s Online Safety Code has technically been in effect since July, requiring adult sites headquartered in the country to move beyond self-reported age confirmation and implement real age-assurance measures — a meaningful shift that has already begun reshaping compliance conversations behind the scenes.

For sites based elsewhere in the EU, oversight falls under the Digital Services Act (DSA). National digital service coordinators work together beneath that framework to enforce shared standards, including age-verification mandates intended to curb underage access.

So far, formal investigations launched by the European Commission have largely centered on the biggest names — especially those labeled “Very Large Online Platforms” (VLOPs), defined as services reaching at least 45 million monthly users. These platforms face the most stringent regulatory scrutiny, though not all have accepted that classification without legal pushback.

What’s changing now feels familiar. The strategy mirrors a move announced in September by France’s media regulator, Arcom, which declared it would begin ramping up enforcement efforts toward smaller adult platforms, rather than focusing exclusively on the market heavyweights.

That approach has found some supporters in Ireland’s parliament. During the meeting, several lawmakers reportedly pointed to France’s Law Aiming to Secure and Regulate the Digital Space (SREN) as a possible template for tougher regulation. Arcom’s aggressive cross-border enforcement has already sparked ongoing legal disputes over whether individual EU nations can compel compliance from adult sites based elsewhere inside the bloc. In September, an advocate general at the European Union’s Court of Justice offered a nonbinding opinion supporting France’s authority to enforce its AV rules on foreign-based pornographic websites. The court’s final judgment is still pending.

For Evans, though, the direction of travel feels clear — and increasingly urgent. “There is a significant amount of enforcement activity under way on age verification and pornographic services,” he told Irish lawmakers. “We are hopeful we will see changes soon.”

Sometimes the law doesn’t arrive with a bang. Sometimes it just quietly redraws the map — and only later does everyone realize the borders have moved.


Ofcom Hits AVS Group with $1.3M Penalty Over AV Violations

Ofcom logo

LONDON — The number—one million pounds—lands heavy, doesn’t it? On Wednesday, U.K. regulator Ofcom brought the hammer down on AVS Group Ltd., slapping the company with a penalty worth roughly $1.3 million after concluding it hadn’t bothered to put truly solid age checks in place across 18 adult websites. One of those moments where you read the headline and feel that quiet uh-oh ripple through the industry.

Back in July, it was reported that Ofcom had widened its lens, investigating four companies operating adult platforms to see whether they were meeting the strict age-assurance demands of the U.K.’s Online Safety Act (OSA). The law isn’t coy—if you publish pornography, you’re required to use “highly effective” age checks specifically designed to keep minors out. AVS Group was on that list.

By October, Ofcom confirmed that its inquiry had uncovered what it called “reasonable grounds” to believe AVS was falling short of those obligations, triggering a provisional notice and a 20-working-day window for the company to respond. It was the regulatory equivalent of a warning shot across the bow, a chance to explain—or fix—what already looked shaky.

That chance didn’t change the outcome. In a fresh update posted Thursday, Ofcom stated, “From 25 July 2025 until at least 25 November 2025, each of the AVS Group websites either did not implement any age assurance measures or implemented measures that were not highly effective at determining whether a user was a child.”

Zooming in on the details, the agency took particular issue with the company’s photo-upload checks, noting that AVS “deployed a photo upload check on its services that does not include liveness detection and as such is vulnerable to circumvention by children (for example, by uploading a photo of an adult). Ofcom considers that this method is not capable of being highly effective within the meaning of the Act.” In other words: nice try, not even close.

The AVS-run websites scrutinized by the regulator include pornzog.com, txxx.com, txxx.tube, upornia.com, hdzog.com, hdzog.tube, thegay.com, thegay.tube, ooxxx.com, hotmovs.com, hclips.com, vjav.com, pornl.com, voyeurhit.com, manysex.com, tubepornclassic.com, shemalez.com and shemalez.tube.

On top of the main penalty, Ofcom tacked on an additional £50,000 fine for AVS’s failure to respond properly to information requests during the investigation—a sort of regulatory side-eye for not cooperating when questions were asked.

And here’s the real mic-drop: what AVS received isn’t even close to the upper ceiling. Ofcom has the authority to levy fines of up to £18 million or 10% of a company’s qualifying global revenue, whichever is higher. Beyond money, the agency can seek court orders pushing payment providers or advertisers to cut ties—or, in the most extreme cases, require U.K. internet service providers to block access to sites entirely. It’s a reminder that in the new regulatory climate, the fines are loud… but the silence that follows losing access can be even louder.


NoFap Founder Sues Aylo, UCLA, Scientists & Academic Publisher

A lawsuit filed in Pennsylvania alleges that NoFap founder Alexander Rhodes was targeted in a years-long civil conspiracy involving Aylo (Pornhub’s parent company), UCLA, scientists Nicole Prause and David Ley, and academic publisher Taylor & Francis. Rhodes claims the defendants coordinated to silence and discredit him and NoFap by portraying the group and some of its members as aligned with extremist or pseudoscientific beliefs, and by promoting research asserting that pornography is not addictive and that NoFap is not an effective treatment. The suit casts Aylo as the central player in this alleged scheme, pointing to its legal efforts against state laws regulating porn and its ties to Ley as an expert witness, although the filing acknowledges no evidence that Aylo paid Prause or otherwise directly funded the researchers’ work.

The complaint seeks apologies, retractions, and gag orders and names dozens of journalists and other commentators whose largely factual reporting about NoFap is labeled defamatory. It frames the case not as a cultural debate but as a sweeping claim of disinformation, exploitation, and racketeering aimed at critics of the porn industry, while also accusing Taylor & Francis of trademark dilution and UCLA of aiding the alleged plot through employment of Prause. Observers note the contradiction between these claims and established academic and professional positions—such as the APA’s stance that pornography addiction is not a recognized diagnosis—raising questions about the lawsuit’s breadth and its implications for journalism and scientific inquiry.


Aylo to Move Ahead With Age Verification Across EU

EU flag

BERLIN — There’s something quietly fascinating about watching a digital giant inch toward compromise. After years of courtroom sparring and regulatory standoffs, Aylo — the parent company behind Pornhub — is preparing to take part in the European Commission’s pilot program for its new “white label” age-verification app, according to reports out of Germany.

An Aylo spokesperson confirmed the company is among the first group of participants testing the program, a notable shift for a business that has typically treated age-verification mandates with deep skepticism — and, often, outright resistance.

The European Commission rolled out the white-label AV app publicly in July, positioning it as a practical tool to help online platforms comply with child-safety guidelines mandated under the EU’s Digital Services Act (DSA). During the pilot stage, the system is being tested across Denmark, France, Greece, Italy, and Spain — sort of a dress rehearsal for what could become a broader, standardized solution.

Italy has already moved to make the tool part of its compliance framework. As regulators have made clear, any age-verification system must accommodate the white-label app. Aylo’s sites — Pornhub, YouPorn, and Redtube — have been placed on a preliminary list of high-traffic platforms expected to comply under the country’s new rules. It’s the first tangible moment where “study phase” starts shading into real-world consequences.

Elsewhere in the EU, the pressure campaign hasn’t exactly been subtle. In France, after long legal back-and-forth, Aylo chose to block French users entirely instead of implementing the country’s age-verification mandates. Germany has delivered a partial reprieve: a court temporarily halted an order that would have required telecom providers to block access to Pornhub and YouPorn due to alleged AV noncompliance. 

Much of Aylo’s cross-border legal wrangling boils down to a simple but thorny question: who actually has the authority to enforce these rules — individual EU nations or the bloc as a whole? In September, an advocate general at the European Union’s Court of Justice offered a nonbinding opinion advising that France does, in fact, have the power to require age verification from adult websites operating elsewhere in the EU but accessible to French residents. Final judgment remains pending, but the tea leaves have started to tilt toward national enforcement power. At the same time, the European Commission announced plans earlier this year to conduct a broad study evaluating how well Pornhub and other major adult platforms are complying with DSA requirements.

Whether this rising regulatory drumbeat pushed Aylo closer to cooperation — or whether the design of the EU tool itself made compromise palatable — remains an open question. The Commission’s guidelines emphasize “non-intrusiveness” as a central principle, a far cry from the heavy-handed verification models surfacing in much of the United States. That approach seems to resonate. When Aylo announced in June it would comply with the United Kingdom’s age-assurance rules under the Online Safety Act, the company praised Ofcom’s framework as “the most robust in terms of actual and meaningful protection we’ve seen to date.”

For now, there’s still a big unanswered question hanging in the air: timing. Aylo hasn’t said when — or exactly how — Pornhub users across the EU might encounter the new app. The spokesperson declined to give a rollout date, and a request for further comment remains outstanding. Until then, the industry is left with that familiar, uneasy pause — waiting to see whether this step marks a real turning point or just another cautious toe dipped into regulatory waters.


Missouri Legislator Reintroduces Proposal for Adult Site Health Warnings

Missouri flag

JEFFERSON CITY, Mo. — In Missouri’s capital, one lawmaker is trying once again to put a moral warning label on people’s browsing habits. A state representative has filed a bill that would force adult websites to post stark notices about supposed physical, mental and social harms tied to pornography — even though a federal court has already pushed back on this kind of requirement.

The measure, HB 1831, comes from Rep. Sherri Gallick, a Republican member of the Missouri House of Representatives. Her proposal is one of two fresh attempts to lock age verification for adult sites into state law, effectively turning what the attorney general tried to do unilaterally into a legislative mandate. That earlier move from the attorney general has already stirred controversy and could still end up in court. The second AV bill, HB 1878, was introduced by Rep. Renee Reuter.

While HB 1878 tracks closely with the wave of age verification laws already on the books in other states, Gallick’s HB 1831 goes further. It adds a requirement that adult sites display a prominent notice declaring, “Pornography is potentially biologically addictive, is proven to harm human brain development, desensitizes brain reward circuits, increases conditioned responses, and weakens brain function. Exposure to this content is associated with low self-esteem and body image, eating disorders, impaired brain development, and other emotional and mental illnesses. Pornography increases the demand for prostitution, child exploitation, and child pornography.” It reads less like a neutral disclosure and more like a manifesto, which is part of what makes it so legally fraught.

On top of that, the bill insists that every page of an adult site carry a footer pointing users to a helpline “for individuals and family members facing mental health or substance use disorders.” The message is clear: visiting an adult website is being framed as behavior that belongs in the same bucket as addiction and crisis.

If the language feels familiar, it’s because it is. The text is lifted directly from Texas’ HB 1181, the age verification law that eventually became the center of the U.S. Supreme Court case Free Speech Coalition v. Paxton. That case proved to be a major turning point, opening the door for enforcement of age verification laws across the country after the Court upheld the Texas statute.

But there’s a twist that matters here. The version of the Texas law that survived at the Supreme Court did not include those “health warning” provisions. They were already stripped out by the United States Court of Appeals for the 5th Circuit, which upheld a lower court’s injunction and agreed that forcing websites to post such statements amounted to unconstitutional compelled speech.

With that precedent hanging in the air, it’s hard to see a clear path forward for HB 1831 in its current form. Any serious attempt to move it could be walking straight into a legal buzzsaw.


As the ‘No Tax on Tips’ Rule Rolls Out, IRS Must Decide What Counts as Porn

IRS Building

WASHINGTON—Somewhere between the fine print and the culture war headlines, tax officials at the Internal Revenue Service now find themselves staring down an oddly philosophical assignment: deciding what counts as online pornography—and who, exactly, qualifies as an adult content creator—while implementing the Trump administration’s so-called “no tax on tips” provision tucked into the One Big Beautiful Bill Act.

The initial draft guidance for the deduction applies only to “qualified tips” earned by a limited list of approved professions, a list that notably includes “digital content creators.”

That sounds simple enough at first glance—until you realize the law never actually defines what a “digital content creator” is. The guidance carves out an exception, barring deductions tied to illegal activity, prostitution, escorting, and pornography. Tips connected to “pornographic activity are not qualified tips,” the IRS notes. Clear words, blurry boundaries.

Because what is pornography, anyway? The term doesn’t come with a neat statutory definition. Instead, IRS officials—auditors and law enforcement agents in the agency’s criminal investigations division among them—are reportedly expected to apply a looser, almost old-school standard of “obscenity,” meaning they’ll rely on something closer to “knowing it when they see it.”

The exemption itself didn’t come out of nowhere. It was pushed by a coalition of Christian nationalist and conservative advocacy groups who lobbied hard to ensure adult creators would be excluded.

In a joint letter sent in September—led by Advancing American Freedom, an organization founded by former Vice President Mike Pence—the groups urged Treasury Secretary Scott Bessent to block deductions for adult content creators or any tipped professionals engaged in what they labeled “pornographic activity.”

“Rather than distort Congress’ language, Treasury should stick to the statute: providing tax relief for the waiters, hairdressers, delivery drivers, and other workers who make up America’s proud service industries,” reads the letter dated Sept. 18, falsely framing adult content creation as a moral hazard standing outside legitimate work.

“We, the undersigned organizations, urge you to ensure pornographic content creators are not included under Treasury’s definition of eligible entities,” the groups wrote. “Our government should not give tax breaks to predatory industries that profit from exploiting young men and women, destroying marriages, families, and lives.”

The IRS guidance arrived just days after that letter went public. Among the groups involved were the Family Research Council and the Ethics and Public Policy Center. Many of the signatories are classified as anti-LGBTQ+ and anti-government hate groups by the Southern Poverty Law Center.

Other participating organizations have connections to Project 2025, the controversial Heritage Foundation initiative that has openly called for outlawing pornography altogether. Several of the same groups are also on record pushing for age-verification mandates across adult websites, mainstream social platforms, and mobile app stores.

Nate Mallory, a tax attorney who works extensively with adult industry clients, previously said the “no tax on tips” provision would almost certainly leave creators on platforms like OnlyFans, LoyalFans, Fansly, and others out in the cold. Mallory put it bluntly: “Tax policy should be based on economic principles, not moral judgments.”
