Legal Attacks

Yet Another Version of the “PROTECT Act” Introduced by Morley Safeword

Add Congressman Jimmy Patronis (R-Fla.) to the list of elected officials hellbent on repealing Section 230 of the Communications Decency Act.

In a press release issued January 14th, Patronis celebrated his introduction of H.R. 7045, AKA the “Promoting Responsible Online Technology and Ensuring Consumer Trust” (PROTECT) Act.

The argument Patronis made in support of his proposal is a well-worn one, rooted in the notion that Section 230 enables evil tech platforms to ruin America’s children by shielding those platforms from liability for things third parties publish on them.

“As a father of two young boys, I refuse to stand by while Big Tech poisons our kids without consequence,” Patronis said. “This is the only industry in America that can knowingly harm children, some with deadly consequences, and walk away without responsibility. Big Tech is digital fentanyl that is slowly killing our kids, pushing parents to the sidelines, acting as therapists, and replacing relationships with our family and friends. This must stop.”

There’s a reasonable argument to be had about whether the courts have extended Section 230’s coverage too far in some cases, but to hear people like Patronis tell it, the statute’s safe harbor provision allows “Big Tech” to do anything it pleases with total impunity.

“These companies design their platforms to hook children, exploit their vulnerability, and keep them scrolling no matter the cost,” Patronis added. “When children are told by an algorithm, or a chatbot, that the world would be better without them, and no one is being held responsible, something is deeply broken. I bet they would actually self-police their sticky apps and technologies if they knew they would have to pay big without the Big Tech Liability Protection of Section 230.”

In his press release, Patronis claims that “Section 230 shields social media companies and other online platforms from liability for content published on their sites.” This claim is a half-truth, at best. Section 230 shields social media companies from liability for content published by others on their sites. That’s an important distinction, not a distinction without a difference.

Let’s try a thought experiment here: Let’s suppose you’re a congressman whose website permits users to post comments in response to things you post on the site. Let’s further suppose one of your site’s users decides to post something defamatory about one of your colleagues. Would you want to be held directly liable for that comment? How about if instead of something defamatory, the user posted something patently illegal, like an image of a child being sexually abused; is Patronis saying my hypothetical congressman ought to go to prison in that scenario?

There are many reasons why groups like the Computer and Communications Industry Association (CCIA) are against the repeal of Section 230 – and yes, one of those reasons is that the CCIA is funded by everyone’s current favorite boogeyman, Big Tech. Another, more important reason is that the people behind the CCIA can see where this is all heading if Section 230 is repealed outright and no safe harbor at all is provided for those who offer forums in which users can publish their content and comments.

“In the absence of Section 230, digital services hosting user-created content, including everything from online reviews to posts on social media, would risk constant litigation,” the CCIA asserted in an analysis published January 12th. “Continuing to provide services optimized for user experience would require massively increased legal expenses.”

How massively would those legal expenses increase? The CCIA said, given the sheer volume of user-generated posts published in a year, if “just one post or comment in a million led to a lawsuit, digital services could face over 1.1 million lawsuits per year following a Section 230 repeal.”

“A single lawsuit reaching discovery typically costs over $100K in fees, and sometimes much more,” CCIA correctly noted. “If companies face 1.1 million lawsuits, that’s $110 billion in legal costs annually.”
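The CCIA’s arithmetic is easy to verify, though it rests on an implied volume of roughly 1.1 trillion user posts per year, a figure the analysis assumes rather than states. A minimal sketch of the math, with that assumption made explicit:

```python
# Back-of-the-envelope check of the CCIA's estimate.
# The post volume is an assumption implied by the CCIA's figures,
# not a number stated in its analysis.
posts_per_year = 1.1e12                 # assumed: ~1.1 trillion user posts annually
lawsuits = posts_per_year / 1_000_000   # "just one post or comment in a million"
cost_per_suit = 100_000                 # "over $100K in fees" to reach discovery

total_cost = lawsuits * cost_per_suit

print(f"{lawsuits:,.0f} lawsuits/year")          # 1,100,000 lawsuits/year
print(f"${total_cost / 1e9:,.0f} billion/year")  # $110 billion/year
```

Even at a far lower lawsuit rate, the exposure scales linearly with volume, which is the CCIA’s underlying point: without a safe harbor, the cost is driven by the sheer quantity of user posts, not by any platform’s actual fault.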

I suppose those who say Big Tech is the devil (while using the platforms enabled by Big Tech to say so) might think this is a good thing, but I’m not sure they’ve thought it all the way through. If social media platforms can’t operate due to overwhelming legal costs, we lose all the good things about social media, too – not to mention a whole lot of jobs when those platforms inevitably go out of business.

From the perspective of the adult industry and those who enjoy adult entertainment, repealing Section 230 would likely spell the end of platforms allowing adult content creators to post self-produced content, as well. What platform would want to risk being held strictly liable for anything and everything depicted in the videos and photos adult creators produce? It would be absolute madness for platforms like OnlyFans and its competitors to maintain their current business model in the absence of Section 230 safe harbor.

Again, for those who think porn should be abolished, that development might be seen as a feature and not a bug where the idea of repealing Section 230 is concerned. But extend that same outcome to some platform they DO like – YouTube, TikTok, Facebook, Instagram, X or what have you – and they might not like the collapse quite as much.

From where I sit, the idea of repealing Section 230 should be accompanied by that old standby of a warning: “Be careful what you wish for, because you might just get it.”

Florida Lawmaker Introduces New Bill to Repeal Section 230

WASHINGTON — Rep. Jimmy Patronis of Florida has become the latest member of Congress to float legislation that would repeal Section 230 of the Communications Decency Act, the rule that shields interactive computer services, including adult platforms, from being held responsible for user-generated content. It’s the kind of political idea that lands a little too close to home, one of those moments where you pause mid-scroll and think, Oh… this could get interesting. Or messy. Or both.

Patronis introduced HR 7045 in the House of Representatives earlier this week, slipping it into the legislative bloodstream where big ideas tend to either explode or quietly mutate over time. Sometimes you can almost hear the gears grinding behind the scenes.

A statement posted on his website declared, “For too long, the law has prevented parties harmed by online content from obtaining relief. Instead of protecting our younger generations from sensitive content, these sites prioritize profit over safety while continuing to push out harmful, explicit, and dangerous materials without any accountability.” Strong words, the kind that land heavy and don’t really leave much room for nuance.

Would-be reformers on both sides of the aisle have been taking swings at “Big Tech” for years now, accusing platforms of profiting off illegal and harmful content while hiding behind legal shields. The idea is to force companies to moderate more aggressively by making them legally responsible for what users post. Meanwhile, right-wing critics argue the same rule lets platforms censor conservative voices, and they want limits placed on how much moderation power these companies can wield. It’s like watching two very different fires being fueled by the same match.

Back in December, two other repeal bills were already making their way through Congress: HR 6746, the Sunset to Reform Section 230 Act, which would amend the law by simply adding, “This section shall have no force or effect after December 31, 2026,” and S 3546, which calls for a full repeal of Section 230 two years after enactment. The clock imagery alone makes you feel like something is quietly counting down in the background.

Industry attorneys and advocates, though, have been sounding alarms. They worry that once lawmakers start tinkering with Section 230, it opens the door to a patchwork of carve-outs — the kind that slowly chip away at protections, much like what happened with FOSTA/SESTA and its exemptions targeting sites that “unlawfully promote and facilitate” prostitution or sex trafficking. It’s rarely just one small change, is it? It’s the domino effect.

A carve-out aimed at — or even loosely touching — the adult industry would effectively gut Section 230 for those platforms. That would suddenly make sites hosting user-generated content legally responsible for what users upload, inviting a flood of civil lawsuits and uncertainty. And once that door cracks open, it’s hard not to wonder how wide it eventually swings.

Florida Candidate Floats Plan for 50% Tax on OnlyFans Earnings

Fifty percent. Just saying it out loud makes your eyebrows climb a little. Half of someone’s income — gone in the name of morality, politics, and a campaign soundbite. That’s the idea a Florida gubernatorial candidate tossed into the air this week, announcing plans to slap a 50 percent “sin” tax on OnlyFans creators if he wins in November.

James Fishback, an investor and a member of the far-right wing of the Republican Party, floated the proposal during an interview on the right-wing podcast NXR Studios, released earlier this week. It wasn’t exactly whispered. It landed with the kind of confidence that suggests he wanted people arguing about it before the episode even finished buffering.

During the interview, Fishback framed the tax as punishment for what he called the “sin” of being an online sex worker, arguing that content creators on OnlyFans should be paying a special levy for existing in that space.

“As Florida governor, in year one, I would push for the first-of-its-kind OnlyFans sin tax,” Fishback said during the interview. “If you are a so-called OnlyFans creator in Florida, you are going to pay 50 percent to the state on whatever you so-called earn via that online degeneracy platform.”

He went on to say that part of the revenue from the proposed tax would be used to fund a “czar” for men’s mental health. Fishback explained, “Men have been told for far too long that they are guilty of masculinity, that they are guilty of all of society’s ills.” It’s one of those lines that sounds designed to travel fast on social media, whether people cheer or cringe.

Additional funds, he said, would also support education and religious nonprofit “crisis pregnancy centers” that aim to discourage women from having abortions. The list of beneficiaries reads like a snapshot of his political priorities.

In what felt very much like a social media provocation, Fishback later posted about the proposal on X, reposting another user who highlighted the idea of an OnlyFans tax. He didn’t stop there.

He tagged popular Miami-based OnlyFans creator Sophie Rain and wrote, “Hey Sophie Rain, Pay up or quit OnlyFans. As Florida Governor, I will not allow a generation of smart and capable young women to sell their bodies online.” Rain, known for her viral presence and softcore content, fired back shortly after.

Irish Lawmakers Consider Tougher Age Verification Rules

There’s a certain tension in rooms like this — the kind where lawmakers sit around long tables, coffee going cold, talking about the internet as if it were a living thing that keeps slipping out of their hands. That mood hovered in the air Wednesday as Ireland’s Joint Committee on Arts, Media, Communications, Culture and Sport gathered to wrestle with online platform regulation and digital safety, including renewed calls for tougher age verification rules for adult sites.

In a written statement submitted ahead of the session, Detective Chief Superintendent Barry Walsh, who leads the Garda National Cyber Crime Bureau, didn’t mince words about what he described as “the widespread and unrestricted availability of pornography.”

“As an overarching observation, it is difficult to understand why robust age verification is not yet a standard operating procedure in respect of any platform where pornography or other child inappropriate content is either readily accessible or where there is a realistic danger that it could be accessed,” Walsh argued. “This would appear to represent a very simple, yet robust, safeguard.”

He went further, pointing to what he called “very extreme pornography that is serving to corrupt teenage males in particular into regarding this as normal, acceptable sexual behavior to be expressed in practice” — language that mirrors recent arguments from U.K. lawmakers who’ve pushed to outlaw depictions of “choking” in adult content. It’s the kind of claim that lands heavy in a room, even if everyone hears it differently depending on where they’re standing.

Ireland already has an Online Safety Code, which took effect in July 2025 and includes a requirement that adult sites headquartered in the country implement age assurance measures. On paper, at least, the framework is already there.

Sites based outside Ireland aren’t off the hook either. They fall under the European Union’s Digital Services Act (DSA), with digital service coordinators across member states working together to enforce rules, including age assurance obligations. It’s a bit like a cross-border relay race — everyone’s running, but the baton keeps moving.

The European Commission has already launched formal proceedings against several adult sites for suspected DSA breaches, though the spotlight has largely been on higher-traffic platforms so far.

In December, however, a representative from Irish media regulator Coimisiún na Meán told legislators that regulators across the EU were preparing to widen enforcement to include smaller adult sites as well. Some Irish lawmakers reportedly pushed for even tougher age verification laws, pointing to France’s Law Aiming to Secure and Regulate the Digital Space (SREN) as a possible template.

That sentiment surfaced again during Wednesday’s meeting when Senator Rónán Mullen doubled down, telling fellow committee members: “Strict age verification is necessary to shield children from the harmful effects of pornography.”

“That’s the only thing that has worked,” Mullen said. “Look at certain states in the USA where, with bipartisan support, pornography providers have gone offline because they were civilly or criminally liable if they did not ensure strict age verification.”

His argument echoes a familiar line from U.S. lawmakers who often point to site withdrawals as proof that these laws are “effective” or “working” — a claim that quietly assumes the real goal is stopping everyone from accessing adult content, not just minors. It’s one of those uncomfortable subtexts nobody quite says out loud, but you can feel it humming under the conversation.

Not everyone in the room was convinced the strategy holds up in the real world. Deputy Peter Cleere raised the practical problem of virtual private networks, which can make age verification systems easy to sidestep.

“It makes a mockery of all the regulations we want to put in place,” Cleere said. “We’re going in circles.”

And maybe that’s the lingering question — are these policies actually building safer digital spaces, or are they just drawing tighter circles around a problem that keeps slipping through the cracks?

Ofcom Sets New Rules for Where Adult Sites Must Place Age Checks

There’s something oddly symbolic about a digital front door. You don’t see what’s inside yet. You pause. You decide whether you’re allowed in. That’s essentially the vision being pushed now, as U.K. regulator Ofcom laid out new guidance on where and how adult sites should place age checks under the Online Safety Act.

In a statement posted on its website, the agency noted that while adult sites have been experimenting with different ways to position age checks, Ofcom’s preferred method is a so-called “front gate,” where users encounter “a blank landing page, with no content visible until they have completed the age check.”

The agency considers this to be the safest and most compliant way to handle age gating — basically, no peeking through the curtains before you prove you’re old enough to be there.

Other approaches to age assurance are still allowed, the agency added, as long as they truly prevent children from seeing pornography before hitting the age check. For instance:

Blur gate. If a site opts to use a “blur gate,” where users see only blurred images before the age check, the blurring must be “sufficiently strong and across enough content to ensure that the content is not pornographic.” Not a polite fog — more like a real visual wall.

Image gate. A site using an “image gate,” where users can see clearly visible images but must click a thumbnail to reach an age check, has to ensure that the images and preview videos shown beforehand are not pornographic. In other words, curiosity can’t be rewarded too generously.

In-video gate. A site using an “in-video gate,” where users see thumbnails with clearly visible images and may even watch parts of videos before being sent to an age check, would need to make sure that any video content available before the check is not pornographic. That gray zone? It’s getting a lot smaller.

Ofcom’s statement also makes it clear that content doesn’t have to show nudity or explicit sex acts to qualify as pornographic. Sometimes the tone, framing, or intent tells the real story — anyone who’s ever scrolled the internet for more than five minutes knows how slippery that line can be.

“When deciding what content is suitable to include before an age gate, services should refer to our guidance on pornographic content, considering the wider context of their site and whether the images or videos are accompanied by sexually explicit language in titles, for example … What matters is whether it is reasonable to assume the content was produced principally for the purpose of sexual arousal,” the statement clarifies.

The statement concludes, “We will continue to engage with and monitor the adult sector to identify and address non-compliance, including whether services’ placement of the age check is compliant.”

Which really translates to this: the front door is being watched, the rules are tightening, and the era of half-open windows and blurred excuses is quietly coming to an end.

Utah’s “Porn Tax”: A Levy on Paper and Ink for the Internet Age by Morley Safeword

Back in the early 1970s, the Minnesota legislature altered the state’s sales tax such that it created a “use tax” on the cost of paper and ink, while exempting the first $100,000 worth of such materials in any calendar year.

In part due to that exemption, the use tax clearly was directed at the state’s larger periodicals, including the Minneapolis Star Tribune. The Star Tribune wasn’t simply one of eleven publications incurring tax liability under the statute in the early 70s; of the $893,355 in total tax revenue collected under the statute in 1974, the Star Tribune paid $608,634 – roughly two-thirds of the total revenue collected.

The Star Tribune sued the Minnesota Commissioner of Revenue, alleging that the state’s tax scheme violated the First Amendment. The resulting case, Minneapolis Star & Tribune Co. v. Minnesota Commissioner of Revenue, was decided by the U.S. Supreme Court in 1983.

The Supreme Court ruled in favor of the newspaper, holding that the “main interest asserted by Minnesota in this case is the raising of revenue” and that while this interest was “critical to any government,” it wasn’t, by itself, enough for the law to survive scrutiny under the First Amendment.

“Standing alone, however, it cannot justify the special treatment of the press, for an alternative means of achieving the same interest without raising concerns under the First Amendment is clearly available,” Justice Sandra Day O’Connor wrote for the court, “the State could raise the revenue by taxing businesses generally, avoiding the censorial threat implicit in a tax that singles out the press.”

I’ve been thinking a lot about the Star Tribune case since first reading about SB 73, a new bill in Utah proposed by State Senator Calvin R. Musselman. There was a time when the 1983 decision would have given me confidence that Musselman’s bill wouldn’t survive court scrutiny, assuming it becomes law in the state. After the Supreme Court’s decision last summer in Free Speech Coalition v. Paxton, however, I’m a lot less certain.

Granted, there’s not a perfect analogy between the Minnesota law at issue in the Star Tribune case and the bill proposed in Utah, nor between the 1983 case and the Paxton case. But what has me feeling uneasy is the readiness of the current Supreme Court to throw aside precedent and impose a lower standard of review in cases where material alleged to be “harmful to minors” is at issue.

It’s not just reasonably well-informed laymen like me who are uncertain, either. In recent adult industry media coverage about the Utah bill, litigators who are experts in the First Amendment were divided in their analysis as well.

Speaking to XBIZ, attorney Larry Walters of FirstAmendment.com pointed out that in one recent case, “the Georgia Supreme Court upheld a 1% gross revenue tax on adult entertainment establishments in the face of a constitutional challenge.”

“But the court reasoned that the tax was justified not based on the entertainment content produced by the businesses, but on the alleged ‘adverse secondary effects’ of physical adult establishments, such as prostitution and child exploitation,” Walters noted, adding that there are “no recognized adverse secondary effects of online adult entertainment businesses. Accordingly, the Utah bill, if adopted, could be subject to a constitutional challenge as a violation of the First Amendment.”

Walters also observed that the Utah bill could face trouble in the form of the Dormant Commerce Clause, which limits states’ ability to pass legislation that discriminates against or unreasonably burdens interstate commerce.

Walters’ colleague Corey Silverstein, commenting for the same article, was less optimistic.

“After the state-by-state pummeling of AV laws, this is only the beginning of another new trend of anti-adult and anti-free-speech laws that the entire industry needs to prepare for,” Silverstein said, also predicting that a challenge to the Utah bill, should it become law, would be unlikely to succeed.

One thing is certain: The Utah “porn tax” bill won’t be the end of state governments seeking to impose further regulatory burdens on the adult entertainment industry. Emboldened by their success in establishing age verification requirements, state legislatures across the country can be relied upon to cook up additional hurdles to put in the path of adult businesses, performers, content creators and anyone else engaged in expressing themselves through sexually explicit materials.

Age Verification Push Sends Millions of Britons to Unregulated Porn Sites, Charity Says

Something strange has been happening since age verification checks quietly became part of everyday internet life in the UK. You’d think stricter rules would close doors. Instead, a lot of people seem to be wandering down darker hallways.

New research from the Lucy Faithfull Foundation suggests that nearly 45 per cent of UK porn users have visited websites without age verification checks since the rules came into force last summer under the Online Safety Act.

The poll, which surveyed more than 3,700 people across Britain, found that 39 per cent of those who visited these sites ended up watching content that made them feel uncomfortable. Even more telling, 40 per cent said what they saw was enough to put them off returning altogether.

The charity, which focuses on preventing online child sexual abuse, has warned that these unregulated spaces can quietly increase the risk of people stumbling into harmful — and illegal — material.

Vicky Young, who leads the Stop It Now UK and Ireland anonymous helpline, said these sites can become a dangerous stepping stone toward indecent images of children.

“We work with people who have looked at indecent images of children to try and address that behaviour, to help support them to change their behaviour,” she said.

“One of the things that people say to us frequently is that they started looking at legal adult pornography and that their behaviour escalated. In part they were spending maybe longer online, but also the sort of content that they were looking at became more extreme and often started getting younger, and that’s when they then crossed into illegal behaviour, so looking at indecent images of children.”

That pattern — the slow creep from curiosity to something far more serious — is what worries the charity most.

“Because of that pathway and that coming out constantly in conversations we have with people, it concerns us that if people are accessing sites where there is this riskier content, that actually they are putting themselves at a higher chance of accessing indecent images,” she said.

“Sometimes that might not be intentional in the beginning, but what people tell us is that actually, if they come across those images as part of their other pornography, that then sparks curiosity. There’s something that kind of adds to the excitement around the risk, and they don’t necessarily stop at one image. They actually then start looking for more images.”

The survey also revealed a quieter anxiety bubbling under the surface. Nearly 30 per cent of respondents said they were worried about how much pornography they consume. That concern was highest among young men aged 18 to 24, with more than half admitting it’s something that troubles them — a group the Foundation describes as particularly vulnerable.

At the same time, the rules appear to be forcing a moment of self-reflection for many. Almost 47 per cent said they’ve reduced how much pornography they watch since age checks were introduced, while 55 per cent said the changes made them stop and think about their habits.

Recent data backs that up. Enforcement of highly effective age assurance led to an immediate drop in traffic to major porn sites from late July. But as one door closed, another cracked open: VPN use surged as people looked for ways around the new barriers.

The UK’s most visited adult site recorded a drop of around 1.5 million viewers year on year, falling from 11.3 million in August 2024 to 9.8 million this August.

Meanwhile, VPN usage more than doubled after the rules came in, jumping from roughly 650,000 daily users to a peak of over 1.4 million in mid-August 2025. Although that number has since dipped, it still hovered around 900,000 by November.

In other words, fewer people are walking through the front door — but plenty are still trying the side entrance.

Dr Alexandra Bailey, head of psychology at the Foundation and an associate professor at the University of Roehampton, said the intention behind age verification is sound, but the consequences are more complicated.

“Age verification is vital to protect children, and we fully support it,” she said. “But we also need to recognise that some adults are choosing riskier sites to avoid age checks. These sites can expose people to harmful material, including illegal content depicting child sexual abuse. Even if you’re not looking for it, you could encounter it — and that can have serious life-changing consequences.”

She added that the rules have created a pause many people probably needed, but not everyone is responding in a safe way.

“Age verification is also prompting adults to reflect on their online behaviour, which can be a good thing for people worried about their porn use. But we need to address the risks for those who are turning to sites that avoid the new regulations.

“Every day, our advisors speak to people whose pornography use has spiralled into something much more harmful. We know embarrassment can stop people from reaching out, but confidential help is available. If you’re worried about your own behaviour or someone else’s, contact Stop It Now before it’s too late.”

Sometimes the most dangerous part isn’t the rule itself — it’s what people do when they decide to dodge it.

When HBO’s Industry Meets the Age Verification Reckoning

WHEN THEY DECIDED to take on age verification in their latest season, Industry cocreators Konrad Kay and Mickey Down didn’t expect to wander straight into a political minefield. It probably felt, at first, like one more sharp storyline—edgy, timely, a little dangerous in the way good TV often is. But sometimes a writers’ room accidentally opens a door to something bigger. And once it’s open, there’s no quietly closing it again.

“It was in the ether of British politics, but it wasn’t front and center when we started writing the scripts or shooting it, and then it really flared up as a kind of front-page-of-BBC topic of conversation,” Kay says.

Season 4 of HBO’s sexy, darkly funny financial drama—premiering Sunday—pushes Industry even further beyond the blood-slick trading floors that first defined it. This time, the story spills into tech, porn, age verification, and the uncomfortable politics sitting between them. Early in the season, tensions rise inside Tender, a fintech firm fresh off its IPO, as executives debate whether to keep processing payments for Siren, an adult platform in the OnlyFans mold. Siren—and other porn and gambling businesses—account for a sizable slice of Tender’s revenue. But looming threats of new age-verification laws and a rising tide of anti-porn rhetoric from the UK’s Labour Party have some leaders wondering if reputational cleanup might be more profitable than cashing controversial checks. It’s boardroom fear dressed up as moral clarity, the kind that tends to surface right before regulators do.

In the real world, the UK’s Online Safety Act—requiring age verification to access porn and other restricted content—didn’t take effect until July 2025, long after Kay and Down had mapped out this season’s arc. Still, the parallels are hard to ignore. Platforms like Pornhub saw UK traffic plunge by nearly 80 percent after the rules kicked in, and similar pressures are mounting in the U.S., where roughly half of all states now enforce some form of age-verification law. Even Capitol Hill is circling the issue: in December alone, lawmakers considered 19 bills aimed at protecting minors online. Critics, meanwhile, argue that several of those proposals stray into unconstitutional territory. It’s messy, unresolved, and very much still unfolding.

“It’s kind of shown how fragile free speech absolutism is,” says Down, pointing to the “wildly different” reactions the issue has provoked—from puritan instincts cropping up in liberal circles to a more blunt, censor-first “shut everything down” posture on the conservative side. And that tension, hanging in the air, feels like the real cliffhanger. Not who wins the argument—but what gets lost while everyone’s busy shouting.

Utah Senator Floats Porn Tax to Pay for Age-Verification Enforcement

SALT LAKE CITY—There are some ideas that arrive quietly and others that walk in like they own the place. This one does the latter. At the opening of Utah’s new legislative session, a Republican lawmaker dropped a bill that would tax online adult content, funneling the money toward age-verification enforcement and teen mental health programs.

Sen. Calvin R. Musselman, who represents the small town of West Haven, is the driving force behind Senate Bill (SB) 73. The proposal introduces what it calls a “material harmful to minors tax,” set at seven percent of the “gross receipts” from sales of content classified under that label.

SB 73 has been formally introduced but hasn’t yet landed in a committee. Even so, the odds of it clearing the legislature are widely considered high.

The bill defines “gross receipts” as “the total amount of consideration received for a transaction […] without deduction for the cost of materials, labor, service, or other expenses.” In other words, it’s the top line, not the leftovers.

And the reach is… expansive. The tax would apply to “the gross receipts of all sales, distributions, memberships, subscriptions, performances, and content, amounting to material harmful to minors that is: (a) produced in this state; (b) sold in this state; (c) filmed in this state; (d) generated in this state; or (e) otherwise based in this state.” That’s a wide net, and it’s not subtle about it.

Because of that scope, the tax wouldn’t just hit one corner of the industry. Producers, creators, platforms—anyone touching qualifying content—would likely feel it. And it wouldn’t exist in a vacuum. The levy would stack on top of existing obligations, including Utah’s digital sales tax and other state fees.

Revenue from the tax would flow into a newly created government account, earmarked for teen mental health treatment through the state Department of Health and Human Services. It’s worth noting that Utah is among the states that formally frame pornography consumption as a public health crisis, a position tied to the still-contested concept of “pornography addiction.”

The bill doesn’t stop at taxation. It also introduces a $500 annual fee, paid into accounts overseen by the Division of Consumer Protection. This so-called “notification fee” would apply to companies producing content deemed “harmful to minors” and is tied directly to age-verification compliance.

Those funds would be used by the Division to monitor compliance in a system modeled after the United Kingdom’s Ofcom framework. Companies would need to file the notification annually. Miss that step, and the penalty jumps to $1,000 per day until the paperwork—and compliance—are in order.

Utah, of course, has already been down this road. It was one of the first states to pass a statewide age-verification law structured as a “bounty law,” allowing private individuals to sue on the state’s behalf over noncompliance. That approach famously led Aylo, the owner of Pornhub, to block Utah IP addresses, just as it has done in other states with similar laws.

Utah wouldn’t be alone in adding a porn-specific tax to the mix. Alabama already has one on the books, imposing a ten percent levy on top of existing digital goods and sales taxes.

And the idea is still spreading. In Pennsylvania, a bipartisan pair of state senators recently announced plans to propose a measure that would tax online pornography subscriptions in that state’s digital marketplace.


2025: The Year Tighter Regulation Came to Town for the Online Adult Industry by Morley Safeword


When I got my start in the online sector of the adult entertainment business, back in the mid-nineties, there was no video streaming. Individual photos often took web users several minutes to download. And you hardly heard a peep from anyone suggesting that the fledgling industry needed to be reined in.

To be fair, many people were only vaguely aware of what was available on the internet at the time, let alone worried about what their kids might be looking at on there – and frankly, the web was so slow that using it exceeded the patience of a lot of kids, anyway.

Oh, how things have changed.

What evolved fastest, of course, was the technology underpinning the internet. As high-speed connectivity became the norm rather than the exception and video streaming capabilities increased year over year, online porn went from something enjoyed by a small subset of early adopters to a massive, multibillion-dollar industry. Along with those changes in technology came ever-louder calls for the online adult industry to be more tightly regulated – or regulated at all, in the still-early-internet days of the mid-nineties.

In the United States, Congress began cooking up proposals to prevent minors from accessing online porn. While these proposals enjoyed broad bipartisan support (within the legislature, at least), what they didn’t get was much support from the courts.

Early attempts to impose things like age verification requirements were slapped down by the courts, most notably in cases like Reno v. ACLU, decided in 1997. In Reno, the Supreme Court held that certain provisions of the Communications Decency Act of 1996 (“CDA”) violated the First Amendment. Specifically, the court found that the CDA’s “indecent transmission” and “patently offensive display” provisions trod upon the freedom of speech protected by the First Amendment.

What changed in 2025, as the Supreme Court again considered an age verification proposal, this time a state law passed in Texas (“HB 1181”), was in part the continued forward march of technology. But more crucially, what changed was the court’s disposition as to which “standard of review” ought to be applied.

In previous cases involving online age verification proposals, the court has applied “strict scrutiny,” a high bar that requires the government to show its actions (and laws) are “narrowly tailored” to further a “compelling government interest” and are the “least restrictive means” to further that interest.

In the case Free Speech Coalition v. Paxton, which the Supreme Court decided in June, the district court had applied strict scrutiny and found that HB 1181 failed to satisfy the standard. When the case reached the Supreme Court, however, the majority held that strict scrutiny was the wrong standard and that the correct one was “intermediate scrutiny,” which sets a much lower bar for the government.

Writing for the majority, Justice Clarence Thomas asserted that HB 1181 has “only an incidental effect on protected speech.”

“The First Amendment leaves undisturbed States’ traditional power to prevent minors from accessing speech that is obscene from their perspective,” Thomas wrote. “That power includes the power to require proof of age before an individual can access such speech. It follows that no person – adult or child – has a First Amendment right to access such speech without first submitting proof of age.”

Since the law “simply requires adults to verify their age before they can access speech that is obscene to children,” Thomas found that HB 1181 “is therefore subject only to intermediate scrutiny, which it readily survives.”

The three justices who dissented from the majority’s position didn’t see things quite the same way, naturally. In her dissent, Justice Elena Kagan criticized the majority’s holding as “confused” and highlighted the ways in which it departed from the court’s previous rulings in similar cases.

“Cases raising that question have reached this Court on no fewer than four prior occasions – and we have given the same answer, consistent with general free speech principles, each and every time,” Kagan observed. “Under those principles, we apply strict scrutiny, a highly rigorous but not fatal form of constitutional review, to laws regulating protected speech based on its content. And laws like H. B. 1181 fit that description: They impede adults from viewing a class of speech protected for them (even though not for children) and defined by its content. So, when we have confronted those laws before, we have always asked the strict scrutiny question: Is the law the least restrictive means of achieving a compelling state interest? There is no reason to change course.”

Whether there was reason to change course or not, the course has surely been changed now. Make no mistake, laws like HB 1181 are here to stay – and they will be followed by other measures designed to restrict access to sexually explicit materials online, as well as regulation that goes much further and sweeps in an even broader range of controversial content.

The old cliché about the “canary in the coal mine” has often been applied to pornography in the context of free speech discussions. Even those who don’t like or approve of porn have often warned that crackdowns on sexually explicit expression can presage attempts at regulating other forms of speech.

If indeed those of us who work in the adult industry are part of a sentinel species, the warning to our peers in the broader world of entertainment and self-expression could not be more clear, as we look out to 2026 and beyond: Here in the world of online porn canaries, we’re choking on this new regulatory push – and most likely, some of you other birds are going to be feeling short of breath too, soon enough.
