Commentary

The Human Cost of Overregulation by Morley Safeword

Over the decades I’ve worked in the adult entertainment business, it has struck me many times how concerned the industry’s critics appear to be about the welfare of those of us who work in the industry – and how quickly that concern turns to consternation and scorn, should we insist that we’re doing what we do gladly and of our own free will.

“Nonsense,” the critics say, “these poor souls only think they are engaging in this depravity willingly; the truth is they have been brainwashed, coerced, cajoled and manipulated into believing they want to participate in this filth.”

Granted, not a lot of people have spilled ink along these lines to fret over the wellbeing of freelance writers like me. I think we’re counted as being among the exploiters, rather than the exploited, or perhaps as enablers of exploitation. Still, there’s no denying I derive my living, meager though it may be, from adult entertainment, even if all I do is write about it, rather than perform in or film it.

While many of the regulations aimed at the adult industry are couched as attempts to protect minors from the alleged harm of viewing pornography, when these measures are discussed by their proponents, “success” is often defined as making the adult industry retreat from their jurisdiction altogether. If a site like Pornhub blocks visitors from an entire state, including all the adults in that state who are still legally entitled to access the site even under newly established age verification mandates, those who cooked up the laws often describe this development as a sign the law is “working.” As I’ve written before, the chilling effect is a feature of these measures, not a bug.

By the same token, if a new law or regulation makes it harder for adult content creators to make their own movies, distribute their own photos or perform live on webcams, that too is something to be celebrated by the legislators and activists who champion those regulations.

Gone is all thought or discussion of the wellbeing of adult content creators and performers, once the potential cause of harm is the law itself. This holds true of purported “anti-trafficking” statutes. While sex workers themselves largely oppose measures like FOSTA/SESTA and say the law has made them less safe, not more, the proponents and sponsors of such legislation don’t want to hear it. Yes, these paternalistic politicos and crusading critics will protect these wayward adults from themselves, even if it kills them.

I can only imagine that if a state legislator from any of the dozens of states that have passed age verification requirements were to learn that adult content creators (and the platforms that host their work) are having a harder time earning a living under these new regulatory schemes, their response would be brief and callous: “Good,” they’d probably say, “now they can go out and look for more respectable work!”

And what happens when former porn performers do find work in other fields? The stigma of porn follows them. They get fired. They are told their mere presence in a classroom is disruptive. They are hounded on social media. They are treated like pariahs by the very people who supposedly care about their welfare.

A law or regulation can be well-intended and still do harm. I don’t doubt some of the politicians involved in crafting age-verification laws and other purportedly protective regulations believe they are doing things in the best interests of both minors and the adults who work in porn, or in the broader world of sex work. But it’s hard to believe they truly care about the latter two when there’s so little thought given to the potential negative impact on them during the crafting of these laws.

As more states toy with the idea of establishing a “porn tax,” will any of them pause to consider the impact on the human beings targeted by such taxes? I’d strongly advise against holding your breath while waiting for that sort of concern to be expressed.

Click Here to Keep Clicking Here, So You Can Click There (Eventually) by Stan Q. Brick

I was on the lookout for something to write about. “I know,” I thought, “I’ll see what the latest news is to come out of Ofcom, the UK’s regulatory authority for broadcast, internet, telecommunications and postal services!”

In the old days, days I remember with great fondness, I could have just typed Ofcom.org.uk into the address bar of my browser and I’d be there, reading the latest from Ofcom. Not anymore – because now, even to read a drab, dull regulatory agency’s website, I must first satisfy a machine’s demand that I prove I’m human.

No big deal. Just a simple captcha test (one probably easily defeated by a sophisticated enough bot, tbh) and I’m on my way… sort of. Which is to say I would be on my way, except now I must read a disclosure about cookies, perhaps adjust some settings and then “accept” or “save” or “surrender to” those preferences, or whatever the verbiage might be.

This is using the internet now, apparently. Instead of “surfing” and the freedom of movement that term suggests, it’s more like navigating a joyless obstacle course, in which I’m required to verify my age and/or my very humanity as I hop from step to step.

I’m sure this seems to many people like an overstated complaint. “So what?” they might say. Why is it a big deal to verify minor details like my age, or to have my internet path blocked in one way or another, based largely on where I live and where the site I’m accessing is located?

People used to call the internet the “information superhighway.” While this was an admittedly irritating buzz phrase, the term did at least capture the sense that the internet was something largely unfettered, where data, entertainment, information, misinformation and all manner of expressive content was available to all those able to access it.

Now, despite the fact I’ve been an adult for nearly 40 years, every time I turn around while online, I’m being asked to verify the fact of my adulthood anew. (Yes, I do visit a lot of porn sites; it sort of comes with the territory of – you know – working in and writing about the online porn industry.)

I understand a lot of people are hot to make the internet “safer,” but to me, this desire betrays an ignorance of what the internet is – or if not an ignorance of its nature, a stubborn desire to transform the internet into something else. But the internet, whatever else it might be, is a massive computer network whose best feature has always also been its worst: Virtually anyone can publish virtually anything on it.

Slap as many age gates and regulations as you’d like on a massive, global computer network; you’re still just engaging in an endless game of whack-a-mole. Ofcom itself reported that after the requirement that adult sites employ “Highly Effective Age Assurance” (HEAA) methods took effect, VPN usage in the UK more than doubled, “rising from about 650k daily users before 25 July 2025 and peaking at over 1.4m in mid-August 2025.”

Ofcom is undeterred by numbers like these, of course. Its inevitable answer will be to impose restrictions on VPN use, because like any government regulatory agency, if there’s one thing Ofcom cannot tolerate, it’s the sense that it can’t control what falls within its remit to tame.

Speaking of Ofcom, when I did finally satisfy its system that I’m a human who doesn’t want to spend a lot of time deciding which cookies he does and doesn’t want attaching to his browser, what I found was an explanation of – and almost an apology for – the upper limit of the agency’s regulatory reach with respect to AI chatbots.

After stating with apparent pride that it was “one of the first regulators in the world to act on concerning reports of the Grok AI chatbot account on X being used to create and share demeaning sexual deepfakes of real people,” Ofcom goes on to explain that “not all AI chatbots are regulated” by the agency.

“Broadly, the Online Safety Act regulates user-to-user services, search services and services that publish pornographic content,” Ofcom explained. (They don’t say so, but just for your edification, this limited scope is due to sexually explicit depictions being awful, youth-corrupting and inherently sinister, while depictions of people getting shot in the head or beaten bloody with lead pipes are fine.)

On the other hand, “AI chatbots are not subject to regulation if they… only allow people to interact with the chatbot itself and no other users (i.e. they are not user-to-user services); do not search multiple websites or databases when giving responses to users (i.e. are not search services); and cannot generate pornographic content.”

Ofcom ends its notice with a how-to guide on reporting anything you find online “that you think might be harmful or illegal.”

I’d try reporting Ofcom’s website itself for harmful content, because I sure feel like I’m getting dumber just by reading it… but I suspect to execute this vengeful little practical joke, I’d have to pass at least three captcha tests, verify my age seven times and produce some manner of HCPN (“Highly Compelling Proof of Netizenship”).

You know what? I think I’ll just read a book. So far as I’m aware, I’m not required to present ID to grab an old tome off the shelves in my study… yet.

Conservative Push for Porn Taxes Sparks Constitutional Backlash

It feels like the walls are closing in a little more every week. As age-verification laws continue to reshape—and in some cases dismantle—the adult industry, a Utah lawmaker has now stepped forward with a bill that would slap a new tax on porn sites operating in the state. It’s the kind of proposal that makes you pause, reread the headline, and wonder how we got here so fast.

Introduced by Republican state senator Calvin Musselman, the bill would impose a 7 percent tax on total receipts “from sales, distributions, memberships, subscriptions, performances, and content amounting to material harmful to minors that is produced, sold, filmed, generated, or otherwise based” in Utah. If passed, it would take effect in May and require adult sites to pay an additional $500 annual fee to the State Tax Commission. According to the legislation, revenue from the tax would be directed to Utah’s Department of Health and Human Services to expand mental health support for teens.

A new strain of American conservatism is asserting itself more boldly, and lawmakers across the US are calling for tighter restrictions on adult content. In September, Alabama became the first state to introduce a porn tax—10 percent on adult entertainment companies—after passing age-verification mandates that require users to upload ID or other personal documentation before accessing explicit material. Pennsylvania lawmakers are also exploring a proposal that would tack an extra 10 percent tax onto subscriptions and one-time purchases from online adult platforms, despite already charging a 6 percent sales and use tax on digital products, two state senators wrote in an October memo. Other states have flirted with similar ideas before. In 2019, Arizona state senator Gail Griffin, a Republican, proposed taxing adult content distributors to help fund a border wall during Donald Trump’s first term. To date, 25 US states have enacted some form of age verification.

Efforts to criminalize sex workers and regulate the industry have been unfolding for years, accelerating alongside increased online surveillance and censorship. Yet targeted taxes have repeatedly stalled, in part because the legality of such measures remains deeply contested.

“This kind of porn tax is blatantly unconstitutional,” says Evelyn Douek, an associate professor of law at Stanford Law School. “It singles out a particular type of protected speech for disfavored treatment, purely because the legislature doesn’t like it—that’s exactly what the First Amendment is designed to protect against. Utah may not like porn, but as the Supreme Court affirmed only last year, adults have a fully protected right to access it.”

Utah, Alabama, and Pennsylvania are among 16 states that have adopted resolutions declaring pornography a public health crisis. “We realize this is a bold assertion not everyone will agree on, but it’s the full-fledged truth,” Utah governor Gary Herbert tweeted in 2016 after signing the resolution. Utah’s early response to the spread of adult content dates back to 2001, when it became the first state to establish an office focused on sexually explicit material by appointing an obscenity and pornography complaints ombudsman. The role—often referred to as the “porn czar”—was eliminated in 2017.

“Age restriction is a very complex subject that brings with it data privacy concerns and the potential for uneven and inconsistent application for different digital platforms,” Alex Kekesi, vice president of brand and community at Pornhub, said previously. In November, the company urged Google, Microsoft, and Apple to adopt device-based verification across app stores and operating systems. “We have seen several states and countries try to impose platform-level age verification requirements, and they have all failed to adequately protect children.” To comply with existing mandates, Pornhub has blocked access to users in 23 states.

Critics argue that age verification has never truly been about protecting children, but about quietly scrubbing porn from the internet. In 2024, a leaked video from the Centre for Climate Reporting showed Russell Vought, a Trump ally and Project 2025 coauthor, describing age-verification laws as a “back door” to a federal porn ban.

Platforms like OnlyFans and Pornhub have pushed sex work further into the mainstream, but they’ve also made it easier to monitor and police both performers and audiences. As states consider new taxes and penalties, it’s creators who are most likely to absorb the shock.

The cultural conservatism taking shape under Trump 2.0 is driven by a desire to punish sexual expression, says Mike Stabile, director of public policy at the Free Speech Coalition, the US adult industry’s trade association. “When we talk about free speech, we generally mean the freedom to speak, the ability to speak freely without government interference. But in this case, free also means not having to pay for the right to do so. A government tax on speech limits that right to those who can afford it.”

OnlyFans says it complies with all tax requirements in the jurisdictions where it operates, while creators remain responsible for their own tax obligations. Pornhub, which is currently blocked in Utah and Alabama, did not respond to a request for comment.

Douek points out that while states can regulate minors’ access to explicit material following the Supreme Court’s decision upholding Texas’ age-verification law, “a porn tax does nothing to limit minors’ access to this speech—it simply makes it more expensive to provide this content to adults.” A 2022 report from Common Sense Media found that 73 percent of teens aged 13 to 17 had viewed adult content online. Today, much of that exposure happens through social media platforms like X and Snap. A recent survey from the UK’s Office of the Children’s Commissioner found that 59 percent of minors encounter porn accidentally—up from 38 percent the year before—mostly via social feeds.

In Alabama, as would be the case in Utah, revenue from the porn tax is earmarked for behavioral health services, including prevention, treatment, and recovery programs for young people.

Last year, Alabama state representative Ben Robbins, the Republican sponsor of the bill, said adult content was “a driver in causing mental health issues” in the state. It’s a familiar claim among lawmakers advocating for a nationwide porn ban. While some studies suggest adolescent exposure to porn may correlate with depression, low self-esteem, or normalized violence, medical experts have never reached a clear consensus.

As lawmakers increasingly frame the issue around harm to minors, Stabile says it’s crucial to remember that adult content is not a special category outside the bounds of free expression. Courts have repeatedly struck down content-specific taxes as unconstitutional censorship.

“What if a state decided that Covid misinformation was straining state health resources and taxed newsletters who promoted it? What if the federal government decided to require a costly license to start a podcast? What if a state decided to tax a certain newspaper it didn’t like?” he says. “Porn isn’t some magical category of speech separate from movies, streaming services, or other forms of entertainment. Adult businesses already pay taxes on the income they earn, just as every other business does. Taxing them because of imagined harms isn’t just dangerous for our industry—it’s a dangerous expansion of government power.”

‘An Embarrassment’: Critics Slam UK’s Proposed VPN Age Checks

It started the way these things always seem to start lately—with a vote that felt small on paper and enormous everywhere else. Politicians, technologists, and civil society groups reacted with visible dismay after the House of Lords backed a move that would ban children from using VPNs and force providers to roll out age verification.

The backlash was swift. Wikipedia co-founder Jimmy Wales blasted the decision on X, calling the UK’s position an embarrassment. Windscribe CEO Yegor Sak had already summed up the idea as the “dumbest possible fix,” warning that forcing age checks on VPNs would set a deeply troubling precedent for digital privacy.

By Tuesday morning, the argument had spilled fully into the open. Online debate surged, with X logging more than 20,000 posts on the issue in just 24 hours—one of those moments where you can almost hear the internet arguing with itself.

Labour, Lords & VPN laws

Last week, the House of Lords voted in favor of an amendment to the Children’s Wellbeing and Schools Bill that would, in effect, bar anyone under 18 from using VPNs.

The proposal would require commercial VPN providers to deploy mandatory age assurance technology, specifically to stop minors from using VPNs to bypass online safety measures. It sounds tidy in theory. In reality, it opens a can of worms no one seems eager to fully acknowledge.

Notably, the government itself opposed the amendment. Instead, it has opened a three-month consultation on children’s social media use, which includes a broader look at VPNs and how—or whether—they should be addressed.

Political pushback

Even though the House of Lords has shown its hand, the proposal now heads to the House of Commons, where it’s expected to hit serious resistance from the Labour government.

If the Commons throws it out, as many expect, the Lords will have to decide whether to dig in and trigger a round of parliamentary “ping-pong” or quietly step aside.

Labour’s Lord Knight of Weymouth, who voted against the amendment, suggested there’s little appetite for a drawn-out fight. He told TechRadar that it’s unlikely politicians will “die in a ditch” over banning VPNs.

In his view, many lawmakers are chasing “something iconic” on child safety—something headline-friendly—rather than wading into the technical swamp that regulating VPNs would require.

That said, Knight didn’t dismiss the broader concern. He argued that regulator Ofcom “needs to do better” at enforcing existing safety laws and agreed that more should be done to protect children online, provided it’s handled “carefully.” That word—carefully—does a lot of work here.

Civil society’s response

Regardless of whether this particular amendment survives, one thing is clear: VPNs are under a brighter spotlight than ever, and not just in the UK.

In the United States, lawmakers in Wisconsin are pushing a bill that would require adult websites to block access from users connected via a VPN. In Michigan, legislators have floated ideas around ISP-level blocking of circumvention tools. Different routes, same destination.

Evan Greer, director of the US-based group Fight for the Future, warned that policies designed to discourage or ban VPN use “will put human rights activists, journalists, abuse survivors and other vulnerable people in immediate danger.”

Fight for the Future is running a campaign that lets users contact lawmakers directly, arguing in an open letter that the ability to use the internet safely and privately is a fundamental human right.

Back in the UK, a public petition is urging the government to reject any plan that would effectively ban VPNs for children.

The Open Rights Group has also been vocal, pointing out that detecting or banning VPN use isn’t realistically possible without resorting to what it calls an “extreme level of digital authoritarianism.”

The reaction shows no sign of softening. Wales and Sak have both repeated their criticisms since the vote, and the debate online has barely cooled. When it comes to privacy, the internet rarely stays quiet for long.

Alabama’s Latest Adult Content Law Pushes Creators Out, Not Toward Safety

Most adult creators didn’t need a push notification to feel it. The moment the news started circulating, it landed with a familiar weight: Clips4Sale has restricted access in Alabama after the passage of House Bill 164, a law that introduces notarized consent requirements for performers and platforms. The company frames the decision as compliance—necessary, even prudent. Creators read it differently. To many, it felt like the ground quietly disappearing beneath their feet.

Both interpretations can coexist. And maybe that’s the most unsettling part.

Legislation like Alabama’s is almost always sold as “protective.” The language is comforting, even noble—designed to reassure the public that something dangerous is being handled. But when you listen to the people living under these laws—performers, indie creators, small operators—the tone shifts. What comes through isn’t relief. It’s confusion. Anxiety. A creeping sense that they’re being legislated out of existence without anyone actually talking to them.

House Bill 164 didn’t arrive out of nowhere. It’s part of a broader pattern unfolding across the country, where states are targeting adult platforms through new consent rules, age checks, and documentation standards. On paper, they sound reasonable. In reality, they unravel fast.

What they create isn’t safety. It’s splintering.

A Law That Misses the Reality of the Industry

Adult performers aren’t operating without rules. They never have been. For decades, the industry has been bound by strict federal record-keeping requirements—ID verification, age documentation, signed releases. These systems already exist. They’re already enforced. They’re already audited. And they’re treated seriously because the penalties for failure are brutal.

Which is exactly why Alabama’s law sparked disbelief instead of reassurance.

Adult performer Leilani Lei cut through the noise on X by asking a simple question: do lawmakers actually understand what notarization does? A notary verifies identity and witnesses a signature. That’s the full job description. They don’t assess consent. They don’t evaluate content. They don’t make legal judgments. Requiring notarization doesn’t increase safety—it adds friction, expense, and logistical chaos.

Is a notary expected on every set? For every solo clip? For content created privately by independent performers in their own homes? These aren’t dramatic hypotheticals. They’re practical questions that expose how disconnected the law is from how adult work actually functions.

When laws ignore operational reality, compliance stops being ethical and starts being geographic. Platforms block states. Creators lose access. Income vanishes—not because of misconduct, but because following the rules becomes impossible.
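
For readers wondering what “blocking a state” actually looks like under the hood, the sketch below shows the usual approach: look up each visitor’s IP address in a geolocation database and refuse service if it resolves to a restricted state. It assumes MaxMind’s GeoLite2 City database and the geoip2 Python library; the blocked-state list and function name are purely illustrative, and nothing here is meant to describe how Clips4Sale or any other specific platform actually implements its blocks.

```python
# Minimal sketch of IP-based state geoblocking (illustrative only).
# Assumes MaxMind's GeoLite2-City.mmdb database file and the geoip2 package
# (pip install geoip2). The blocked-state list is hypothetical.
import geoip2.database
import geoip2.errors

BLOCKED_US_STATES = {"AL", "UT"}  # hypothetical set of states a platform has chosen to block

reader = geoip2.database.Reader("GeoLite2-City.mmdb")

def is_blocked(ip_address: str) -> bool:
    """Return True if the visitor's IP geolocates to a blocked US state."""
    try:
        response = reader.city(ip_address)
    except geoip2.errors.AddressNotFoundError:
        return False  # location unknown; allowing it here is itself a policy choice
    if response.country.iso_code != "US":
        return False
    state = response.subdivisions.most_specific.iso_code  # e.g. "AL" for Alabama
    return state in BLOCKED_US_STATES

# A web framework would call this on every request and serve an HTTP 451
# ("Unavailable For Legal Reasons") page when it returns True.
```

Even this toy version makes the bluntness obvious: geolocation databases are imperfect, VPNs route around them entirely, and every visitor whose IP resolves to a blocked state, creator and customer alike, is turned away indiscriminately.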

When “Protection” Quietly Becomes Economic Damage

One consequence of laws like HB 164 rarely gets discussed out loud: money.

Adult creators aren’t faceless entities. They’re people paying rent, covering medical bills, supporting families. For many, digital platforms aren’t side hustles—they’re lifelines. When a state gets geoblocked, creators living there lose their audience instantly. When platforms restrict access, creators with fans in that state watch sales drop overnight.

Cupcake SinClair’s response on X captured the mood perfectly—not panic, but dread. Not fear of regulation itself, but fear of where this path leads. If these laws keep spreading—each state tweaking the rules just enough—what does the landscape look like in a year? Two years? Does access slowly shrink until it’s determined entirely by ZIP code?

That’s not protection. That’s erosion.

And while platforms like Clips4Sale may view geoblocking as the least damaging option on the table, the fallout doesn’t land on the platform. It lands on creators. The backlash reflects more than anger—it reflects a growing sense that major decisions are being made without creators in the room, reshaping livelihoods without alternatives or support.

From the creator’s side, these aren’t abstract compliance choices. They translate into fewer customers, lower visibility, and more instability in an already fragile industry.

The Patchwork Problem Everyone Pretends Isn’t a Problem

One of the most dangerous aspects of this legislative trend is how inconsistent it is.

Each state passes its own version of “protective” law, often without coordination, consultation, or technical understanding. The result is a patchwork of requirements no platform can realistically meet across the board. What’s compliant in one state may be illegal in the next.

For massive tech companies, patchwork laws are an inconvenience. For adult platforms—already operating under heavier scrutiny, higher fees, and greater risk—they can be fatal.

For independent creators, they’re destabilizing by design.

When lawmakers ignore the cumulative effect of these laws, compliance becomes less about doing the right thing and more about surviving. Platforms that can’t afford bespoke, state-by-state systems opt out entirely. Creators are left scrambling to adapt, relocate, or rebuild somewhere else.

Who Is Actually Being Protected Here?

Supporters of laws like HB 164 often speak in moral absolutes. They invoke exploitation, trafficking, consent—serious issues that deserve serious responses.

But when legislation refuses to distinguish between criminal behavior and lawful adult work, it ends up punishing the latter while barely touching the former.

Bad actors don’t notarize forms. They don’t operate transparently. They don’t comply with documentation requirements. Meanwhile, compliant creators and legitimate platforms absorb the cost of laws that don’t meaningfully address wrongdoing.

Protection that collapses under scrutiny isn’t protection. It’s performance.

A Future Built on Exclusion Isn’t a Fix

The adult industry isn’t asking for no rules. It’s asking for rules that reflect reality.

That means lawmakers engaging with performers, platforms, and legal experts who understand how consent, documentation, and digital distribution actually work. It means recognizing that piling on procedural hurdles doesn’t automatically make anyone safer—and that cutting off access often harms the very people these laws claim to defend.

If this trend continues unchecked, the future of adult content in the U.S. won’t look like reform. It will look like retreat. More geoblocking. More platform withdrawals. More creators pushed out of legitimate marketplaces and into less secure corners of the internet.

That outcome serves no one—not performers, not platforms, and not the public.

Until the conversation moves beyond slogans and starts grappling with consequences, laws like Alabama’s will keep feeling less like protection and more like disappearance.

Yet Another Version of the “PROTECT Act” Introduced by Morley Safeword

Add Congressman Jimmy Patronis (R-Fla.) to the list of elected officials hellbent on repealing Section 230 of the Communications Decency Act.

In a press release issued January 14th, Patronis celebrated his introduction of H.R. 7045, AKA the “Promoting Responsible Online Technology and Ensuring Consumer Trust” (PROTECT) Act.

The argument Patronis made in support of his proposal is a well-worn one, rooted in the notion that Section 230 is enabling evil tech platforms to ruin America’s children by shielding those platforms from liability for things published on them by third parties.

“As a father of two young boys, I refuse to stand by while Big Tech poisons our kids without consequence,” Patronis said. “This is the only industry in America that can knowingly harm children, some with deadly consequences, and walk away without responsibility. Big Tech is digital fentanyl that is slowly killing our kids, pushing parents to the sidelines, acting as therapists, and replacing relationships with our family and friends. This must stop.”

There’s a reasonable argument to be had about whether the courts have extended Section 230’s coverage too far in some cases, but to hear people like Patronis tell it, the statute’s safe harbor provision allows “Big Tech” to do anything it pleases with total impunity.

“These companies design their platforms to hook children, exploit their vulnerability, and keep them scrolling no matter the cost,” Patronis added. “When children are told by an algorithm, or a chatbot, that the world would be better without them, and no one is being held responsible, something is deeply broken. I bet they would actually self-police their sticky apps and technologies if they knew they would have to pay big without the Big Tech Liability Protection of Section 230.”

In his press release, Patronis claims that “Section 230 shields social media companies and other online platforms from liability for content published on their sites.” This claim is a half-truth, at best. Section 230 shields social media companies from liability for content published by others on their sites. That’s an important distinction, not a distinction without a difference.

Let’s try a thought experiment: Suppose you’re a congressman whose website permits users to post comments in response to things you post on the site. Now suppose one of your site’s users decides to post something defamatory about one of your colleagues. Would you want to be held directly liable for that comment? And what if, instead of something defamatory, the user posted something patently illegal, like an image of a child being sexually abused; is Patronis saying my hypothetical congressman ought to go to prison in that scenario?

There are many reasons why groups like the Computer and Communications Industry Association (CCIA) are against the repeal of Section 230 – and yes, one of those reasons is that the CCIA is funded by everyone’s current favorite boogeyman, Big Tech. Another, more important, reason is that the people behind the CCIA can see where this is all heading if Section 230 is outright repealed and no safe harbor at all is provided for those who offer forums in which users can publish their content and comments.

“In the absence of Section 230, digital services hosting user-created content, including everything from online reviews to posts on social media, would risk constant litigation,” the CCIA asserted in an analysis published January 12th. “Continuing to provide services optimized for user experience would require massively increased legal expenses.”

How massively would those legal expenses increase? The CCIA said, given the sheer volume of user-generated posts published in a year, if “just one post or comment in a million led to a lawsuit, digital services could face over 1.1 million lawsuits per year following a Section 230 repeal.”

“A single lawsuit reaching discovery typically costs over $100K in fees, and sometimes much more,” CCIA correctly noted. “If companies face 1.1 million lawsuits, that’s $110 billion in legal costs annually.”
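
For what it’s worth, the CCIA’s arithmetic holds up; a quick back-of-the-envelope check using their own stated assumptions is below. Note that the implied volume of roughly 1.1 trillion posts per year is my extrapolation from the one-in-a-million figure, not a number the CCIA itself states.

```python
# Back-of-the-envelope check of the CCIA's figures, using their stated assumptions.
lawsuits_per_year = 1.1e6        # CCIA: "over 1.1 million lawsuits per year"
cost_per_lawsuit = 100_000       # CCIA: "over $100K in fees" per suit reaching discovery
posts_per_lawsuit = 1_000_000    # CCIA: "just one post or comment in a million"

implied_posts_per_year = lawsuits_per_year * posts_per_lawsuit   # ~1.1 trillion posts (my extrapolation)
total_legal_cost = lawsuits_per_year * cost_per_lawsuit          # $110 billion

print(f"{implied_posts_per_year:.2e} posts/year -> ${total_legal_cost:,.0f}/year in legal costs")
```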

I suppose those who say Big Tech is the devil (while using the platforms enabled by Big Tech to say so) might think this is a good thing, but I’m not sure they’ve thought it all the way through. If social media platforms can’t operate due to overwhelming legal costs, we lose all the good things about social media, too – not to mention a whole lot of jobs when those platforms inevitably go out of business.

From the perspective of the adult industry and those who enjoy adult entertainment, repealing Section 230 would likely spell the end of platforms allowing adult content creators to post self-produced content, as well. What platform would want to risk being held strictly liable for anything and everything depicted in the videos and photos adult creators produce? It would be absolute madness for platforms like OnlyFans and its competitors to maintain their current business model in the absence of Section 230 safe harbor.

Again, for those who think porn should be abolished, that development might be seen as a feature and not a bug where the idea of repealing Section 230 is concerned. But extend that same outcome to some platform they DO like – YouTube, TikTok, Facebook, Instagram, X or what have you – and they might not like the collapse quite as much.

From where I sit, the idea of repealing Section 230 should be accompanied by that old standby of a warning: “Be careful what you wish for, because you might just get it.”

Utah’s “Porn Tax”: A Levy on Paper and Ink for the Internet Age by Morley Safeword

Back in the early 1970s, the Minnesota legislature altered the state’s tax code to create a “use tax” on the cost of paper and ink consumed in producing publications, while exempting the first $100,000 worth of such materials in any calendar year.

In part due to that exemption, the use tax clearly was directed at the state’s larger periodicals, including the Minneapolis Star Tribune. The Star Tribune wasn’t simply one of eleven publications incurring tax liability under the statute in the early 70s; of the $893,355 in total tax revenue collected under the statute in 1974, the Star Tribune paid $608,634 – roughly two-thirds of the total revenue collected.

The Star Tribune sued the Minnesota Commissioner of Revenue, alleging that the state’s tax scheme violated the First Amendment. The resulting case, Minneapolis Star & Tribune Co. v. Minnesota Commissioner of Revenue, was decided by the U.S. Supreme Court in 1983.

The Supreme Court ruled in favor of the newspaper, holding that the “main interest asserted by Minnesota in this case is the raising of revenue” and that while this interest was “critical to any government,” it wasn’t, by itself, enough for the law to survive scrutiny under the First Amendment.

“Standing alone, however, it cannot justify the special treatment of the press, for an alternative means of achieving the same interest without raising concerns under the First Amendment is clearly available,” Justice Sandra Day O’Connor wrote for the court, “the State could raise the revenue by taxing businesses generally, avoiding the censorial threat implicit in a tax that singles out the press.”

I’ve been thinking a lot about the Star Tribune case since first reading about SB 73, a new bill in Utah proposed by State Senator Calvin R. Musselman. There was a time when the 1983 decision would have given me confidence that Musselman’s bill wouldn’t survive court scrutiny, assuming it becomes law in the state. After the Supreme Court’s decision last summer in Free Speech Coalition v. Paxton, however, I’m a lot less certain.

Granted, there’s not a perfect analogy between the Minnesota law at issue in the Star Tribune case and the bill proposed in Utah, nor between the 1983 case and the Paxton case. But what has me feeling uneasy is the readiness of the current Supreme Court to throw aside precedent and impose a lower standard of review in cases where material alleged to be “harmful to minors” is at issue.

It’s not just reasonably well-informed laymen like me who are uncertain, either. In recent adult industry media coverage about the Utah bill, litigators who are experts in the First Amendment were divided in their analysis, as well.

Speaking to XBIZ, attorney Larry Walters of FirstAmendment.com pointed out that in one recent case, “the Georgia Supreme Court upheld a 1% gross revenue tax on adult entertainment establishments in the face of a constitutional challenge.”

“But the court reasoned that the tax was justified not based on the entertainment content produced by the businesses, but on the alleged ‘adverse secondary effects’ of physical adult establishments, such as prostitution and child exploitation,” Walters noted, adding that there are “no recognized adverse secondary effects of online adult entertainment businesses. Accordingly, the Utah bill, if adopted, could be subject to a constitutional challenge as a violation of the First Amendment.”

Walters also observed that the Utah bill could have trouble looming for it in the form of the Dormant Commerce Clause, which limits states’ ability to pass legislation that discriminates against or unreasonably burdens interstate commerce.

Walters’ colleague Corey Silverstein, commenting for the same article, was less optimistic.

“After the state-by-state pummeling of AV laws, this is only the beginning of another new trend of anti-adult and anti-free-speech laws that the entire industry needs to prepare for,” Silverstein said, also predicting that a challenge to the Utah bill, should it become law, would be unlikely to succeed.

One thing is certain: The Utah “porn tax” bill won’t be the end of state governments seeking to impose further regulatory burdens on the adult entertainment industry. Emboldened by their success in establishing age verification requirements, state legislatures across the country can be relied upon to cook up additional hurdles to put in the path of adult businesses, performers, content creators and anyone else engaged in expressing themselves through sexually explicit materials.

GitHub Purges Adult Game Developers, Offers No Explanation

Something strange started rippling through a small, niche corner of the internet not long ago. Developers who build mods and plugins for hentai games and even interactive sex toys began waking up to missing repositories, locked accounts, and dead links. GitHub, the place many of them had treated like home base, had quietly pulled the rug out. No warning. No explanation. Just… gone.

From conversations within the community, the rough headcount quickly took shape: somewhere between 80 and 90 repositories, representing the work of roughly 40 to 50 people, vanished in a short window. Many of the takedowns seemed to cluster around late November and early December. A large number of the affected accounts belonged to modders working on games from Illusion, a now-defunct Japanese studio known for titles that mixed gameplay with varying degrees of erotic content. One banned account alone reportedly hosted contributions from more than 30 people across 40-plus repositories, according to members of the modding scene.

What made the situation feel especially surreal was the silence. Most suspended developers say they were never told which rule they’d broken—if any. Their accounts simply stopped working. Several insisted they’d been careful to stay within GitHub’s acceptable use guidelines, avoiding anything overtly explicit. The code was functional, technical, sometimes cheeky in naming, but never pornographic. At least, not in the way most people would define it.

“Amongst my repositories there were no explicitly sexual names or images anywhere in the code or the readme, the most suggestive naming would be on the level of referencing the dick as ‘the men thing’ or referencing the sex as ‘huffing puffing,’” one developer, Danil Zverev, told me. He makes plugins for an Illusion game called Koikatsu. Zverev said he’s been using GitHub for this purpose since 2024, but on November 18, his GitHub page was “completely deleted.” “No notifications anywhere, simply a 404 error when accessing the page and inability to log in on the web or in the mobile app. Also it does not allow me to register a new account with the same name or email.”

The timing raised eyebrows. GitHub had updated its acceptable use policies in October 2025, adding language that forbids “sexually themed or suggestive content that serves little or no purpose other than to solicit an erotic or shocking response, particularly where that content is amplified by its placement in profiles or other social contexts.” The policy explicitly bars pornographic material and “graphic depictions of sexual acts including photographs, video, animation, drawings, computer-generated images, or text-based content.”

At the same time, the policy leaves room for interpretation. “We recognize that not all nudity or content related to sexuality is obscene. We may allow visual and/or textual depictions in artistic, educational, historical or journalistic contexts, or as it relates to victim advocacy,” GitHub’s terms of use state. “In some cases a disclaimer can help communicate the context of the project. However, please understand that we may choose to limit the content by giving users the option to opt in before viewing.”

Zverev didn’t bother appealing. He said writing to support felt pointless and chose instead to move on to another platform. Others tried to fight it—and found themselves stuck in limbo.

A developer who goes by VerDevin, known for Blender modding guides, utility tools, and plugins for the game Custom Order Maid 3D2, said users began reporting trouble accessing his repositories in late October. Oddly, he could still see his account when logged in, but not when browsing while logged out.

“Turned out, as you already know, that my account was ‘signaled’ and I had to purposefully go to the report section of Github to learn about it. I never received any notifications, by mail or otherwise,” VerDevin told me. “At that point I sent a ticket asking politely for clarifications and the proceedings for reinstatement.”

The response from GitHub Trust & Safety was vague and procedural: “If you agree to abide by our Terms of Service going forward, please reply to this email and provide us more information on how you hope to use GitHub in the future. At that time we will continue our review of your request for reinstatement.”

VerDevin replied the next day, agreeing to comply and offering to remove whatever GitHub considered inappropriate—despite still not knowing what that was. “I did not take actual steps toward it as at that point I still didn’t know what was reproach of me,” they said.

A full month passed before GitHub followed up. “Your account was actioned due to violation of the following prohibition found in our Acceptable Use Policies: Specifically, the content or activity that was reported included multiple sexually explicit content in repositories, which we found to be in violation of our Acceptable Use Policies,” GitHub wrote.

“At that point I took down several repositories that might qualify as an attempt to show good faith (like a plugin named COM3D2.Interlewd),” they said. GitHub restored the account on December 17—weeks later, and just one day after additional questions were raised about the ban—but never clarified which content had triggered the action in the first place.

Requests for explanation went unanswered. Even when specific banned accounts were flagged to GitHub’s press team, the response was inconsistent. Some accounts were reinstated. Others weren’t. No clear reasoning was ever shared.

The whole episode highlights a problem that feels painfully familiar to anyone who’s worked on the edges of platform rules: adult content policies that are vague, inconsistently enforced, and devastating when applied without warning. These repositories weren’t fringe curiosities—they were tools used by potentially hundreds of thousands of people. The English-speaking Koikatsu modding Discord alone has more than 350,000 members. Another developer, Sauceke, whose account was suspended without explanation in mid-November, said users of his open-source adult toy mods are now running into broken links or missing files.

“Perhaps most frustratingly, all of the tickets, pull requests, past release builds and changelogs are gone, because those things are not part of Git (the version control system),” Sauceke told me. “So even if someone had the foresight to make mirrors before the ban (as I did), those mirrors would only keep up with the code changes, not these ‘extra’ things that are pretty much vital to our work.”
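
Sauceke’s point about mirrors is worth unpacking. A command like git clone --mirror copies everything Git itself stores (branches, tags, full commit history), but issues, pull request threads, and uploaded release files live only in GitHub’s own database and have to be fetched separately through the platform’s REST API. The sketch below shows roughly what backing up release assets involves; the repository name and output folder are hypothetical, and this is only an illustration of the general approach, not anything the developers quoted here are necessarily using.

```python
# Rough sketch: back up a repository's release assets alongside a git mirror.
# A plain mirror (run separately) covers code and history:
#     git clone --mirror https://github.com/OWNER/REPO.git
# but it does NOT include issues, pull request threads, or uploaded release files.
# This fetches release assets via GitHub's public REST API. The repository name
# and output directory are hypothetical; unauthenticated requests are rate-limited.
import os
import requests

OWNER, REPO = "example-owner", "example-repo"   # hypothetical repository
OUT_DIR = "release_backup"
os.makedirs(OUT_DIR, exist_ok=True)

resp = requests.get(
    f"https://api.github.com/repos/{OWNER}/{REPO}/releases",
    headers={"Accept": "application/vnd.github+json"},
    timeout=30,
)
resp.raise_for_status()

for release in resp.json():
    print(f"Backing up release {release['tag_name']}")
    for asset in release.get("assets", []):
        # browser_download_url points at the file the developer uploaded for this release
        data = requests.get(asset["browser_download_url"], timeout=60)
        data.raise_for_status()
        path = os.path.join(OUT_DIR, f"{release['tag_name']}_{asset['name']}")
        with open(path, "wb") as f:
            f.write(data.content)
```

Issues and pull request data can be exported in much the same way through the corresponding API endpoints, which is roughly the kind of preemptive archiving developers now have to do if they want a ban-proof copy of their projects.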

GitHub eventually reinstated Sauceke’s account on a Tuesday—seven weeks after the original suspension—following renewed questions about the bans. Support sent a brief note: “Thank you for the information you have provided. Sorry for the time taken to get back to you. We really do appreciate your patience. Sometimes our abuse detecting systems highlight accounts that need to be manually reviewed. We’ve cleared the restrictions from your account, so you have full access to GitHub again.”

Even so, the damage lingers. In Sauceke’s account and others, including the IllusionMods repository, release files remain hidden. “This makes the releases both inaccessible to users and impossible to migrate to other sites without some tedious work,” Sauceke said.

Accounts may come back. Repositories might be restored. But for many developers, the trust is already gone—and that’s the kind of thing that doesn’t reinstall quite so easily.

GitHub isn’t just another code host—it’s the town square for open-source developers. For adult creators especially, who are used to being quietly shoved to the margins everywhere else, visibility there actually matters. It’s how people find each other, trade ideas, and build something that feels bigger than a solo side project. “It’s the best place to build a community, to find like-minded people who dig your stuff and want to collaborate,” Sauceke said. But if this wave of bans stretches beyond hentai game and toy modders, they warned, it could trigger a slow exodus. Some developers aren’t waiting around to find out, already packing up their repositories and moving them to GitGoon, a platform built specifically for adult developers, or Codeberg, a nonprofit, Berlin-based alternative that runs on a similar model.

Age Verification Push Sends Millions of Britons to Unregulated Porn Sites, Charity Says

Something strange has been happening since age verification checks quietly became part of everyday internet life in the UK. You’d think stricter rules would close doors. Instead, a lot of people seem to be wandering down darker hallways.

New research from the Lucy Faithfull Foundation suggests that nearly 45 per cent of UK porn users have visited websites without age verification checks since the rules came into force last summer under the Online Safety Act.

The poll, which surveyed more than 3,700 people across Britain, found that 39 per cent of those who visited these sites ended up watching content that made them feel uncomfortable. Even more telling, 40 per cent said what they saw was enough to put them off returning altogether.

The charity, which focuses on preventing online child sexual abuse, has warned that these unregulated spaces can quietly increase the risk of people stumbling into harmful — and illegal — material.

Vicky Young, who leads the Stop It Now UK and Ireland anonymous helpline, said these sites can become a dangerous stepping stone toward indecent images of children.

“We work with people who have looked at indecent images of children to try and address that behaviour, to help support them to change their behaviour,” she said.

“One of the things that people say to us frequently is that they started looking at legal adult pornography and that their behaviour escalated. In part they were spending maybe longer online, but also the sort of content that they were looking at became more extreme and often started getting younger, and that’s when they then crossed into illegal behaviour, so looking at indecent images of children.”

That pattern — the slow creep from curiosity to something far more serious — is what worries the charity most.

“Because of that pathway and that coming out constantly in conversations we have with people, it concerns us that if people are accessing sites where there is this riskier content, that actually they are putting themselves at a higher chance of accessing indecent images,” she said.

“Sometimes that might not be intentional in the beginning, but what people tell us is that actually, if they come across those images as part of their other pornography, that then sparks curiosity. There’s something that kind of adds to the excitement around the risk, and they don’t necessarily stop at one image. They actually then start looking for more images.”

The survey also revealed a quieter anxiety bubbling under the surface. Nearly 30 per cent of respondents said they were worried about how much pornography they consume. That concern was highest among young men aged 18 to 24, with more than half admitting it’s something that troubles them — a group the Foundation describes as particularly vulnerable.

At the same time, the rules appear to be forcing a moment of self-reflection for many. Almost 47 per cent said they’ve reduced how much pornography they watch since age checks were introduced, while 55 per cent said the changes made them stop and think about their habits.

Recent data backs that up. Enforcement of highly effective age assurance led to an immediate drop in traffic to major porn sites from late July. But as one door closed, another cracked open: VPN use surged as people looked for ways around the new barriers.

The UK’s most visited adult site recorded a drop of around 1.5 million viewers year on year, falling from 11.3 million in August 2024 to 9.8 million this August.

Meanwhile, VPN usage more than doubled after the rules came in, jumping from roughly 650,000 daily users to a peak of over 1.4 million in mid-August 2025. Although that number has since dipped, it still hovered around 900,000 by November.

In other words, fewer people are walking through the front door — but plenty are still trying the side entrance.

Dr Alexandra Bailey, head of psychology at the Foundation and an associate professor at the University of Roehampton, said the intention behind age verification is sound, but the consequences are more complicated.

“Age verification is vital to protect children, and we fully support it,” she said. “But we also need to recognise that some adults are choosing riskier sites to avoid age checks. These sites can expose people to harmful material, including illegal content depicting child sexual abuse. Even if you’re not looking for it, you could encounter it — and that can have serious life-changing consequences.”

She added that the rules have created a pause many people probably needed, but not everyone is responding in a safe way.

“Age verification is also prompting adults to reflect on their online behaviour, which can be a good thing for people worried about their porn use. But we need to address the risks for those who are turning to sites that avoid the new regulations.

“Every day, our advisors speak to people whose pornography use has spiralled into something much more harmful. We know embarrassment can stop people from reaching out, but confidential help is available. If you’re worried about your own behaviour or someone else’s, contact Stop It Now before it’s too late.”

Sometimes the most dangerous part isn’t the rule itself — it’s what people do when they decide to dodge it.

2025: The Year Tighter Regulation Came to Town for the Online Adult Industry by Morley Safeword

When I got my start in the online sector of the adult entertainment business, back in the mid-nineties, there was no video streaming. Individual photos often took web users several minutes to download. And you hardly heard a peep from anyone suggesting that the fledgling industry needed to be reined in.

To be fair, many people were only vaguely aware of what was available on the internet at the time, let alone worried about what their kids might be looking for on there – and frankly, the web was so slow, using it exceeded the patience of a lot of kids, anyway.

Oh, how things have changed.

What evolved fastest, of course, was the technology underpinning the internet. As high-speed connectivity became the norm rather than the exception and video streaming capabilities increased year over year, online porn went from something enjoyed by a small subset of early adopters to a massive, multibillion dollar industry. Along with those changes in technology came ever louder calls for the online adult industry to be more tightly regulated – or regulated at all, in the still-early-internet days of the mid-nineties.

In the United States, Congress began cooking up proposals to prevent minors from accessing online porn. While these proposals enjoyed broad bipartisan support (within the legislature, at least), what they didn’t get was much support from the courts.

Early attempts to impose things like age verification requirements were slapped down by the courts, most notably in cases like Reno v. ACLU, decided in 1997. In Reno, the Supreme Court held that certain provisions of the Communications Decency Act of 1996 (“CDA”) violated the First Amendment. Specifically, the court found that the CDA’s “indecent transmission” and “patently offensive display” provisions trod upon the freedom of speech protected by the First Amendment.

What changed in 2025, as the Supreme Court again considered an age verification proposal, this time a state law passed in Texas (“HB 1181”), was in part the continued forward march of technology. But more crucially, what changed was the court’s disposition as to which “standard of review” ought to be applied.

In previous cases involving online age verification proposals, the court has applied “strict scrutiny,” a high bar that requires the government to show its actions (and laws) are “narrowly tailored” to further a “compelling government interest” and are the “least restrictive means” to further that interest.

In the case Free Speech Coalition v. Paxton, which the Supreme Court decided in June, the district court had applied strict scrutiny and found that HB 1181 failed to satisfy the standard. When the case reached the Supreme Court, however, the majority held that strict scrutiny was the wrong standard and that the correct one to apply was “intermediate scrutiny,” which sets the bar much lower for the government.

Writing for the majority, Justice Clarence Thomas asserted that HB 1181 has “only an incidental effect on protected speech.”

“The First Amendment leaves undisturbed States’ traditional power to prevent minors from accessing speech that is obscene from their perspective,” Thomas wrote. “That power includes the power to require proof of age before an individual can access such speech. It follows that no person – adult or child – has a First Amendment right to access such speech without first submitting proof of age.”

Since the law “simply requires adults to verify their age before they can access speech that is obscene to children,” Thomas found that HB 1181 “is therefore subject only to intermediate scrutiny, which it readily survives.”

The three justices who dissented from the majority’s position didn’t see things quite the same way, naturally. In her dissent, Justice Elena Kagan criticized the majority’s holding as “confused” and highlighted the ways in which it departed from the court’s previous rulings in similar cases.

“Cases raising that question have reached this Court on no fewer than four prior occasions – and we have given the same answer, consistent with general free speech principles, each and every time,” Kagan observed. “Under those principles, we apply strict scrutiny, a highly rigorous but not fatal form of constitutional review, to laws regulating protected speech based on its content. And laws like H. B. 1181 fit that description: They impede adults from viewing a class of speech protected for them (even though not for children) and defined by its content. So, when we have confronted those laws before, we have always asked the strict scrutiny question: Is the law the least restrictive means of achieving a compelling state interest? There is no reason to change course.”

Whether there was reason to change course or not, surely now the course has been changed. Make no mistake, laws like HB 1181 are here to stay – and they will be followed by other measures designed to restrict access to sexually-explicit materials online, as well as regulation which goes much further and sweeps in an even broader range of controversial content.

The old cliché about the “canary in the coal mine” has often been applied to pornography in the context of free speech discussions. Even those who don’t like or approve of porn have often warned that crackdowns on sexually explicit expression can presage attempts at regulating other forms of speech.

If indeed those of us who work in the adult industry are part of a sentinel species, the warning to our peers in the broader world of entertainment and self-expression could not be more clear, as we look out to 2026 and beyond: Here in the world of online porn canaries, we’re choking on this new regulatory push – and most likely, some of you other birds are going to be feeling short of breath too, soon enough.
