Court Dismisses NCOSE-Supported Cases Targeting Adult Websites Under Kansas AV Law

A federal judge quietly slammed the brakes this week on a pair of lawsuits that were meant to test Kansas’ age-verification law — and in doing so, reminded everyone that the internet doesn’t neatly respect state lines, no matter how badly lawmakers might want it to.

Last year, a conservative anti-pornography group brought lawsuits against four adult websites on behalf of a 14-year-old Kansas resident and her mother. Two of those suits went after Titan Websites, which runs HentaiCity.com, and ICF Technology, which operates Jerkmate.com. The core claim was simple and familiar by now: the teenager allegedly accessed content on the sites without her age being verified.

But Judge Holly Teeter of the U.S. District Court for the District of Kansas wasn’t persuaded. She dismissed both cases outright, pointing to something that often gets lost in the political noise — jurisdiction still matters.

In the case against Titan Websites, Teeter ruled that the plaintiffs failed to show HentaiCity.com had “purposefully directed its activities at Kansas.”

“The contacts between Defendant and the forum were not due to discriminating, intentional conduct that targeted Kansas,” Teeter wrote. “Rather, they were the random, and fortuitous contacts inherent in the operation of an indiscriminate and universally accessible website … This is insufficient to support the exercise of specific personal jurisdiction.”

If you’ve ever run a website, that language probably lands with a thud of recognition. The internet is global by default. You don’t wake up deciding to “target Kansas” unless you’re buying billboards off I-70 or running a very specific ad campaign. Sometimes a click is just a click.

In a statement, the Free Speech Coalition welcomed the decision as a meaningful step forward.

“As the first age verification case filed by a private plaintiff to reach final resolution, the ruling suggests that private plaintiffs can lack personal jurisdiction to sue out-of-state website operators under the Kansas statute,” the statement reads.

The organization’s executive director, Alison Boden, said the ruling offers “critical guidance” for platforms trying to navigate age-verification laws in Kansas and beyond — a legal patchwork that’s getting harder to track by the month.

“While not precedent-setting, nor necessarily applicable in every case, the District Court’s ruling is an important victory against state laws enforced by private rights of action,” said Boden. “In the meantime, the threat of litigation is real, and we encourage our members to continue to comply with all applicable laws.”

That last part matters. This wasn’t a mic-drop that ends the conversation forever. The plaintiffs still have the option to appeal, and the broader legal fight is far from over.

Two other cases backed by the same group are still winding their way through the system. In the lawsuit against Multi Media LLC, which operates Chaturbate.com, the judge granted a motion to compel arbitration and paused the case while that process plays out. In the case against Techpump Solutions, which runs Superporn.com, the court hasn’t yet ruled on a motion to dismiss for lack of jurisdiction.

So yes, one door just closed — but plenty of others remain cracked open. And if there’s a lesson here, it’s this: the battle over age verification isn’t just about who clicks what. It’s about where the law thinks the internet actually lives.

Discord Plans Mandatory Age Verification for All Users in 2026

Something quietly fundamental is about to change on one of the internet’s most familiar hangouts. Discord’s senior leadership confirmed this week that age verification will become mandatory for all users starting in March 2026, alongside a shift to what the company calls “teen-by-default” settings across the entire platform.

The expanded safety rollout, according to the company, is meant to create “a safer and more inclusive experience for users over the age of 13.” On paper, it sounds tidy. In practice, it signals a pretty big cultural shift for a platform that’s long felt like the digital equivalent of a messy, unlocked group chat.

“As part of this update, all new and existing users worldwide will have a teen-appropriate experience by default, with updated communication settings, restricted access to age-gated spaces, and content filtering that preserves the privacy and meaningful connections that define Discord,” the company said. It’s a careful balance they’re trying to strike — safety without sanding off the personality that made people show up in the first place.

“Nowhere is our safety work more important than when it comes to teen users, which is why we are announcing these updates in time for Safer Internet Day,” said Savannah Badalich, Discord’s head of product policy, referencing the February 10 awareness initiative. “Rolling out teen-by-default settings globally builds on Discord’s existing safety architecture, giving teens strong protections while allowing verified adults flexibility.”

Badalich added, “We design our products with teen safety principles at the core and will continue working with safety experts, policymakers, and Discord users to support meaningful, long-term wellbeing for teens on the platform.” It’s the kind of statement you’d expect — earnest, forward-looking, and clearly written with regulators peeking over shoulders.

Under the new “teen-by-default” framework, users will have to go through age-verification steps to access channels and servers labeled as “age-restricted.” That includes spaces run by adult content creators, online sex workers, sexually themed communities, sexual animation hubs, and certain fan communities that live closer to the edges of the platform.

There’s also an unspoken tension here that’s hard to ignore. Discord has been down this road before, and not without bruises.

The platform previously experienced a data breach involving one of its age-verification vendors, exposing sensitive verification materials, including government-issued identification. For users who already feel uneasy about handing over personal documents online, that memory hasn’t exactly faded.

That incident stemmed from mistakes by a customer experience vendor, 5CA, which outsources work to customer service agents in countries including the Philippines. Discord’s primary age-verification partner, K-ID, later stated that it had no involvement in the breach tied to its standard verification systems.

So here we are again — a platform promising better protection, safer defaults, and stronger guardrails, while carrying the weight of past missteps. Maybe this time the systems hold. Maybe trust rebuilds. Or maybe the internet does what it always does and asks the same old question, just in a new tone: how much safety is worth how much control?

The Human Cost of Overregulation by Morley Safeword

Over the decades I’ve worked in the adult entertainment business, it has struck me many times how concerned the industry’s critics appear to be about the welfare of those of us who work in the industry – and how quickly that concern turns to consternation and scorn, should we insist that we’re doing what we do gladly and of our own free will.

“Nonsense,” the critics say, “these poor souls only think they are engaging in this depravity willingly; the truth is they have been brainwashed, coerced, cajoled and manipulated into believing they want to participate in this filth.”

Granted, not a lot of people have spilled ink along these lines to fret over the wellbeing of freelance writers like me. I think we’re counted as being among the exploiters, rather than the exploited, or perhaps as enablers of exploitation. Still, there’s no denying I derive my living, meager though it may be, from adult entertainment, even if all I do is write about it, rather than perform in or film it.

While many of the regulations aimed at the adult industry are couched as attempts to protect minors from the alleged harm of viewing pornography, when these measures are discussed by their proponents, “success” is often defined as making the adult industry retreat from their jurisdiction altogether. If a site like Pornhub blocks visitors from an entire state, including all the adults in that state who are still legally entitled to access the site even under newly established age verification mandates, those who cooked up the laws often describe this development as a sign the law is “working.” As I’ve written before, the chilling effect is a feature of these measures, not a bug.

By the same token, if a new law or regulation makes it harder for adult content creators to make their own movies, distribute their own photos or perform live on webcams, that too is something to be celebrated by the legislators and activists who champion those regulations.

Gone is all thought or discussion of the wellbeing of adult content creators and performers, once the potential cause of harm is the law itself. This holds true of purported “anti-trafficking” statutes. While sex workers themselves largely oppose measures like FOSTA/SESTA and say the law has made them less safe, not more, the proponents and sponsors of such legislation don’t want to hear it. Yes, these paternalistic politicos and crusading critics will protect these wayward adults from themselves, even if it kills them.

I can only imagine that if a state legislator from any of the dozens of states that have passed age verification requirements were to learn that adult content creators (and the platforms that host their work) are having a harder time earning a living under these new regulatory schemes, their response would be brief and callous: “Good,” they’d probably say, “now they can go out and look for more respectable work!”

And what happens when former porn performers do find work in other fields? The stigma of porn follows them. They get fired. They are told their mere presence in a classroom is disruptive. They are hounded on social media. They are treated like pariahs by the very people who supposedly care about their welfare.

A law or regulation can be well-intended and still do harm. I don’t doubt some of the politicians involved in crafting age-verification laws and other purportedly protective regulations believe they are doing things in the best interests of both minors and the adults who work in porn, or in the broader world of sex work. But it’s hard to believe they truly care about the latter two when there’s so little thought given to the potential negative impact on them during the crafting of these laws.

As more states toy with the idea of establishing a “porn tax,” will any of them pause to consider the impact on the human beings targeted by such taxes? I’d strongly advise not trying to hold your breath while waiting for that manner of concern to be expressed.

Virginia Lawmakers Hit Pause on Proposed ‘Porn Tax’ Until 2027

It stalled quietly, almost anticlimactically — a pause button hit on a bill that was supposed to make noise. A Virginia House of Delegates subcommittee voted Monday to push off, until next year, a proposal that would slap a 10% tax on the gross receipts of adult websites doing business in the state.

The bill, HB 720, would apply that tax to revenue from adult sites “produced, sold, filmed, generated, or otherwise based” in Virginia — a definition broad enough to make even seasoned tax attorneys squint and reread it twice.

At a Finance subcommittee hearing, the bill’s sponsor, Delegate Eric Zehr, framed the measure as something more than just another line item in the tax code. While most proposed tax hikes, he said, tend to discourage businesses that “contribute in a positive way without societal detriment,” he argued that commercial adult sites fall into a different category altogether.

“They contribute to the mental health crisis straining our behavioral health system,” Zehr said. “Their profit is our loss.”

Under the proposal, money raised by the tax would flow into Virginia’s Behavioral Health and Developmental Services Trust Fund — a pool created to support care and treatment for people relying on public mental health, developmental, and substance abuse services.

“The purveyors of this content are profiting while the rest of us are paying,” Zehr argued. “Those profiting at the expense of our children need to pitch in.”

He also pointed to Virginia’s age verification law, passed in 2023, praising it as a step forward — but not nearly a final one. In his view, the law simply hasn’t gone far enough.

“This legislation would disincentivize these providers from further damaging our children’s mental health and development, and simultaneously help promote their mental health by increasing the income going into the Behavioral Health and Developmental Services Trust Fund,” Zehr told the subcommittee. “This isn’t simply an ideological attack. There is a direct connection, a direct line between this and what we’re paying for in the mental health system.”

Others in the room weren’t convinced. Subcommittee members raised red flags about constitutionality — concerns that legislative counsel tied to potential conflicts with freedom of speech — and about how the bill would work in practice, especially given the global, borderless nature of the adult industry.

Delegate Vivian Watts put it bluntly: “Trying to determine how we could enforce this, particularly as a tax matter, would be extraordinarily complicated.”

In the end, the subcommittee voted unanimously to carry the bill over until the 2027 legislative session, effectively shelving it for now.

As the gavel came down, Subcommittee Chair Phil Hernandez offered Zehr a parting note that sounded less like a rejection and more like a long pause: “We want to give you a chance to keep working on this.”

Another State, Same Playbook: Virginia Eyes Tax on Adult Content

Something about Virginia lawmakers circling adult websites with a calculator in hand feels oddly familiar. Down in Richmond, the House of Delegates is weighing a bill that would slap a 10% tax on the gross receipts of adult websites doing business in the state—and you can almost hear the gears turning as it lands on the docket.

House member Eric Zehr’s HB 720 proposes a new 10% levy on the gross receipts of “any commercial entity operating an adult website for all sales, distributions, memberships, subscriptions, performances, and other content amounting to material harmful to minors that is produced, sold, filmed, generated, or otherwise based in the Commonwealth.” It’s the kind of language that sprawls across the page, dense and deliberate, like it wants to leave as little wiggle room as possible.

If that wording gives you déjà vu, you’re not imagining it. In what could be an early signal of a new “copycat” trend—eerily reminiscent of the wave of state age-verification bills that followed Louisiana’s 2022 AV law—the Virginia proposal mirrors language in a bill currently pending in Utah, which calls for a 7% tax on adult sites.

Virginia wouldn’t be breaking new ground here. Alabama imposed a similar 10% tax last year, and since then, state senators in Pennsylvania have openly kicked around the idea of doing the same, pointing to what they describe as “successful approaches in other jurisdictions.” That phrase has a way of traveling fast once it enters the legislative bloodstream.

The concept has even been turned up to eleven elsewhere. Most recently, a candidate seeking the Republican gubernatorial nomination in Florida grabbed headlines by proposing a 50% “sin tax” on the earnings of OnlyFans models living in the Sunshine State—an eye-popping number that felt designed as much for attention as for policy debate.

In Virginia’s case, supporters say the money wouldn’t just disappear into the general fund. Revenue from the proposed tax would be directed to the state’s Behavioral Health and Developmental Services Trust Fund, which supports care and treatment for individuals receiving public mental health, developmental, and substance abuse services.

That earmark isn’t unique, either. It echoes a similar directive in the Utah bill, reinforcing the sense that these proposals aren’t just inspired by one another—they’re following a template, almost step for step, as if the next version is already being drafted somewhere just offstage.

Arcom Moves to Block or Delist Adult Sites Over Age-Verification Failures

PARIS — There’s a particular kind of chill that runs through an industry when the letters stop being polite reminders and start sounding like countdown clocks.

Earlier this month, France’s digital watchdog Arcom quietly moved from warning shots to something sharper. In a statement released Tuesday, the agency confirmed that, at the beginning of December 2025, it sent enforcement letters to three adult websites it believes were ignoring the country’s age-verification requirements under the Security and Regulation of the Digital Space (SREN) law.

A few weeks passed. Enough time to fix things. Enough time to at least try. Two of the sites didn’t.

So now the tone has changed. Arcom has issued formal notices to those two operators, giving them 15 days to comply with the law or risk being blocked and/or delisted entirely. Fifteen days isn’t much time in tech, but it’s a lifetime in regulatory terms. It’s the kind of deadline that makes inboxes sweat.

The third site isn’t off the hook either. Arcom says it plans to work directly with that operator to evaluate whether its age-verification solution actually does what it claims to do. Not just ticking a box, but functioning in the real world, where friction, privacy, and compliance collide.

Notably, the agency didn’t name the websites involved or disclose where they’re based. That silence feels intentional. This isn’t about shaming specific players; it’s about setting a precedent. The statement frames the move as part of Arcom’s already-telegraphed plan to widen enforcement beyond the biggest platforms and start pulling smaller adult sites into the compliance spotlight.

It’s a reminder that flying under the radar isn’t a strategy anymore. The radar got better.

Click Here to Keep Clicking Here, So You Can Click There (Eventually) by Stan Q. Brick

I was on the lookout for something to write about. “I know,” I thought, “I’ll see what the latest news is to come out of Ofcom, the UK’s regulatory authority for broadcast, internet, telecommunications and postal services!”

In the old days, days I remember with great fondness, I could have just typed Ofcom.org.uk into the nav bar on my browser and I’d be there, reading the latest from Ofcom. Not anymore – because now, even to read a drab, dull regulatory agency’s website, I must first satisfy a machine’s demand that I prove I’m human.

No big deal. Just a simple captcha test (one probably easily defeated by a sophisticated enough bot, tbh) and I’m on my way… sort of. Which is to say I would be on my way, except now I must read a disclosure about cookies, perhaps adjust some settings and then “accept” or “save” or “surrender to” those preferences, or whatever the verbiage might be.

This is using the internet now, apparently. Instead of “surfing” and the freedom of movement that term suggests, it’s more like navigating a joyless obstacle course, in which I’m required to verify my age and/or my very humanity as I hop from step to step.

I’m sure this seems to many people like an overstated complaint. “So what?” they might say. Why is it a big deal to verify minor details like your age, or to have your internet path blocked in one way or another, based largely on where you live and where the site you’re accessing is located?

People used to call the internet the “information superhighway.” While this was an admittedly irritating buzz phrase, the term did at least capture the sense that the internet was something largely unfettered, where data, entertainment, information, misinformation and all manner of expressive content was available to all those able to access it.

Now, despite the fact I’ve been an adult for nearly 40 years, every time I turn around while online, I’m being asked to verify the fact of my adulthood anew. (Yes, I do visit a lot of porn sites; it sort of comes with the territory of – you know – working in and writing about the online porn industry.)

I understand a lot of people are hot to make the internet “safer,” but to me, this desire betrays an ignorance of what the internet is – or if not an ignorance of its nature, a stubborn desire to transform the internet to something else. But the internet, whatever else it might be, is a massive computer network about which the best thing has always been the worst thing, as well: Virtually anyone can publish virtually anything on it.

Slap as many age gates and regulations as you’d like on a massive, global computer network; you’re still just engaging in an endless game of whack-a-mole. Ofcom themselves reported that after the requirement that adult sites employ “Highly Effective Age Assurance” (HEAA) methods took effect, VPN usage in the UK more than doubled, “rising from about 650k daily users before 25 July 2025 and peaking at over 1.4m in mid-August 2025.”

Ofcom is undeterred by numbers like these, of course. Their inevitable answer will be to impose restrictions on VPN use. Because like any government regulatory agency, if there’s one thing Ofcom will not be able to tolerate, it’s the sense that they can’t control what falls within their remit to tame.

Speaking of Ofcom, when I did finally satisfy their system that I’m a human who doesn’t want to spend a lot of time deciding which cookies he does and doesn’t want attaching to his browser, what I found was an explanation of – and almost an apology for – the upper limit of the agency’s regulatory reach with respect to AI chatbots.

After stating with apparent pride that Ofcom was “one of the first regulators in the world to act on concerning reports of the Grok AI chatbot account on X being used to create and share demeaning sexual deepfakes of real people,” Ofcom goes on to explain that “not all AI chatbots are regulated” by the agency.

“Broadly, the Online Safety Act regulates user-to-user services, search services and services that publish pornographic content,” Ofcom explained. (They don’t say so, but just for your edification, this limited scope is due to sexually explicit depictions being awful, youth-corrupting and inherently sinister, while depictions of people getting shot in the head or beaten bloody with lead pipes are fine.)

On the other hand, “AI chatbots are not subject to regulation if they… only allow people to interact with the chatbot itself and no other users (i.e. they are not user-to-user services); do not search multiple websites or databases when giving responses to users (i.e. are not search services); and cannot generate pornographic content.”

Ofcom ends its notice with a how-to guide on reporting anything you find online “that you think might be harmful or illegal.”

I’d try reporting Ofcom’s website itself for harmful content, because I sure feel like I’m getting dumber just by reading it… but I suspect to execute this vengeful little practical joke, I’d have to pass at least three captcha tests, verify my age seven times and produce some manner of HCPN (“Highly Compelling Proof of Netizenship”).

You know what? I think I’ll just read a book. So far as I’m aware, I’m not required to present ID to grab an old tome off the shelves in my study… yet.

FTC Inches Closer to a New ‘Click to Cancel’ Subscription Rule

It always seems to start the same way: you notice a charge you don’t recognize, scroll through your bank app, and realize—again—that you’re paying for something you thought you canceled months ago. That frustration is the backdrop as the Federal Trade Commission once more steps into the messy, bureaucratic maze of subscription rules, trying to revive its long-stalled effort to rein in negative option plans after a federal court knocked down its last attempt.

In a statement released Friday, the FTC said it has submitted a draft Advance Notice of Proposed Rulemaking, or ANPRM, on its Negative Option Rule to the Office of Information and Regulatory Affairs. OIRA, which sits inside the Office of Management and Budget, now gets to scrutinize the proposal before the FTC can publish it in the Federal Register. Only then does the public get a say—one more round of comments, one more chance for consumers to vent about subscriptions that refuse to die.

The commission’s vote to approve sending the draft to OIRA was unanimous, though that unanimity comes with an asterisk. The FTC currently has only two sitting commissioners, leaving three seats empty. One of those two, Chairman Andrew N. Ferguson, had previously voted against the updated Negative Option Rule when it narrowly passed in October 2024. It’s a strange kind of consensus, the sort you get when the room is half-empty.

That earlier rule didn’t survive long anyway. The U.S. Court of Appeals for the 8th Circuit vacated it, siding with critics who argued the agency overstepped its authority and skipped required procedural steps by failing to issue a preliminary regulatory analysis. In regulatory terms, it was less a slap on the wrist and more a reminder that process still matters—even when intentions are good.

Back in December 2025, the FTC also posted a petition for rulemaking from the Consumer Federation of America and the American Economic Liberties Project. The public comment window on that petition closed Jan. 2, quietly adding another layer of pressure and paperwork to an already complicated path forward.

The Negative Option Rule itself isn’t new. It dates back to the 1970s, born in an era of mail-order clubs and surprise shipments, designed to stop consumers from being signed up—and billed—without clear consent. The 2024 amendments would have dramatically expanded its reach, covering nearly all negative option programs, from auto-renewing subscriptions to “free trial” offers that quietly flip into paid plans. For many websites, that would have meant rethinking how sign-ups work and, more importantly, how easy it is to cancel.

Now, with the process restarted yet again, the FTC could circle back with the same ideas, or something close to them. Whether this time leads to real change—or just another loop through regulatory limbo—remains the open question hanging over every “Cancel subscription” button that somehow never quite does what it promises.

Conservative Push for Porn Taxes Sparks Constitutional Backlash

It feels like the walls are closing in a little more every week. As age-verification laws continue to reshape—and in some cases dismantle—the adult industry, a Utah lawmaker has now stepped forward with a bill that would slap a new tax on porn sites operating in the state. It’s the kind of proposal that makes you pause, reread the headline, and wonder how we got here so fast.

Introduced by Republican state senator Calvin Musselman, the bill would impose a 7 percent tax on total receipts “from sales, distributions, memberships, subscriptions, performances, and content amounting to material harmful to minors that is produced, sold, filmed, generated, or otherwise based” in Utah. If passed, it would take effect in May and require adult sites to pay an additional $500 annual fee to the State Tax Commission. According to the legislation, revenue from the tax would be directed to Utah’s Department of Health and Human Services to expand mental health support for teens.

A new strain of American conservatism is asserting itself more boldly, and lawmakers across the US are calling for tighter restrictions on adult content. In September, Alabama became the first state to introduce a porn tax—10 percent on adult entertainment companies—after passing age-verification mandates that require users to upload ID or other personal documentation before accessing explicit material.

Pennsylvania lawmakers are also exploring a proposal that would tack an extra 10 percent tax onto subscriptions and one-time purchases from online adult platforms, despite already charging a 6 percent sales and use tax on digital products, two state senators wrote in an October memo.

Other states have flirted with similar ideas before. In 2019, Arizona state senator Gail Griffin, a Republican, proposed taxing adult content distributors to help fund a border wall during Donald Trump’s first term. To date, 25 US states have enacted some form of age verification.


Efforts to criminalize sex workers and regulate the industry have been unfolding for years, accelerating alongside increased online surveillance and censorship. Yet targeted taxes have repeatedly stalled, in part because the legality of such measures remains deeply contested.

“This kind of porn tax is blatantly unconstitutional,” says Evelyn Douek, an associate professor of law at Stanford Law School. “It singles out a particular type of protected speech for disfavored treatment, purely because the legislature doesn’t like it—that’s exactly what the First Amendment is designed to protect against. Utah may not like porn, but as the Supreme Court affirmed only last year, adults have a fully protected right to access it.”

Utah, Alabama, and Pennsylvania are among 16 states that have adopted resolutions declaring pornography a public health crisis. “We realize this is a bold assertion not everyone will agree on, but it’s the full-fledged truth,” Utah governor Gary Herbert tweeted in 2016 after signing the resolution. Utah’s early response to the spread of adult content dates back to 2001, when it became the first state to establish an office focused on sexually explicit material by appointing an obscenity and pornography complaints ombudsman. The role—often referred to as the “porn czar”—was eliminated in 2003.

“Age restriction is a very complex subject that brings with it data privacy concerns and the potential for uneven and inconsistent application for different digital platforms,” Alex Kekesi, vice president of brand and community at Pornhub, said previously. In November, the company urged Google, Microsoft, and Apple to adopt device-based verification across app stores and operating systems. “We have seen several states and countries try to impose platform-level age verification requirements, and they have all failed to adequately protect children.” To comply with existing mandates, Pornhub has blocked access to users in 23 states.

Critics argue that age verification has never truly been about protecting children, but about quietly scrubbing porn from the internet. In 2024, a leaked video from the Centre for Climate Reporting showed Russell Vought, a Trump ally and Project 2025 coauthor, describing age-verification laws as a “back door” to a federal porn ban.

Platforms like OnlyFans and Pornhub have pushed sex work further into the mainstream, but they’ve also made it easier to monitor and police both performers and audiences. As states consider new taxes and penalties, it’s creators who are most likely to absorb the shock.

The cultural conservatism taking shape under Trump 2.0 is driven by a desire to punish sexual expression, says Mike Stabile, director of public policy at the Free Speech Coalition, the US adult industry’s trade association. “When we talk about free speech, we generally mean the freedom to speak, the ability to speak freely without government interference. But in this case, free also means not having to pay for the right to do so. A government tax on speech limits that right to those who can afford it.”

OnlyFans says it complies with all tax requirements in the jurisdictions where it operates, while creators remain responsible for their own tax obligations. Pornhub, which is currently blocked in Utah and Alabama, did not respond to a request for comment.

Douek points out that while states can regulate minors’ access to explicit material following the Supreme Court’s decision upholding Texas’ age-verification law, “a porn tax does nothing to limit minors’ access to this speech—it simply makes it more expensive to provide this content to adults.” A 2022 report from Common Sense Media found that 73 percent of teens aged 13 to 17 had viewed adult content online. Today, much of that exposure happens through social media platforms like X and Snap. A recent survey from the UK’s Office of the Children’s Commissioner found that 59 percent of minors encounter porn accidentally—up from 38 percent the year before—mostly via social feeds.

In Alabama, as would be the case in Utah, revenue from the porn tax is earmarked for behavioral health services, including prevention, treatment, and recovery programs for young people.

Last year, Alabama state representative Ben Robbins, the Republican sponsor of the bill, said adult content was “a driver in causing mental health issues” in the state. It’s a familiar claim among lawmakers advocating for a nationwide porn ban. While some studies suggest adolescent exposure to porn may correlate with depression, low self-esteem, or normalized violence, medical experts have never reached a clear consensus.

As lawmakers increasingly frame the issue around harm to minors, Stabile says it’s crucial to remember that adult content is not a special category outside the bounds of free expression. Courts have repeatedly struck down content-specific taxes as unconstitutional censorship.

“What if a state decided that Covid misinformation was straining state health resources and taxed newsletters who promoted it? What if the federal government decided to require a costly license to start a podcast? What if a state decided to tax a certain newspaper it didn’t like?” he says. “Porn isn’t some magical category of speech separate from movies, streaming services, or other forms of entertainment. Adult businesses already pay taxes on the income they earn, just as every other business does. Taxing them because of imagined harms isn’t just dangerous for our industry—it’s a dangerous expansion of government power.”


‘An Embarrassment’: Critics Slam UK’s Proposed VPN Age Checks

It started the way these things always seem to start lately—with a vote that felt small on paper and enormous everywhere else. Politicians, technologists, and civil society groups reacted with visible dismay after the House of Lords backed a move that would ban children from using VPNs and force providers to roll out age verification.

The backlash was swift. Wikipedia co-founder Jimmy Wales blasted the decision on X, calling the UK’s position an embarrassment. Windscribe CEO Yegor Sak had already summed up the idea as the “dumbest possible fix,” warning that forcing age checks on VPNs would set a deeply troubling precedent for digital privacy.

By Tuesday morning, the argument had spilled fully into the open. Online debate surged, with X logging more than 20,000 posts on the issue in just 24 hours—one of those moments where you can almost hear the internet arguing with itself.

Labour, Lords & VPN laws

Last week, the House of Lords voted in favor of an amendment to the Children’s Wellbeing and Schools Bill that would, in effect, bar anyone under 18 from using VPNs.

The proposal would require commercial VPN providers to deploy mandatory age assurance technology, specifically to stop minors from using VPNs to bypass online safety measures. It sounds tidy in theory. In reality, it opens a can of worms no one seems eager to fully acknowledge.

Notably, the government itself opposed the amendment. Instead, it has opened a three-month consultation on children’s social media use, which includes a broader look at VPNs and how—or whether—they should be addressed.

Political pushback

Even though the House of Lords has shown its hand, the proposal now heads to the House of Commons, where it’s expected to hit serious resistance from the Labour government.

If the Commons throws it out, as many expect, the Lords will have to decide whether to dig in and trigger a round of parliamentary “ping-pong” or quietly step aside.

Labour’s Lord Knight of Weymouth, who voted against the amendment, suggested there’s little appetite for a drawn-out fight. He told TechRadar that it’s unlikely politicians will “die in a ditch” over banning VPNs.

In his view, many lawmakers are chasing “something iconic” on child safety—something headline-friendly—rather than wading into the technical swamp that regulating VPNs would require.

That said, Knight didn’t dismiss the broader concern. He argued that regulator Ofcom “needs to do better” at enforcing existing safety laws and agreed that more should be done to protect children online, provided it’s handled “carefully.” That word—carefully—does a lot of work here.

Civil society’s response

Regardless of whether this particular amendment survives, one thing is clear: VPNs are under a brighter spotlight than ever, and not just in the UK.

In the United States, lawmakers in Wisconsin are pushing a bill that would require adult websites to block access from users connected via a VPN. In Michigan, legislators have floated ideas around ISP-level blocking of circumvention tools. Different routes, same destination.

Evan Greer, director of the US-based group Fight for the Future, warned that policies designed to discourage or ban VPN use “will put human rights activists, journalists, abuse survivors and other vulnerable people in immediate danger.”

Fight for the Future is running a campaign that lets users contact lawmakers directly, arguing in an open letter that the ability to use the internet safely and privately is a fundamental human right.

Back in the UK, a public petition is urging the government to reject any plan that would effectively ban VPNs for children.

The Open Rights Group has also been vocal, pointing out that detecting or banning VPN use isn’t realistically possible without resorting to what it calls an “extreme level of digital authoritarianism.”

