Section 230

Florida Lawmaker Introduces New Bill to Repeal Section 230

It starts with that familiar little jolt in the gut — the kind you get when a political idea lands a little too close to home. Rep. Jimmy Patronis of Florida has become the latest member of Congress to float legislation that would repeal Section 230 of the Communications Decency Act, the rule that shields interactive computer services — including adult platforms — from being held responsible for user-generated content. One of those moments where you pause mid-scroll and think, Oh… this could get interesting. Or messy. Or both.

Patronis introduced HR 7045 in the House of Representatives earlier this week, slipping it into the legislative bloodstream where big ideas tend to either explode or quietly mutate over time. Sometimes you can almost hear the gears grinding behind the scenes.

A statement posted on his website declared, “For too long, the law has prevented parties harmed by online content from obtaining relief. Instead of protecting our younger generations from sensitive content, these sites prioritize profit over safety while continuing to push out harmful, explicit, and dangerous materials without any accountability.” Strong words, the kind that land heavy and don’t really leave much room for nuance.

Would-be reformers on both sides of the aisle have been taking swings at “Big Tech” for years now, accusing platforms of profiting off illegal and harmful content while hiding behind legal shields. The idea is to force companies to moderate more aggressively by making them legally responsible for what users post. Meanwhile, right-wing critics argue the same rule lets platforms censor conservative voices, and they want limits placed on how much moderation power these companies can wield. It’s like watching two very different fires being fueled by the same match.

Back in December, two other repeal bills were already making their way through Congress: HR 6746, the Sunset to Reform Section 230 Act, which would amend the law by simply adding, “This section shall have no force or effect after December 31, 2026,” and S 3546, which calls for a full repeal of Section 230 two years after enactment. The clock imagery alone makes you feel like something is quietly counting down in the background.

Industry attorneys and advocates, though, have been sounding alarms. They worry that once lawmakers start tinkering with Section 230, it opens the door to a patchwork of carve-outs — the kind that slowly chip away at protections, much like what happened with FOSTA/SESTA, whose carve-outs stripped Section 230 protection from sites that “unlawfully promote and facilitate” prostitution or sex trafficking. It’s rarely just one small change, is it? It’s the domino effect.

A carve-out aimed at — or even loosely touching — the adult industry would effectively gut Section 230 for those platforms. That would suddenly make sites hosting user-generated content legally responsible for what users upload, inviting a flood of civil lawsuits and uncertainty. And once that door cracks open, it’s hard not to wonder how wide it eventually swings.

Florida Candidate Floats Plan for 50% Tax on OnlyFans Earnings

Fifty percent. Just saying it out loud makes your eyebrows climb a little. Half of someone’s income — gone in the name of morality, politics, and a campaign soundbite. That’s the idea a Florida gubernatorial candidate tossed into the air this week, announcing plans to slap a 50 percent “sin” tax on OnlyFans creators if he wins in November.

James Fishback, an investor and a member of the far-right wing of the Republican Party, floated the proposal during an interview on a right-wing podcast called NXR Studios, released earlier this week. It wasn’t exactly whispered. It landed with the kind of confidence that suggests he wanted people arguing about it before the episode even finished buffering.

During the interview, Fishback framed the tax as punishment for what he called the “sin” of being an online sex worker, arguing that content creators on OnlyFans should be paying a special levy for existing in that space.

“As Florida governor, in year one, I would push for the first-of-its-kind OnlyFans sin tax,” Fishback said during the interview. “If you are a so-called OnlyFans creator in Florida, you are going to pay 50 percent to the state on whatever you so-called earn via that online degeneracy platform.”

He went on to say that part of the revenue from the proposed tax would be used to fund a “czar” for men’s mental health. Fishback explained, “Men have been told for far too long that they are guilty of masculinity, that they are guilty of all of society’s ills.” It’s one of those lines that sounds designed to travel fast on social media, whether people cheer or cringe.

Additional funds, he said, would also support education and religious nonprofit “crisis pregnancy centers” that aim to discourage women from having abortions. The list of beneficiaries reads like a snapshot of his political priorities.

In what felt very much like a social media provocation, Fishback later posted about the proposal on X, reposting another user who highlighted the idea of an OnlyFans tax. He didn’t stop there.

He tagged popular Miami-based OnlyFans creator Sophie Rain and wrote, “Hey Sophie Rain, Pay up or quit OnlyFans. As Florida Governor, I will not allow a generation of smart and capable young women to sell their bodies online.” Rain, known for her viral presence and softcore content, fired back shortly after.

Irish Lawmakers Consider Tougher Age Verification Rules

There’s a certain tension in rooms like this — the kind where lawmakers sit around long tables, coffee going cold, talking about the internet as if it were a living thing that keeps slipping out of their hands. That mood hovered in the air Wednesday as Ireland’s Joint Committee on Arts, Media, Communications, Culture and Sport gathered to wrestle with online platform regulation and digital safety, including renewed calls for tougher age verification rules for adult sites.

In a written statement submitted ahead of the session, Detective Chief Superintendent Barry Walsh, who leads the Garda National Cyber Crime Bureau, didn’t mince words about what he described as “the widespread and unrestricted availability of pornography.”

“As an overarching observation, it is difficult to understand why robust age verification is not yet a standard operating procedure in respect of any platform where pornography or other child inappropriate content is either readily accessible or where there is a realistic danger that it could be accessed,” Walsh argued. “This would appear to represent a very simple, yet robust, safeguard.”

He went further, pointing to what he called “very extreme pornography that is serving to corrupt teenage males in particular into regarding this as normal, acceptable sexual behavior to be expressed in practice” — language that mirrors recent arguments from U.K. lawmakers who’ve pushed to outlaw depictions of “choking” in adult content. It’s the kind of claim that lands heavy in a room, even if everyone hears it differently depending on where they’re standing.

Ireland already has an Online Safety Code, which took effect in July 2025 and includes a requirement that adult sites headquartered in the country implement age assurance measures. On paper, at least, the framework is already there.

Sites based outside Ireland aren’t off the hook either. They fall under the European Union’s Digital Services Act (DSA), with digital service coordinators across member states working together to enforce rules, including age assurance obligations. It’s a bit like a cross-border relay race — everyone’s running, but the baton keeps moving.

The European Commission has already launched formal proceedings against several adult sites for suspected DSA breaches, though the spotlight has largely been on higher-traffic platforms so far.

In December, however, a representative from Irish media regulator Coimisiún na Meán told legislators that regulators across the EU were preparing to widen enforcement to include smaller adult sites as well. Some Irish lawmakers reportedly pushed for even tougher age verification laws, pointing to France’s Law Aiming to Secure and Regulate the Digital Space (SREN) as a possible template.

That sentiment surfaced again during Wednesday’s meeting when Senator Rónán Mullen doubled down, telling fellow committee members: “Strict age verification is necessary to shield children from the harmful effects of pornography.”

“That’s the only thing that has worked,” Mullen said. “Look at certain states in the USA where, with bipartisan support, pornography providers have gone offline because they were civilly or criminally liable if they did not ensure strict age verification.”

His argument echoes a familiar line from U.S. lawmakers who often point to site withdrawals as proof that these laws are “effective” or “working” — a claim that quietly assumes the real goal is stopping everyone from accessing adult content, not just minors. It’s one of those uncomfortable subtexts nobody quite says out loud, but you can feel it humming under the conversation.

Not everyone in the room was convinced the strategy holds up in the real world. Deputy Peter Cleere raised the practical problem of virtual private networks, which can make age verification systems easy to sidestep.

“It makes a mockery of all the regulations we want to put in place,” Cleere said. “We’re going in circles.”

And maybe that’s the lingering question — are these policies actually building safer digital spaces, or are they just drawing tighter circles around a problem that keeps slipping through the cracks?

Ofcom Sets New Rules for Where Adult Sites Must Place Age Checks

There’s something oddly symbolic about a digital front door. You don’t see what’s inside yet. You pause. You decide whether you’re allowed in. That’s essentially the vision being pushed now, as U.K. regulator Ofcom laid out new guidance on where and how adult sites should place age checks under the Online Safety Act.

In a statement posted on its website, the agency noted that while adult sites have been experimenting with different ways to position age checks, Ofcom’s preferred method is a so-called “front gate,” where users encounter “a blank landing page, with no content visible until they have completed the age check.”

The agency considers this to be the safest and most compliant way to handle age gating — basically, no peeking through the curtains before you prove you’re old enough to be there.

Other approaches to age assurance are still allowed, the agency added, as long as they truly prevent children from seeing pornography before hitting the age check. For instance:

Blur gate. If a site opts to use a “blur gate,” where users see only blurred images before the age check, the blurring must be “sufficiently strong and across enough content to ensure that the content is not pornographic.” Not a polite fog — more like a real visual wall.

Image gate. A site using an “image gate,” where users can see clearly visible images but must click a thumbnail to reach an age check, has to ensure that the images and preview videos shown beforehand are not pornographic. In other words, curiosity can’t be rewarded too generously.

In-video gate. A site using an “in-video gate,” where users see thumbnails with clearly visible images and may even watch parts of videos before being sent to an age check, would need to make sure that any video content available before the check is not pornographic. That gray zone? It’s getting a lot smaller.
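
To make the distinctions concrete, here is a minimal sketch of the taxonomy in code. The gate names mirror Ofcom’s guidance above; the function, the placeholder page strings, and the idea of a single “render” decision point are illustrative assumptions, not anything Ofcom prescribes.

```python
# Illustrative sketch of Ofcom's age-gate placement taxonomy.
# This is not a real compliance implementation.

def pre_check_content(gate_type: str, age_verified: bool) -> str:
    """Return what a visitor may see before completing the age check."""
    if age_verified:
        return "full site content"

    if gate_type == "front":
        # Ofcom's preferred method: a blank landing page, nothing visible.
        return "blank page + age-check prompt"
    if gate_type == "blur":
        # Blurring must be strong enough, and cover enough content, that
        # nothing pornographic is discernible.
        return "strongly blurred thumbnails + age-check prompt"
    if gate_type == "image":
        # Clear thumbnails are allowed only if the images and preview
        # videos shown are not themselves pornographic.
        return "non-pornographic thumbnails + age-check prompt"
    if gate_type == "in-video":
        # Any video playable before the check must be non-pornographic.
        return "non-pornographic previews + age-check prompt"
    raise ValueError(f"unknown gate type: {gate_type}")
```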

Ofcom’s statement also makes it clear that content doesn’t have to show nudity or explicit sex acts to qualify as pornographic. Sometimes the tone, framing, or intent tells the real story — anyone who’s ever scrolled the internet for more than five minutes knows how slippery that line can be.

“When deciding what content is suitable to include before an age gate, services should refer to our guidance on pornographic content, considering the wider context of their site and whether the images or videos are accompanied by sexually explicit language in titles, for example … What matters is whether it is reasonable to assume the content was produced principally for the purpose of sexual arousal,” the statement clarifies.

The statement concludes, “We will continue to engage with and monitor the adult sector to identify and address non-compliance, including whether services’ placement of the age check is compliant.”

Which really translates to this: the front door is being watched, the rules are tightening, and the era of half-open windows and blurred excuses is quietly coming to an end.

Utah’s “Porn Tax”: A Levy on Paper and Ink for the Internet Age

By Morley Safeword

Back in the early 1970s, the Minnesota legislature altered the state’s sales tax to create a “use tax” on the cost of paper and ink consumed in producing publications, while exempting the first $100,000 worth of such materials in any calendar year.

In part due to that exemption, the use tax clearly was directed at the state’s larger periodicals, including the Minneapolis Star Tribune. The Star Tribune wasn’t simply one of eleven publications incurring tax liability under the statute in the early 1970s; of the $893,355 in total tax revenue collected under the statute in 1974, the Star Tribune paid $608,634 — roughly two-thirds of the total.
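
The arithmetic behind that “roughly two-thirds” is easy to verify:

```python
star_tribune = 608_634     # Star Tribune's 1974 liability, in dollars
total_collected = 893_355  # all revenue raised by the use tax that year
print(f"{star_tribune / total_collected:.1%}")  # 68.1%, roughly two-thirds
```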

The Star Tribune sued the Minnesota Commissioner of Revenue, alleging that the state’s tax scheme violated the First Amendment. The resulting case, Minneapolis Star & Tribune Co. v. Minnesota Commissioner of Revenue, was decided by the U.S. Supreme Court in 1983.

The Supreme Court ruled in favor of the newspaper, holding that the “main interest asserted by Minnesota in this case is the raising of revenue” and that while this interest was “critical to any government,” it wasn’t, by itself, enough for the law to survive scrutiny under the First Amendment.

“Standing alone, however, it cannot justify the special treatment of the press, for an alternative means of achieving the same interest without raising concerns under the First Amendment is clearly available,” Justice Sandra Day O’Connor wrote for the court, “the State could raise the revenue by taxing businesses generally, avoiding the censorial threat implicit in a tax that singles out the press.”

I’ve been thinking a lot about the Star Tribune case since first reading about SB 73, a new bill in Utah proposed by State Senator Calvin R. Musselman. There was a time when the 1983 decision would have given me confidence that Musselman’s bill wouldn’t survive court scrutiny, assuming it becomes law in the state. After the Supreme Court’s decision last summer in Free Speech Coalition v. Paxton, however, I’m a lot less certain.

Granted, there’s not a perfect analogy between the Minnesota law at issue in the Star Tribune case and the bill proposed in Utah, nor between the 1983 case and the Paxton case. But what has me feeling uneasy is the readiness of the current Supreme Court to throw aside precedent and impose a lower standard of review in cases where material alleged to be “harmful to minors” is at issue.

It’s not just reasonably well-informed laymen like me who are uncertain, either. In recent adult industry media coverage about the Utah bill, litigators who are experts in the First Amendment were divided in their analysis, as well.

Speaking to XBIZ, attorney Larry Walters of FirstAmendment.com pointed out that in one recent case, “the Georgia Supreme Court upheld a 1% gross revenue tax on adult entertainment establishments in the face of a constitutional challenge.”

“But the court reasoned that the tax was justified not based on the entertainment content produced by the businesses, but on the alleged ‘adverse secondary effects’ of physical adult establishments, such as prostitution and child exploitation,” Walters noted, adding that there are “no recognized adverse secondary effects of online adult entertainment businesses. Accordingly, the Utah bill, if adopted, could be subject to a constitutional challenge as a violation of the First Amendment.”

Walters also observed that the Utah bill could face trouble in the form of the Dormant Commerce Clause, which limits states’ ability to pass legislation that discriminates against or unreasonably burdens interstate commerce.

Walters’ colleague Corey Silverstein, commenting for the same article, was less optimistic.

“After the state-by-state pummeling of AV laws, this is only the beginning of another new trend of anti-adult and anti-free-speech laws that the entire industry needs to prepare for,” Silverstein said, also predicting that a challenge to the Utah bill, should it become law, would be unlikely to succeed.

One thing is certain: The Utah “porn tax” bill won’t be the end of state governments seeking to impose further regulatory burdens on the adult entertainment industry. Emboldened by their success in establishing age verification requirements, state legislatures across the country can be relied upon to cook up additional hurdles to put in the path of adult businesses, performers, content creators and anyone else engaged in expressing themselves through sexually explicit materials.

GitHub Purges Adult Game Developers, Offers No Explanation

Something strange started rippling through a small, niche corner of the internet not long ago. Developers who build mods and plugins for hentai games and even interactive sex toys began waking up to missing repositories, locked accounts, and dead links. GitHub, the place many of them had treated like home base, had quietly pulled the rug out. No warning. No explanation. Just… gone.

From conversations within the community, the rough headcount quickly took shape: somewhere between 80 and 90 repositories, representing the work of roughly 40 to 50 people, vanished in a short window. Many of the takedowns seemed to cluster around late November and early December. A large number of the affected accounts belonged to modders working on games from Illusion, a now-defunct Japanese studio known for titles that mixed gameplay with varying degrees of erotic content. One banned account alone reportedly hosted contributions from more than 30 people across 40-plus repositories, according to members of the modding scene.

What made the situation feel especially surreal was the silence. Most suspended developers say they were never told which rule they’d broken—if any. Their accounts simply stopped working. Several insisted they’d been careful to stay within GitHub’s acceptable use guidelines, avoiding anything overtly explicit. The code was functional, technical, sometimes cheeky in naming, but never pornographic. At least, not in the way most people would define it.

“Amongst my repositories there were no explicitly sexual names or images anywhere in the code or the readme, the most suggestive naming would be on the level of referencing the dick as ‘the men thing’ or referencing the sex as ‘huffing puffing,’” one developer, Danil Zverev, told me. He makes plugins for an Illusion game called Koikatsu. Zverev had been using GitHub for this purpose since 2024, but on November 18, his GitHub page was “completely deleted,” he said. “No notifications anywhere, simply a 404 error when accessing the page and inability to log in on the web or in the mobile app. Also it does not allow me to register a new account with the same name or email.”

The timing raised eyebrows. GitHub had updated its acceptable use policies in October 2025, adding language that forbids “sexually themed or suggestive content that serves little or no purpose other than to solicit an erotic or shocking response, particularly where that content is amplified by its placement in profiles or other social contexts.” The policy explicitly bars pornographic material and “graphic depictions of sexual acts including photographs, video, animation, drawings, computer-generated images, or text-based content.”

At the same time, the policy leaves room for interpretation. “We recognize that not all nudity or content related to sexuality is obscene. We may allow visual and/or textual depictions in artistic, educational, historical or journalistic contexts, or as it relates to victim advocacy,” GitHub’s terms of use state. “In some cases a disclaimer can help communicate the context of the project. However, please understand that we may choose to limit the content by giving users the option to opt in before viewing.”

Zverev didn’t bother appealing. He said writing to support felt pointless and chose instead to move on to another platform. Others tried to fight it—and found themselves stuck in limbo.

A developer who goes by VerDevin, known for Blender modding guides, utility tools, and plugins for the game Custom Order Maid 3D2, said users began reporting trouble accessing his repositories in late October. Oddly, he could still see his account when logged in, but not when browsing while logged out.

“Turned out, as you already know, that my account was ‘signaled’ and I had to purposefully go to the report section of Github to learn about it. I never received any notifications, by mail or otherwise,” VerDevin told me. “At that point I sent a ticket asking politely for clarifications and the proceedings for reinstatement.”

The response from GitHub Trust & Safety was vague and procedural: “If you agree to abide by our Terms of Service going forward, please reply to this email and provide us more information on how you hope to use GitHub in the future. At that time we will continue our review of your request for reinstatement.”

VerDevin replied the next day, agreeing to comply and offering to remove whatever GitHub considered inappropriate—despite still not knowing what that was. “I did not take actual steps toward it as at that point I still didn’t know what was reproach of me,” they said.

A full month passed before GitHub followed up. “Your account was actioned due to violation of the following prohibition found in our Acceptable Use Policies: Specifically, the content or activity that was reported included multiple sexually explicit content in repositories, which we found to be in violation of our Acceptable Use Policies,” GitHub wrote.

“At that point I took down several repositories that might qualify as an attempt to show good faith (like a plugin named COM3D2.Interlewd),” they said. GitHub restored the account on December 17—weeks later, and just one day after additional questions were raised about the ban—but never clarified which content had triggered the action in the first place.

Requests for explanation went unanswered. Even when specific banned accounts were flagged to GitHub’s press team, the response was inconsistent. Some accounts were reinstated. Others weren’t. No clear reasoning was ever shared.

The whole episode highlights a problem that feels painfully familiar to anyone who’s worked on the edges of platform rules: adult content policies that are vague, inconsistently enforced, and devastating when applied without warning. These repositories weren’t fringe curiosities—they were tools used by potentially hundreds of thousands of people. The English-speaking Koikatsu modding Discord alone has more than 350,000 members. Another developer, Sauceke, whose account was suspended without explanation in mid-November, said users of his open-source adult toy mods are now running into broken links or missing files.

“Perhaps most frustratingly, all of the tickets, pull requests, past release builds and changelogs are gone, because those things are not part of Git (the version control system),” Sauceke told me. “So even if someone had the foresight to make mirrors before the ban (as I did), those mirrors would only keep up with the code changes, not these ‘extra’ things that are pretty much vital to our work.”
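
Sauceke’s distinction is worth unpacking: a mirror clone copies everything Git itself stores (branches, tags, commit history), but issues, pull requests, and release assets live only in GitHub’s own database and have to be exported separately through its API. A rough sketch of what a pre-ban backup would have needed, using a hypothetical repository name:

```python
import json
import subprocess
import urllib.request

REPO = "example-owner/example-mod"  # hypothetical repository

# Step 1: a mirror clone captures all Git data -- every branch, tag,
# and commit -- but nothing that lives only on GitHub's servers.
subprocess.run(
    ["git", "clone", "--mirror", f"https://github.com/{REPO}.git"],
    check=True,
)

# Step 2: releases (and their attached build files) are GitHub features,
# not Git data, so they must be pulled down via the REST API.
with urllib.request.urlopen(f"https://api.github.com/repos/{REPO}/releases") as resp:
    releases = json.load(resp)

for release in releases:
    for asset in release.get("assets", []):
        print(release["tag_name"], asset["name"], asset["browser_download_url"])
```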

GitHub eventually reinstated Sauceke’s account on a Tuesday—seven weeks after the original suspension—following renewed questions about the bans. Support sent a brief note: “Thank you for the information you have provided. Sorry for the time taken to get back to you. We really do appreciate your patience. Sometimes our abuse detecting systems highlight accounts that need to be manually reviewed. We’ve cleared the restrictions from your account, so you have full access to GitHub again.”

Even so, the damage lingers. In Sauceke’s account and others, including the IllusionMods repository, release files remain hidden. “This makes the releases both inaccessible to users and impossible to migrate to other sites without some tedious work,” Sauceke said.

Accounts may come back. Repositories might be restored. But for many developers, the trust is already gone—and that’s the kind of thing that doesn’t reinstall quite so easily.

GitHub isn’t just another code host—it’s the town square for open-source developers. For adult creators especially, who are used to being quietly shoved to the margins everywhere else, visibility there actually matters. It’s how people find each other, trade ideas, and build something that feels bigger than a solo side project. “It’s the best place to build a community, to find like-minded people who dig your stuff and want to collaborate,” Sauceke said. But if this wave of bans stretches beyond hentai game and toy modders, they warned, it could trigger a slow exodus. Some developers aren’t waiting around to find out, already packing up their repositories and moving them to GitGoon, a platform built specifically for adult developers, or Codeberg, a nonprofit, Berlin-based alternative that runs on a similar model.

Nevada’s Legal Sex Workers Claim They’re Being Muted on X

It didn’t happen slowly. It wasn’t subtle. Within the past month, legal Nevada sex workers have been hit with a sudden, sweeping wave of account suspensions on X, the platform once known as Twitter. Not for doing anything illegal. Not for soliciting crimes. These are licensed workers, operating in the only state where brothel-based sex work is legal — and yet their voices are vanishing from a platform that once wrapped itself in the language of free speech.

That promise came straight from Elon Musk himself when he set his sights on buying Twitter. At the time, he framed the platform as something almost sacred, saying:

“Free speech is the bedrock of a functioning democracy, and Twitter is the digital town square.”

He followed that with another line meant to reassure skeptics:

“By ‘free speech’ I simply mean that which matches the law.”

By that definition, Nevada sex work clearly qualifies.

Prostitution is legal in Nevada when it takes place inside licensed brothels, as outlined in Nevada Revised Statutes 201.354. Counties are empowered to license and regulate these brothels under Nevada Revised Statutes 244.345, while workers comply with additional health and safety standards set by the state. At present, six counties operate legal brothels. This isn’t a loophole or a gray area — it’s a fully regulated, lawful industry.

At first glance, it might look like X is simply enforcing broad rules around adult content. But the reality cuts deeper. When legal Nevada sex workers lose their accounts, they’re erased from public conversation — conversation that increasingly lives and breathes on platforms like X. What’s left behind is a so-called “digital town square” where only certain voices are allowed to stay standing.

Nevada sex workers understand exactly what’s at stake when they’re shut out. Not long ago, anti-brothel groups attempted to dismantle the legal system through ballot initiatives. When voters heard directly from sex workers, those efforts failed — decisively. In the 2018 Lyon County referendum, for example, nearly 80 percent of voters rejected a proposed brothel ban.

That wasn’t an accident. When sex workers are able to speak publicly, explain how the licensed system actually functions, and share their lived experiences, people listen. Voters learn about the safeguards, the structure, and why legal brothels exist in the first place — not from headlines or fear campaigns, but from the people inside the system.

Silencing those voices on X means the public hears less from those with firsthand knowledge. Anti-sex-work narratives remain visible, amplified, and largely unchallenged. The workers most affected by stigma and policy decisions fade into the background.

This isn’t just about clumsy algorithms sweeping up adult content. It’s about who gets to participate in conversations that can shape laws, livelihoods, and lives. Platforms don’t just host debate — they quietly curate it by deciding who stays and who disappears.

When licensed Nevada sex workers are removed from social media, the public square stops reflecting reality. The debate tilts. The story becomes one-sided. And the people whose livelihoods are on the line — most of them women — lose the chance to speak for themselves.

Maybe that’s the most unsettling part. If this can happen to a group operating legally, transparently, and within the law, it raises an uncomfortable question: who’s next when an algorithm decides a voice is inconvenient?

Age Verification Push Sends Millions of Britons to Unregulated Porn Sites, Charity Says

Something strange has been happening since age verification checks quietly became part of everyday internet life in the UK. You’d think stricter rules would close doors. Instead, a lot of people seem to be wandering down darker hallways.

New research from the Lucy Faithfull Foundation suggests that nearly 45 per cent of UK porn users have visited websites without age verification checks since the rules came into force last summer under the Online Safety Act.

The poll, which surveyed more than 3,700 people across Britain, found that 39 per cent of those who visited these sites ended up watching content that made them feel uncomfortable. Even more telling, 40 per cent said what they saw was enough to put them off returning altogether.

The charity, which focuses on preventing online child sexual abuse, has warned that these unregulated spaces can quietly increase the risk of people stumbling into harmful — and illegal — material.

Vicky Young, who leads the Stop It Now UK and Ireland anonymous helpline, said these sites can become a dangerous stepping stone toward indecent images of children.

“We work with people who have looked at indecent images of children to try and address that behaviour, to help support them to change their behaviour,” she said.

“One of the things that people say to us frequently is that they started looking at legal adult pornography and that their behaviour escalated. In part they were spending maybe longer online, but also the sort of content that they were looking at became more extreme and often started getting younger, and that’s when they then crossed into illegal behaviour, so looking at indecent images of children.”

That pattern — the slow creep from curiosity to something far more serious — is what worries the charity most.

“Because of that pathway and that coming out constantly in conversations we have with people, it concerns us that if people are accessing sites where there is this riskier content, that actually they are putting themselves at a higher chance of accessing indecent images,” she said.

“Sometimes that might not be intentional in the beginning, but what people tell us is that actually, if they come across those images as part of their other pornography, that then sparks curiosity. There’s something that kind of adds to the excitement around the risk, and they don’t necessarily stop at one image. They actually then start looking for more images.”

The survey also revealed a quieter anxiety bubbling under the surface. Nearly 30 per cent of respondents said they were worried about how much pornography they consume. That concern was highest among young men aged 18 to 24, with more than half admitting it’s something that troubles them — a group the Foundation describes as particularly vulnerable.

At the same time, the rules appear to be forcing a moment of self-reflection for many. Almost 47 per cent said they’ve reduced how much pornography they watch since age checks were introduced, while 55 per cent said the changes made them stop and think about their habits.

Recent data backs that up. Enforcement of “highly effective age assurance,” the Online Safety Act’s term for mandated age checks, led to an immediate drop in traffic to major porn sites from late July. But as one door closed, another cracked open: VPN use surged as people looked for ways around the new barriers.

The UK’s most visited adult site recorded a drop of around 1.5 million viewers year on year, falling from 11.3 million in August 2024 to 9.8 million this August.

Meanwhile, VPN usage more than doubled after the rules came in, jumping from roughly 650,000 daily users to a peak of over 1.4 million in mid-August 2025. Although that number has since dipped, it still hovered around 900,000 by November.
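
For what it’s worth, the quoted figures hang together:

```python
aug_2024, aug_2025 = 11_300_000, 9_800_000  # top UK adult site, monthly viewers
print(aug_2024 - aug_2025)                  # 1,500,000 fewer, a ~13% drop

vpn_before, vpn_peak = 650_000, 1_400_000   # daily UK VPN users
print(vpn_peak / vpn_before)                # ~2.15, i.e. "more than doubled"
```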

In other words, fewer people are walking through the front door — but plenty are still trying the side entrance.

Dr Alexandra Bailey, head of psychology at the Foundation and an associate professor at the University of Roehampton, said the intention behind age verification is sound, but the consequences are more complicated.

“Age verification is vital to protect children, and we fully support it,” she said. “But we also need to recognise that some adults are choosing riskier sites to avoid age checks. These sites can expose people to harmful material, including illegal content depicting child sexual abuse. Even if you’re not looking for it, you could encounter it — and that can have serious life-changing consequences.”

She added that the rules have created a pause many people probably needed, but not everyone is responding in a safe way.

“Age verification is also prompting adults to reflect on their online behaviour, which can be a good thing for people worried about their porn use. But we need to address the risks for those who are turning to sites that avoid the new regulations.

“Every day, our advisors speak to people whose pornography use has spiralled into something much more harmful. We know embarrassment can stop people from reaching out, but confidential help is available. If you’re worried about your own behaviour or someone else’s, contact Stop It Now before it’s too late.”

Sometimes the most dangerous part isn’t the rule itself — it’s what people do when they decide to dodge it.

When HBO’s Industry Meets the Age Verification Reckoning

When they decided to take on age verification in their latest season, Industry cocreators Konrad Kay and Mickey Down didn’t expect to wander straight into a political minefield. It probably felt, at first, like one more sharp storyline—edgy, timely, a little dangerous in the way good TV often is. But sometimes a writers’ room accidentally opens a door to something bigger. And once it’s open, there’s no quietly closing it again.

“It was in the ether of British politics, but it wasn’t front and center when we started writing the scripts or shooting it, and then it really flared up as a kind of front-page-of-BBC topic of conversation,” Kay says.

Season 4 of HBO’s sexy, darkly funny financial drama—premiering Sunday—pushes Industry even further beyond the blood-slick trading floors that first defined it. This time, the story spills into tech, porn, age verification, and the uncomfortable politics sitting between them. Early in the season, tensions rise inside Tender, a fintech firm fresh off its IPO, as executives debate whether to keep processing payments for Siren, an adult platform in the OnlyFans mold. Siren—and other porn and gambling businesses—account for a sizable slice of Tender’s revenue. But looming threats of new age-verification laws and a rising tide of anti-porn rhetoric from the UK’s Labour Party have some leaders wondering if reputational cleanup might be more profitable than cashing controversial checks. It’s boardroom fear dressed up as moral clarity, the kind that tends to surface right before regulators do.

In the real world, the UK’s Online Safety Act—requiring age verification to access porn and other restricted content—didn’t take effect until July 2025, long after Kay and Down had mapped out this season’s arc. Still, the parallels are hard to ignore. Platforms like Pornhub saw UK traffic plunge by nearly 80 percent after the rules kicked in, and similar pressures are mounting in the U.S., where roughly half of all states now enforce some form of age-verification law. Even Capitol Hill is circling the issue: in December alone, lawmakers considered 19 bills aimed at protecting minors online. Critics, meanwhile, argue that several of those proposals stray into unconstitutional territory. It’s messy, unresolved, and very much still unfolding.

“It’s kind of shown how fragile free speech absolutism is,” says Down, pointing to the “wildly different” reactions the issue has provoked—from puritan instincts cropping up in liberal circles to a more blunt, censor-first “shut everything down” posture on the conservative side. And that tension, hanging in the air, feels like the real cliffhanger. Not who wins the argument—but what gets lost while everyone’s busy shouting.

Utah Senator Floats Porn Tax to Pay for Age-Verification Enforcement

SALT LAKE CITY—There are some ideas that arrive quietly and others that walk in like they own the place. This one does the latter. At the opening of Utah’s new legislative session, a Republican lawmaker dropped a bill that would tax online adult content, funneling the money toward age-verification enforcement and teen mental health programs.

Sen. Calvin R. Musselman, who represents the small town of West Haven, is the driving force behind Senate Bill (SB) 73. The proposal introduces what it calls a “material harmful to minors tax,” set at seven percent of the “gross receipts” from sales of content classified under that label.

SB 73 has been formally introduced but hasn’t yet landed in a committee. Even so, the odds of it clearing the legislature are widely considered high.

The bill defines “gross receipts” as “the total amount of consideration received for a transaction […] without deduction for the cost of materials, labor, service, or other expenses.” In other words, it’s the top line, not the leftovers.

And the reach is… expansive. The tax would apply to “the gross receipts of all sales, distributions, memberships, subscriptions, performances, and content, amounting to material harmful to minors that is: (a) produced in this state; (b) sold in this state; (c) filmed in this state; (d) generated in this state; or (e) otherwise based in this state.” That’s a wide net, and it’s not subtle about it.

Because of that scope, the tax wouldn’t just hit one corner of the industry. Producers, creators, platforms—anyone touching qualifying content—would likely feel it. And it wouldn’t exist in a vacuum. The levy would stack on top of existing obligations, including Utah’s digital sales tax and other state fees.

Revenue from the tax would flow into a newly created government account, earmarked for teen mental health treatment through the state Department of Health and Human Services. It’s worth noting that Utah is among the states that formally frame pornography consumption as a public health crisis, a position tied to the still-contested concept of “pornography addiction.”

The bill doesn’t stop at taxation. It also introduces a recurring $500 annual fee, paid into accounts overseen by the Division of Consumer Protection. This so-called “notification fee” would apply to companies producing content deemed “harmful to minors” and is tied directly to age-verification compliance.

Those funds would be used by the Division to monitor compliance in a system modeled after the United Kingdom’s Ofcom framework. Companies would need to file the notification annually. Miss that step, and the penalty jumps to $1,000 per day until the paperwork—and compliance—are in order.
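
Putting the bill’s numbers together, here is a back-of-the-envelope sketch of a hypothetical Utah business’s annual exposure under SB 73 as drafted. The revenue figure and days-late count are illustrative assumptions; existing levies like the digital sales tax, which the new tax would stack on top of, are left out.

```python
def sb73_annual_exposure(gross_receipts: float, days_late: int = 0) -> float:
    """Illustrative SB 73 math for a hypothetical in-scope business.

    Excludes existing Utah taxes (e.g. the digital sales tax), which
    the bill's levy would stack on top of.
    """
    harmful_to_minors_tax = 0.07 * gross_receipts  # 7% of gross receipts
    notification_fee = 500.0                       # annual fee to the Division
    late_penalty = 1_000.0 * days_late             # $1,000/day after a missed notification
    return harmful_to_minors_tax + notification_fee + late_penalty

# e.g. $200,000 in gross receipts, notification filed on time:
print(sb73_annual_exposure(200_000))  # 14500.0, i.e. $14,500
```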

Utah, of course, has already been down this road. It was one of the first states to pass a statewide age-verification law structured as a “bounty law,” allowing private individuals to sue on the state’s behalf over noncompliance. That approach famously led Aylo, the owner of Pornhub, to block Utah IP addresses, just as it has done in other states with similar laws.

Utah wouldn’t be alone in adding a porn-specific tax to the mix. Alabama already has one on the books, imposing a ten percent levy on top of existing digital goods and sales taxes.

And the idea is still spreading. In Pennsylvania, a bipartisan pair of state senators recently announced plans to propose a measure that would tax online pornography subscriptions in that state’s digital marketplace.
