Political Attacks

Missouri Age-Verification Regulation Takes Effect November 30th

Missouri flag

Missouri’s age-verification regulation, 15 CSR 60-18, kicks in on Sunday, November 30. It arrives quietly, almost like a new rule taped to the front door of the internet—one most people won’t notice until they run into it.

Under Missouri’s rule, any site where at least 33⅓% of content is considered harmful to minors must verify a visitor’s age before letting them in. The state signs off on methods like digital IDs, government-issued identification, or other systems that confirm age through transactional data. If a platform thinks it has a better solution, it can pitch its own—so long as it proves it works just as well.

Violating the rule isn’t just a slap on the wrist. The state treats it as “an unfair, deceptive, fraudulent, or otherwise unlawful practice” under the Missouri Merchandising Practices Act. If regulators decide a violation was done “with the intent to defraud,” it escalates into a Class E felony. Each visit to a non-compliant site counts as a separate offense, with penalties capped at $10,000 per day. There’s no option for private lawsuits; this is the state’s show.

For businesses, the message is simple but unsettling: if you might fall under the rule, read the fine print, understand the liability, and protect yourself. The consequences aren’t theoretical—they’re baked in. And as laws like this multiply, compliance is becoming less about checking a box and more about navigating a moving target with stakes that touch real people and their privacy.

Because once the government decides how adults must prove their age online, the question stops being, Can you follow the rules?

It becomes, What do those rules change about the way we experience the internet at all?


FSC Unveils Updated Toolkit to Help Sites Navigate Age-Verification Laws

Free Speech Coalition logo

Earlier this year, a toolkit dropped from the Free Speech Coalition that was supposed to help adult websites navigate the chaos of U.S. age verification laws. On paper, it was about compliance. In reality, it spoke to something bigger—how to follow the law without sacrificing privacy, free expression, or basic human dignity in the process. The updated version arrives after months of legal whiplash and real-world testing, refined by feedback from the people actually living with these requirements. It’s not just a rulebook; it’s a survival guide for an industry being legislated into a corner.

And honestly, it couldn’t have come at a better time.

Laws regulating sexual content online aren’t slowing down. They’re spreading. States are experimenting with different enforcement mechanisms like they’re swapping cocktail recipes—ID uploads here, age-estimation scans there, endless demands for personal data everywhere. What counts as compliance in one state can trigger fines in another. Platforms are stuck either bending to every new rule or blocking entire populations just to avoid liability.

Some people call that safety. Others see it as the invention of a digital checkpoint system where adulthood must be proven over and over again.

The updated toolkit tries to offer a middle path: protect minors without building a surveillance state. That means emphasizing privacy-preserving verification methods, data minimization, and safeguards against turning porn sites into honeypots for identity theft. When your sexual curiosity can be cross-referenced with a government database, it’s not hard to imagine how badly that could go.

But this isn’t just about porn. It’s about how much of yourself you should have to reveal simply to access legal content. If a state can require ID to watch an adult video, why couldn’t it do the same for BDSM forums, queer education sites, or reproductive health information? The slope may not be slippery—it might already be greased.

There’s also the uncomfortable truth that “protecting kids” has become a political Swiss Army knife. Behind the moral language are groups who openly want to make adult content inaccessible altogether, not just to minors. Age verification becomes the first domino rather than the final safeguard. When lawmakers start treating porn the way others treat fentanyl, it’s worth asking who gets to define harm — and who gets punished in the process.

Meanwhile, the people enforcing these laws rarely understand how the internet works. The burden falls on smaller platforms, independent creators, and marginalized workers who already operate under scrutiny. Sex workers were dealing with censorship long before age-verification laws existed. Now, they’re being folded into legislation written by people who’ve never considered how someone pays rent by selling a video clip.

The irony? The more governments tighten restrictions, the faster users migrate to unregulated foreign sites where consent and safety checks don’t exist at all. The “protection” ends up exposing people to worse content, not preventing it.

If lawmakers truly cared about reducing harm, they would fund education, promote ethical production standards, and support platforms that actually moderate content responsibly. Instead, the system encourages the exact opposite: drive traffic to the shadows, then blame the shadows for being dark.

The toolkit is trying to hold the line—compliance without capitulation. It’s a reminder that safety and privacy don’t have to be adversaries. They can coexist, but only if laws are written by people who understand what’s at stake for users and creators.

Because asking adults to prove who they are before they can access legal sexual content isn’t just a technical requirement. It’s a worldview. One where the state sits in the bedroom doorway holding a clipboard, deciding who gets to come inside.

And once that door closes, it rarely opens back up.


Pornhub Pushes Tech Giants to Adopt Device-Level Age Verification

Pornhub banner at AVN

Letters went out this week to Apple, Google, and Microsoft, urging them to build age verification directly into devices rather than forcing adults to scan IDs on every website. The message wasn’t subtle: fix this at the operating-system level, or the internet keeps getting messier.

“Based on our real-world experience with existing age assurance laws, we strongly support the initiative to protect minors online,” wrote Anthony Penhale, chief legal officer for Aylo, the company behind Pornhub, Brazzers, Redtube, and YouPorn. “However, we have found site-based age assurance approaches to be fundamentally flawed and counterproductive.”

The letter argues that traditional, site-by-site checks have “failed to achieve their primary objective: protecting minors from accessing age-inappropriate material online.” Instead, Aylo says the solution should live inside the device itself—confirm a user’s age once on a phone or tablet, then share that “age signal” through an API with adult sites when needed.
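The letter doesn’t specify a protocol, but the “age signal” idea can be sketched in a few lines. The sketch below is purely illustrative—the function names and the HMAC scheme are assumptions, not Aylo’s or any OS vendor’s actual API, and a real design would use public-key signatures so that websites could verify tokens without being able to mint them:

```python
import base64
import hashlib
import hmac
import json

# Illustrative only: a symmetric key stands in for OS-held credentials.
# A production scheme would use asymmetric signatures (sites hold only
# a public key) so no website could forge an "adult" token.
DEVICE_KEY = b"secret-provisioned-at-os-level"

def issue_age_signal(is_adult: bool) -> str:
    """The OS verifies age once, then mints a minimal signed claim."""
    claim = json.dumps({"adult": is_adult}).encode()
    sig = hmac.new(DEVICE_KEY, claim, hashlib.sha256).hexdigest()
    return base64.b64encode(claim).decode() + "." + sig

def verify_age_signal(token: str, key: bytes) -> bool:
    """A site checks the signature and reads only the yes/no claim --
    no name, no ID document, no birth date ever leaves the device."""
    payload, sig = token.rsplit(".", 1)
    claim = base64.b64decode(payload)
    expected = hmac.new(key, claim, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    return json.loads(claim)["adult"] is True
```

The privacy argument hinges on that last function: the site learns a single boolean, while the ID check itself happens once, on the device, out of the site’s reach.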

The timing isn’t random. Age verification laws are spreading across the US and UK, forcing users to upload government IDs or personal documents—often through third-party systems—just to watch explicit content. Twenty-five US states have passed some version of these laws, each with its own rules.

Pornhub pulled out of most of those states. The result? A massive drop in traffic everywhere compliance was required. In Louisiana, complying meant losing 80 percent of site visits. In the UK, where similar rules came into play under the Online Safety Act, traffic dropped almost 80 percent as well.

Aylo argues that pushing verification onto external services not only endangers privacy—it pushes people to sites that don’t check anything at all.

“We have seen an exponential surge in searches for alternate adult sites without age restrictions or safety standards at all,” said Alex Kekesi, vice president of brand and community at Pornhub.

She hopes tech companies eventually align with Aylo’s approach, especially after California passed the Digital Age Assurance Act (AB 1043), which requires app stores to verify ages before users download apps. “This is a law that’s interesting because it gets it almost exactly right,” she said.

Google responded by saying it’s working on new tools. “Google is committed to protecting kids online, including by developing and deploying new age assurance tools like our Credential Manager API that can be used by websites,” spokesperson Karl Ryan said. The company also noted that its app store already prohibits adult entertainment apps and that platforms like Aylo will still have to build their own compliance systems.

Microsoft didn’t comment directly, but pointed to a policy statement saying that “age assurance should be applied at the service level, target specific design features that pose heightened risks, and enable tailored experiences for children.”

Apple also didn’t comment directly, instead pointing to its existing safety documentation and noting that content filters are automatically enabled for users under 18. A recent update requires children under 13 to have designated accounts with built-in restrictions. Apple currently has no method to force every website to integrate a universal age-check API.

Pornhub says the existing legal framework isn’t working regardless. “The sheer volume of adult content platforms has proven to be too challenging for governments worldwide to regulate at the individual site or platform level,” said Kekesi. Aylo claims that verifying age once per device—rather than at every website—would protect privacy while still keeping minors out.

Research backs up the circumvention problem. Recent studies from New York University and the Phoenix Center say current laws fail because people simply route around them—using VPNs or migrating to sites that ignore the regulations entirely.

“Platform-based verification has been like Prohibition,” said Mike Stabile, director of public policy at the Free Speech Coalition. “We’re seeing consumer behavior reroute away from legal, compliant sites to foreign sites that don’t comply with any regulations or laws. Age verification laws have effectively rerouted a massive river of consumers to sites with pirated content, revenge porn, and child sex abuse material.” In his words, these laws “have been great for criminals, terrible for the legal adult industry.”

With age checks becoming the norm, anonymity online is disappearing fast—and communities already marginalized are likely to feel it first. Sex workers have been dealing with digital surveillance and censorship for years, and political groups have openly discussed using state laws to “back door” a national ban on online pornography. One playbook for a future Trump administration explicitly calls for doing just that.

The current wave of child-protection legislation is reshaping the internet far beyond adult content. Gaming, social platforms, and online communities are being pulled into the same regulatory orbit. In Australia, for example, minors under 16 will soon be kicked off major social platforms under new enforcement rules.

According to Stabile, that’s not an accident. In the US, he says, the major supporters of these bills fall into two camps: religious organizations that believe pornography shouldn’t exist at all, and identity-verification companies that profit from stricter rules. The first group wants to shrink the adult industry out of existence, while the second expands its market any way it can—even if that means aligning with groups that want to censor sexual content altogether.

And the lawmakers writing these bills? “Even well-meaning legislators advancing these bills have little understanding of the internet,” Stabile said. “It’s much easier to go after a political punching bag like Pornhub than it is Apple or Google. But if you’re not addressing the reality of the internet, if your legislation flies in the face of consumer behavior, you’re only going to end up creating systems that fail.”

People inside the adult industry say they’re not against rules—they just want rules that work. “Keeping minors off adult sites is a shared responsibility that requires a global solution,” said Kekesi. “Every phone, tablet, or computer should start as a kid-safe device. Only verified adults should unlock access to things like dating apps, gambling, or adult content.” She noted that in 2022, the platform introduced a chatbot that directs users searching for child abuse material toward counseling resources. Since then, Pornhub has released annual transparency reports and tightened upload verification.

Major tech companies—including Google, Meta, OpenAI, Snap, and Pinterest—supported California’s new age-authentication bill, and Kekesi sees it as a starting point rather than an endpoint.

“We obviously see that there’s kind of a path forward here,” she said.


Yoti Cashes In as New Online Safety Rules Kick In

Yoti logo

Yoti’s revenue didn’t just rise this year—it exploded, conveniently right as age verification rules tightened under the Online Safety Act. The company reported a 55 percent jump in turnover, hitting £20.3 million for the year ending in March. Funny how the moment the government demands IDs to access half the internet, the firms selling ID checks start printing money.

“Regulatory issues are central to the business and Yoti is expecting to benefit from significant regulatory changes, in both identity and age, both in domestic and overseas markets,” the company said.

Translation: more laws that treat adults like children mean more profit.

“Anticipated regulatory changes in the United Kingdom, France and Australia, in particular, are expected to support the company’s growth.”

And that growth isn’t just about protecting anyone—it’s about building infrastructure where logging into a website slowly starts to resemble crossing a border. The more regulation expands, the more companies like Yoti become gatekeepers to everyday life online.

“Rapid development in the sophistication of both online fraud, deep fakes and the technology to prevent this, means that the market is constantly developing and growing.”

Sure, deepfakes and fraud are real concerns. But when the solution is “show your papers” to browse adult content—or eventually anything deemed “sensitive”—it’s worth asking who really benefits. Right now, it looks less like safety and more like a booming business model built on surveillance.


Age Checks Didn’t Stop Porn Use—They Just Pushed Men Toward Harder, Unregulated Content

Pornsite on a mobile phone

There’s a strange moment that happens when you quit porn: suddenly sex feels… quieter. That’s what happened to Ray*, who stopped watching in July after age verification rules kicked in. At first, sex felt more vanilla. “I have been a little less creative in bed as I’m not trying anything I’ve recently seen [online] with my girlfriend,” he admitted. But it also felt more grounded. “It does feel healthier […] and it’s made ‘normal’ sex a little more exciting.”

If you somehow missed the memo, the UK’s Online Safety Act (OSA) began enforcing age checks on 18+ sites starting July 25. Imagine a bouncer suddenly stationed at the entrance of every porn site in the country—minus the velvet rope and questionable cologne. The idea was to keep minors from stumbling into explicit content. But for adults like Ray, who relied on free tube sites, the price of entry became handing over personal identification. “It’s reduced my porn usage by 99%,” said the 36-year-old workplace trainer. Others haven’t quit—they’ve just changed how they watch… and in some cases, doubled down.

This shift among adults isn’t the stated goal. The law was designed to protect children, especially given that 79% of young people in the UK say they’ve encountered violent porn before turning 18. With 80% public support, legislators believed ID checks were the best way to stop minors from accessing explicit material.

But critics worry the law is a gateway to broader online censorship. Sex workers are already feeling the consequences—forced to sanitize profiles and content across platforms, making it harder to market themselves safely or honestly. And while everyone keeps shouting about protecting kids, almost no one is talking about actual sex education or porn literacy, both of which are crucial for teens learning how to navigate desire and boundaries, and for adults struggling with compulsive habits.

Three months in, the OSA isn’t delivering on its most basic promise. “It will almost certainly reduce the most casual, accidental access to porn for under-18s, but if the question is whether it will stop young people altogether, the honest answer is no,” said Professor Clarissa Smith, co-editor of Porn Studies. “Teenagers are extraordinarily adept at routing around whatever adults prohibit, and they often end up in the less safe corners of the internet to do it.”

Over on Discord, teens have already discovered a workaround using screenshots of video game characters to bypass age checks. “It’s essential that we recognize that many young people will find ways to get around age-gating,” said Paula Hall, founder of The Laurel Centre, which focuses on sex and porn addiction. Government officials haven’t exactly been eager to discuss the results.

But like any prohibition, the law is changing behavior—especially among adults who, legally speaking, should be able to watch whatever consenting adults create. The age checks feel invasive to many. Who wants their kinks attached to a verified identity in a database that may or may not leak?

“Verification is wrong to me,” Ray said. “It’s like telling the fire brigade you’re about to burn your house down. It’s an inconvenience to everyone.” He doesn’t trust verification vendors. “I don’t believe the data is deleted, I don’t believe they are secure, and I don’t trust the location of these services either.”

David*, a film-industry craftsman, felt the same way. “There was no chance of me entering my personal details onto a porn site — and equally, getting a VPN for the purposes of watching porn seemed a bit desperate, so it was quite easy for me to disengage entirely.”

For others, it’s less about paranoia and more about effort. “It’s instant gratification, so I wouldn’t put that kind of effort in,” said Luke*, who works in patient enrollment. “I would never scan my face or hand over my ID. I’m not that concerned about personal security, it’s just [that age verification] is an effort and not worth it for the end result.”

PORN AGE VERIFICATION IS LIKE TELLING THE FIRE BRIGADE YOU’RE ABOUT TO BURN YOUR HOUSE DOWN

In that sense, the OSA is functioning like a convenience tax on horny impulses. Think of it like the plain-packaging laws for cigarettes—just instead of dull tobacco branding, it’s dulling access to bukkake. “For many, the additional time and effort it takes helps to reduce impulsive viewing and may encourage developing wider interests,” Hall noted.

“For some men, having to verify their age adds just enough friction [making an undesirable action more difficult to perform] that they delay or skip a viewing session,” added Smith. “It doesn’t mean they had a compulsive relationship with porn, just that spontaneity is sensitive to obstacles. Friction always changes behavior at the margins.”

Of course, whether porn counts as an addiction is another battle entirely. Hall believes “the language of addiction fits the lived experience.” Others argue it’s closer to a compulsive habit fueled by shame, not dopamine. But either way, a lot of people want to cut back: 80% of regular porn-watching men aged 18–29 in the UK say they’re concerned about their consumption.

For some, the OSA became the unexpected catalyst they needed. “I’ve found a renewed focus on having real fun (not just in the bedroom) and my relationships more generally seem to have blossomed,” said David. Friends have noticed a shift in his mood—“calm positivity,” he called it. “It’s pretty much ended my casual consumption; it was too easy to fall down that rabbit hole during moments of boredom or stress.”

But most men aren’t quitting—they’re adapting. “The larger pattern is displacement […] the desire doesn’t disappear; it detours,” said Smith. “The OSA increases the distance between a moment of wanting and a moment of accessing, but people bridge that distance in creative ways. Men are simply reorganizing their porn habits around the new architectures.”

And in true internet fashion, Reddit’s response was to make jokes about “moving to Norway.” Translation: VPNs are doing numbers. Proton VPN saw an 1,800% spike in sign-ups right after verification launched.

This also explains why headlines celebrating a supposed 77% drop in UK Pornhub traffic miss the plot—traffic is just being rerouted through “other countries.” Coincidentally, 77% of Gen Z say they watch porn regularly. The math speaks for itself.

Some men are using the moment to pay creators directly. “I do have a kink, so I found reconnecting to adult performers who lean into that world much more rewarding,” Ray said. “I buy a clip a month now, and I like the fact that I’m now paying the entertainers for their work.”

Most, however, are not pulling out their credit cards. Free sites still dwarf paid platforms by a massive margin—hundreds of millions of UK visits vs. a fraction of that on subscription-based services.

Edward* chose a different path: erotic literature. “I’d say the largest behavioral shift would be that I am now more open to the concept of readable erotica,” he said. He’s using fantasy instead of autoplay algorithms. “I definitely have been trying to curate my inner sex life using my own fantasies rather than taking the easy option of porn.”

I’VE BEEN CURATING MY INNER SEX LIFE USING MY OWN FANTASIES RATHER THAN TAKING THE EASY OPTION OF PORN

Maybe retro porn will return—DVDs tucked on dusty shelves like vinyl revival for genitals. That’s what one shop owner predicted. But nostalgia rarely wins against pixels and convenience.

More realistically, people are heading to sketchier sites with no age checks and much worse moderation. Ofcom helpfully publishes a running list of these platforms, effectively handing adults a menu of unfiltered content. “The OSA is likely driving adults into a more fragmented, less regulated ecosystem, largely because people are uneasy about handing over ID for sexual content,” Smith said. “We’re probably not going to see less engagement with porn, we’ll see different routes to it — many of them far outside the spaces the law was written for.”

Ray tried it. “I ended up on some unfiltered Eastern European sites, but finding what I wanted was difficult and unfulfilling.”

For others, the shift is darker. “I watch more hardcore stuff on unregulated sites now,” said one anonymous user.

The ripple effects aren’t subtle. Young men already drifting toward misogynistic corners of the internet now have more reason to end up there. “I grew up in an era of porn playing cards being traded in school,” Edward said. “It was pretty tame stuff compared to the likes of choking, degradation and the weird step-sibling shite that makes up large portions of porn sites. This trend and attitudes towards women in general I do find seriously concerning.”

That doesn’t mean porn itself is inherently harmful. Like anything pleasurable, it’s about intention, context, and consent. Performed ethically, and consumed by choice—not compulsion—it can enhance sex lives rather than replace them.

For some men, the OSA nudged them toward healthier patterns. A smaller group discovered new, ethical models of consumption. But for the vast majority, those digital bouncers are just a minor obstacle. They’ll keep flashing IDs—real or borrowed—and spending another night on the tubes.


Congress vs. VPNs: Bold Moves From People Who Don’t Know What a Server Is

VPN

There was a time when the worst thing a lawmaker could do was force you to flash an ID just to look at something mildly sexual online. Turns out that was just the warm-up act. Now, politicians in Wisconsin, Michigan, and a few very enthusiastic copycats have decided the real enemy isn’t porn—it’s privacy itself.

And the new target? VPNs.

Yes, seriously.

Wisconsin’s A.B. 105/S.B. 130 demands that any website hosting content that could possibly be considered “sexual” must implement age verification and block access for anyone using a VPN. The bill also inflates the definition of what counts as material “harmful to minors,” sweeping in everything from discussions of anatomy to basic information about sexuality and reproduction.

It’s part of a trend: conservative lawmakers expanding “harmful to minors” far beyond what courts have historically allowed, pulling in sex education, LGBTQ+ health resources, art, memoirs, medical info—basically anything that makes them clutch their pearls.

Wisconsin’s bill has already passed the State Assembly and is crawling its way through the Senate. If it passes, it could become the first law in the country to effectively criminalize accessing certain content while using a VPN. Michigan tried something similar—requiring ISPs to identify and block VPN connections—but it stalled. Meanwhile, officials in the U.K. are calling VPNs “a loophole that needs closing.”

This isn’t abstract. It’s happening.

And if legislators get their way, it’s going to wreck far more than porn access.

Here’s Why This Is a Terrible Idea

VPNs hide your real location by routing traffic through another server. The site you visit sees the VPN’s IP, not yours. Think of it like using a P.O. box so someone doesn’t know your home address.

So when Wisconsin demands that websites “block VPN users from Wisconsin,” they’re essentially asking websites to perform sorcery. A site sees only the VPN’s exit IP, so it has no way to tell whether the person behind it is sitting in Milwaukee or Mumbai. The tech doesn’t work that way.

Faced with legal risk, websites will either pull out of Wisconsin entirely or block all VPN users everywhere. One poorly drafted state law could break private browsing for the entire internet.

The collateral damage outweighs any hypothetical benefit.

Almost Everyone Uses VPNs

And it’s not just people trying to avoid showing their driver’s license before watching porn.

Businesses rely on VPNs. Remote workers need them. People checking email in a hotel lobby need them. Companies use VPNs to protect employee data, internal communications, and client files.

Students rely on VPNs because universities require them to access academic databases, class materials, and research tools. At the University of Wisconsin-Madison, WiscVPN “allows UW–Madison faculty, staff and students to access University resources even when they are using a commercial Internet Service Provider (ISP).”

Vulnerable people rely on VPNs, too. Domestic abuse survivors use them to hide their location. Journalists use them to protect sources. Activists use them to organize without surveillance. LGBTQ+ people in hostile regions rely on VPNs to access medical guidance and community support. In censorship-heavy countries, VPNs aren’t optional—they’re lifelines.

And then there are regular people who just don’t want to be tracked, profiled, and monitored by corporations and ISPs. That shouldn’t require a moral justification.

It’s a Privacy Nightmare

Block VPNs and suddenly everyone needs to verify their identity with government IDs, credit cards, or biometric data just to access perfectly legal content.

And we all know how that ends: corporate databases leaking browsing histories tied to real names, real IDs, and real consequences.

This has already happened. It will happen again. It’s not a question of if—just when.

Turning mandatory surveillance into law isn’t moral; it’s just invasive.

“Harmful to Minors” Is Not a Blank Check

Under longstanding legal standards, governments can restrict minor access to sexual content only when it “appeals to prurient interests” and lacks serious value for minors.

Wisconsin’s bill bulldozes that definition. It applies to material that simply describes sex or depicts anatomy. That could encompass literature, film, music, medical content, sex-ed resources, LGBTQ+ health information—basically anything human bodies do.

It gets worse. The bill applies to any website where more than one-third of the “material” meets that definition. Suddenly, most social platforms could be considered age-restricted simply for hosting sexuality-related conversations.

And when governments get to decide what topics are “harmful,” the first groups punished are always marginalized ones.

It Won’t Even Work

Let’s imagine this law passes. Here’s what happens:

People bypass it. Instantly.

They’ll switch to homemade VPNs, private proxies, Cloudflare tunnels, virtual machines, or rented servers for a few dollars. The internet routes around censorship like water finding cracks in concrete.

Even if commercial VPNs disappeared overnight, people could just create their own encrypted tunnels. It takes five minutes and a $5 cloud server.
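To illustrate how low that bar really is: a single standard OpenSSH command turns any rented server into a personal encrypted SOCKS proxy. The hostname and username below are placeholders, not a real service:

```shell
# Point a browser's SOCKS5 proxy setting at localhost:1080 and traffic
# exits from the rented server instead of the home connection.
# "user@cheap-vps.example.com" stands in for any $5/month cloud server.
ssh -N -D 1080 user@cheap-vps.example.com
```

No VPN company, no subscription, nothing for a state law to blocklist—just a protocol that ships with every major operating system.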

Meanwhile, students, workers, journalists, abuse survivors, and everyone else gets stuck without privacy or access.

The law solves nothing and breaks everything.

VPNs shouldn’t be required to access legal speech—but they also shouldn’t be criminalized. The real issue is the age verification regime itself: it’s invasive, ineffective, and trivial to circumvent. It harms far more than it protects.

A False Dilemma

People didn’t flock to VPNs because they’re trying to commit crimes. They did it because governments tried to force identity verification onto everyday browsing. Instead of asking why millions of people don’t want to hand over their IDs to random websites, lawmakers decided the problem is the tools protecting them.

The whole premise is backwards.

The question isn’t “How do we keep kids safe online by destroying privacy for adults?”

It’s “Why is surveillance the only solution anyone in power can imagine?”

If the real goal is protecting young people, lawmakers could strengthen digital literacy, offer better parental tools, support education, or address genuine online harms.

Instead, they’re trying to criminalize privacy itself.

VPN bans aren’t about safety. They’re about control. And the irony is almost poetic: the people writing these laws don’t even understand the technology they’re trying to outlaw.


From Panic to Prohibition: The U.K.’s Crackdown on Rough Sex

Choking

Sometimes it feels like the ’70s never ended—just swap the disco for legislation. The U.K. is moving toward criminalizing depictions of choking during sex, framing it as a push to “protect women and girls.” On paper, that sounds noble. In practice, it’s a blanket ban that sweeps up consensual sexual expression between adults.

The ban wouldn’t just target production or distribution—it would criminalize possession. Even if the images are AI-generated. Even if no one is harmed. Even if the entire thing was created by consenting adults who enjoy that kind of play.

Proponents insist the mere existence of choking porn harms women, regardless of context or consent. Never mind that plenty of women actually enjoy “breath play” — whether that’s light choking, pressure, or controlled suffocation. Exposure itself, they argue, is a threat.

And in the name of “protection,” the government is essentially telling women what they can watch, produce, fantasize about, and do with their own bodies. Nothing says empowerment like a legal guardian you never asked for.

Lawmakers aren’t stopping there. Additional proposals would classify publishing sex-worker ads as “pimping,” and could even criminalize paying for webcam performances.

The Choking Amendment

The amendment was introduced on November 3 as part of the Crime and Policing Bill now moving through Parliament. It already cleared the House of Commons and is sitting with the Lords, with committee sessions scheduled through January 2026. Odds of passage? Pretty high. Beyond the choking ban, the bill also includes provisions like outlawing protests outside judges’ and politicians’ homes.

The amendment, proposed by Labour’s Alison Levitt, would make it illegal to possess a pornographic image—defined as any image “produced solely or principally for the purpose of sexual arousal”—if it “portrays, in an explicit and realistic way, a person strangling or suffocating another person” and if “a reasonable person looking at the image would think that the persons were real.”

Publishing such material would also be illegal, with publishing defined broadly as “giving or making it available to another person by any means.”

So anyone who has downloaded BDSM content over the past decade could be prosecuted. Same goes for fans of the Fifty Shades movies. And naturally, the creators and performers who made the work in the first place.

Under the amendment, possession could mean up to two years in prison. Publishing could mean up to five.

There’s a narrow defense for people who film themselves engaging in such acts, but only if they directly participated, and only if “the act did not involve the infliction of any non-consensual harm on any person.”

Platforms would also face new obligations. According to government materials, “the depiction of strangulation in pornography will be designated as a priority offense under the Online Safety Act, meaning platforms…will be required to take proactive steps to prevent users from seeing illegal strangulation and suffocation content. This could include companies using automated systems to pre-emptively detect and hide the images, moderation tools or stricter content policies to prevent abusive content from circulating.”

Who Exactly Is Being Protected?

The government doesn’t hide the goal here: controlling private sexual behavior, including things that many women enjoy. Platforms, it says, “will be held accountable [for] ensuring content does not spread, which can lead to normalizing harmful practices in people’s private lives.”

Lately, sexual choking has become a cultural panic point—feminists, conservatives, anti-porn activists, and politicians all claim it’s a porn-driven plague assaulting unsuspecting women. The narrative paints breath play as a gateway to abuse and misogynistic violence.

But research complicates that narrative. Many women enjoy rough sex. Women are often the ones initiating choking. And most of the time, it doesn’t result in physical harm.

One survey of U.S. college and grad students found choking during sex was consensual 92 percent of the time, and “fewer than 1% of participants reported that their partner had ever lost consciousness due to their choking them.” Women, transgender, and nonbinary students were more likely than men to find choking pleasurable.

That matches a kink poll in which nearly 30 percent of women found choking erotic, compared to under 20 percent of men.

Another poll of young Australians found women were more likely to ask for choking than men. Porn was a common entry point—but not the only one. While 61 percent had seen depictions in porn, significant numbers cited movies, friends, social media, and partners.

A recent study found women were more likely than men to find sexual aggression in porn arousing. “About 69 percent of women in the study said they enjoyed at least some aggressive content, compared to 40 percent of men,” wrote Eric W. Dolan. “Women were also more likely than men to report arousal from ‘harder’ forms of aggression, such as choking or gagging, and were more likely to actively seek out pornographic videos that featured aggression.”

Of course, choking can be risky and can be dangerous if done wrong—or maybe even inherently unsafe, depending on who you ask. (Some dominatrixes argue there’s no fully safe way to do it.)

But banning depictions doesn’t give people information to assess risk or negotiate safer play. And if anything, outlawing visuals may just make the behavior edgier without giving anyone better tools to stay safe, as writer Ana Valens points out.

If the real goal is protecting women, then education, harm-reduction, and honest conversations would go much further than criminalizing fantasies. Acting like this is some dark patriarchal conspiracy—rather than a thing many women choose and enjoy—turns a real conversation about safety into moral panic cosplay.

Other Porn Amendments in the U.K. Crime Bill

The choking ban is just one piece.

One proposal would allow performers to retroactively revoke consent for published content, requiring platforms to remove videos anytime someone featured asks—no matter prior contracts or payments. The language is messy and doesn’t explain how publishers are supposed to validate consent across already-released material.

The amendment also states that a person “commits an offense if they publish or allow or facilitate the publishing of pornographic content online where it has not been verified that…every individual featured…is an adult.” Taken literally, this could make anyone uploading porn liable if any content on the platform includes someone under 18.

Violators could face two years in prison and fines. Platforms could be fined up to £18 million or 10% of global revenue. The government could also order hosting providers, registrars, or ISPs to cut ties.

Another amendment would make it illegal to create “an indecent photograph” in which an adult pretends to be a minor. That includes fantasy content where no actual minors are involved. It also creates a crime for sharing content—“including text shared on internet forums”—that “advocates or celebrates” adults having sex with minors. That kind of language easily sweeps up fictional narratives, literary analysis, and speech that doesn’t harm anyone.

Another proposal targets software designed to create or alter images of people “in an intimate state.” That sounds like a strike at deepfake and “nudify” tools, but the language is so broad it could criminalize software that creates adult CGI porn—even if used consensually.

In all these cases, the issue isn’t just intent—it’s wording so sweeping that normal sexual content, fictional narratives, and artistic expression could all be caught in the dragnet.

Turning Ad Platforms Into ‘Pimps’ and Webcamming Into Prostitution

The bill also goes after sex work more broadly. An amendment from Mary Goudie would redefine pimping to include any facilitation of prostitution—even when no profit is involved. Publishing ads that facilitate sex work would also count as pimping.

Currently, it’s illegal to cause or control prostitution for financial gain. Under the new language, simply helping someone engage in consensual sex work—letting them borrow a car, offering a ride—could be criminal. Punishment: up to 10 years in prison.

Another amendment would criminalize giving or offering payment in exchange for sexual activity, even when the person being paid is consenting and not coerced. That includes physical contact and situations where someone “touches themselves for the sexual gratification of the other person.” There’s no explicit carve-out for digital performance, which could make paying for cam shows illegal.

Again: up to 10 years in prison.

While the choking ban has sparked headlines, these other proposals have slipped under the radar. Taken together, they paint a picture of lawmakers on all sides eager to police sexuality across the board—online, offline, consensual, fictional, and everything in between.

And it leaves a lingering question: when a government claims to protect us by deciding what we can desire, fantasize about, or consensually do with our own bodies…who is really being protected?

Read More »

Ofcom Targets More Sites as Users Flee to Non-Verified Platforms

Ofcom logo

It feels like every time a major platform caves to age checks, smaller sites quietly slip into the vacuum—and regulators eventually notice. That’s what’s happening now in the U.K., where Ofcom has opened investigations into 20 additional adult sites under the Online Safety Act’s age-assurance rules.

In a statement released Thursday, the agency named five providers that collectively operate the 20 sites under scrutiny: HQPorner, Porntrex, Fapello, XXBrits, and Sun Social Media, which runs playvids.com and peekvids.com.

According to Ofcom, these particular platforms weren’t chosen at random. They appear to have gained new users from the wave of traffic migrating away from larger, fully compliant sites—the ones that turned on age verification last summer and took the hit.

“We have prioritized action against these companies based on the risk of harm posed by the services they operate,” the statement reads. “We have taken particular account of their user numbers, including where we have seen significant increases in their user traffic since age-check laws came into force last summer.”

People have been predicting this shift for years: enforce age-verification on big platforms and users will simply move to smaller, noncompliant sites. Now it’s playing out in real time—major sites reporting steep traffic drops while their competitors grow without the same legal burdens.

Suzanne Cater, Ofcom’s Director of Enforcement, didn’t mince words. “The use of highly effective age assurance to protect children from harmful pornographic content is non-negotiable, and we will accept no excuses for failure. Any service that fails to meet its age-check duties under the Online Safety Act can expect to face robust enforcement action, including significant fines.”

Ongoing Investigations

Alongside the announcement about newly targeted sites, Ofcom also shared updates on a series of ongoing cases.

The agency fined Itai Tech Ltd., operator of the AI “nudification” site Undress.cc, a total of £55,000 for failing to implement required age-assurance measures and for ignoring a statutory information request.

It also issued provisional decisions against 8579 LLC and Kick Online Entertainment for similar issues. “Both providers now have an opportunity to make representations to us before we make our final decisions,” the statement notes.

Meanwhile, investigations into Cyberitic LLC and the operator of xgroovy.com are expanding, as regulators assess whether those companies also failed to properly respond to formal requests for information.

One case did end on a more cooperative note: Ofcom closed its investigation into Trendio Ltd., concluding the provider has “taken steps in good faith towards compliance.”

With these new additions, the total number of platforms Ofcom is actively investigating under the Online Safety Act now sits at 76.

Read More »

Alabama and North Carolina Laws Spark Bans on Creators and Content Across Adult Platforms

Censorship

It started with a post on X that felt less like an announcement and more like a warning shot. Krystal Davis shared that one of her platforms would no longer accept adult content tied to Alabama or North Carolina—and suddenly a lot of creators across the U.S. were scrambling to figure out what this meant for them.

Some Adult Platforms Are Banning Content and Creators From Alabama and North Carolina

The notice laid out the new rules in blunt terms. The platform will reject:

Any productions shot in Alabama or North Carolina.

Any productions featuring talent who legally reside in those states.

Any productions featuring talent whose ID documents were issued by those states.

And it’s not just some vague future plan. The policy is attached to specific launch dates:

Alabama: applies to content shot on or after October 1, 2024.

North Carolina: applies to content shot on or after December 1, 2025.

Why Adult Platforms Are Banning Content From Alabama and North Carolina

Krystal Davis said her notice came from Adult Empire, and another creator reported getting a similar notice from Adult Time. It wouldn’t be surprising if more platforms quietly follow the same route. In a way, it feels like another one of those “small changes” that’ll end up reshaping the industry before anyone has time to react.

But why this move? Why now?

Both states recently passed sweeping laws regulating adult content online—laws that carry enough legal risk that platforms appear to be choosing exclusion over compliance. Instead of building new legal infrastructure, they’re just geoblocking the problem.
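In practice, “geoblocking the problem” usually comes down to a simple lookup against the visitor’s resolved location. A minimal, hypothetical sketch of that decision, assuming the visitor’s state has already been resolved from their IP address by a geo-IP service (which is out of scope here):

```python
# Illustrative sketch only -- real platforms resolve a visitor's location
# from their IP via a geo-IP database before a check like this runs.

RESTRICTED_STATES = {"AL", "NC"}  # states whose laws prompted exclusion

def should_block(visitor_state: str) -> bool:
    """Return True if traffic from this U.S. state should be refused."""
    return visitor_state.upper() in RESTRICTED_STATES

def handle_request(visitor_state: str) -> str:
    # Rather than build per-state compliance, refuse service outright.
    if should_block(visitor_state):
        return "451 Unavailable For Legal Reasons"
    return "200 OK"
```

HTTP even defines a status code for exactly this situation: 451 Unavailable For Legal Reasons (RFC 7725), which some sites return when refusing traffic on legal grounds.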

So let’s unpack the laws behind the panic.

Alabama’s HB164: A Strict Age-Verification and Consent Law With Heavy Penalties

Alabama

HB164 went into effect October 1, 2024, packaged as “consumer protection.” On paper, it reads like safety policy. In practice, it puts massive responsibility on platforms hosting adult content.

1. Mandatory Age Verification for All Adult Sites

Any commercial entity that “knowingly and intentionally publishes or distributes sexual material harmful to minors” must verify users are 18+ using a “reasonable age-verification method.”

And those verification services? They must be designed so they can’t retain user data.

If platforms screw up, they’re exposed to:

Civil lawsuits

Up to $10,000 per violation

Penalties under deceptive trade laws

2. Strict Written-Consent Requirements for All “Private Images”

Before publishing any “private image,” platforms need written, notarized consent from every person depicted—and those records have to be stored for five years.

3. Mandatory Warning Labels on Every Page

Not subtle ones either. We’re talking big, government-scripted warnings like:

“Pornography is potentially biologically addictive…”

“Pornography increases the demand for prostitution, child exploitation, and child pornography.”

4. A 10% Tax on Pornography Produced or Sold in Alabama

Section 10 slaps a 10% gross-receipts tax on memberships, subscriptions, and any material produced or sold in the state.

Why Platforms Are Responding by Blocking Alabama

If you’re a platform, you’re staring at:

High legal liability

Restrictions on data handling

Constant compliance demands

A tax on any content tied to the state

And lawsuit exposure for every alleged violation

At some point, it stops being a legal puzzle and starts being a cost-benefit analysis. And Alabama isn’t worth the math.

North Carolina’s HB805: Extremely Broad “Pornographic Image” Verification Rules

North Carolina

HB805’s adult-content provisions take effect December 1, 2025, and while the bill covers everything from school libraries to “biological sex” definitions, the part that matters to creators is Article 51A.

This isn’t just strict; it’s procedural overkill.

1. Age and Consent Documentation for Every Person in Every Pornographic Image

Before publishing a pornographic image, platforms must verify:

That the person was 18 at the time of creation

That written consent was given for each sex act performed

That written consent was given specifically for distribution

And, crucially: consent for performance does not equal consent to distribute

Platforms must collect:

A full consent form with personal details

A matching government ID

2. Mandatory Removal System With 72-Hour Deadlines

If a performer requests removal, platforms must comply within 72 hours—even if consent was properly documented.

If consent is questioned, content must be pulled down temporarily.

Re-uploads? Permanently banned.

3. Massive Civil Penalties

The Attorney General can impose:

Up to $10,000 per day per image for failure to remove

Up to $5,000 per day for publishing violations

Performers can also sue for $10,000 per day per image.

Why Platforms Are Banning North Carolina Content

HB805 basically forces platforms to:

Re-document performers from NC

Handle disputes more aggressively

Maintain permanent blocks on re-uploads

Maintain 1:1 traceable consent for every act in every piece of content

That’s not a tweak—it’s an entirely new compliance department.

You may also notice the bans include things like:

Talent living in those states

Talent whose IDs originate from those states

Content filmed in those states

This is because the laws follow the people and the production location—not just where the content is uploaded. That means:

An NC resident filmed in Las Vegas? Still a risk.

A performer who moved out of Alabama but still has an AL ID? Risk.

A scene shot in Alabama and uploaded from New York? Still covered.

The jurisdiction sticks like glue.
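Those three scenarios reduce to a single rule: a submission gets flagged if any jurisdictional tie (performer residence, ID-issuing state, or filming location) points at a restricted state. A hypothetical sketch of that rule, not a statement of how any platform actually implements its screening:

```python
# Hypothetical compliance screen: flag content if ANY tie to a
# restricted state exists, since the laws follow the person and the
# production location, not just the upload point.

RESTRICTED = {"AL", "NC"}

def is_risky(residence: str, id_state: str, filmed_in: str) -> bool:
    """True if any jurisdictional tie points at a restricted state."""
    return any(s.upper() in RESTRICTED
               for s in (residence, id_state, filmed_in))
```

Under this rule, an NC resident filmed in Las Vegas, a performer with an old Alabama ID, and a scene shot in Alabama but uploaded from New York all come back flagged.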

Adult platforms aren’t banning performers because they suddenly want to. They’re doing it because Alabama and North Carolina have created a legal terrain where one clerical oversight could turn into six-figure penalties.

Alabama’s HB164 demands notarized consent, strict age verification, no data retention, warning labels, and a 10% tax.

North Carolina’s HB805 requires different consents for each act, ID verification, rapid takedowns, and crushing per-day fines.

Faced with that, some companies are choosing the path of least resistance: eliminating content tied to those states entirely. Will others follow? Probably. Not because they want to—because compliance costs more than creators do.

The laws don’t just restrict porn; they quietly redraw who gets to participate in the industry at all.

Read More »

xHamster Ends Texas AV Lawsuit With $120K Settlement

Xhamster logo

Sometimes a legal fight doesn’t end with a dramatic ruling—just a quiet deal and a check. That’s what happened in Texas, where Hammy Media, the company behind xHamster, agreed to settle a lawsuit over alleged violations of the state’s age verification law with a $120,000 payment.

Texas Attorney General Ken Paxton launched the suit in 2024. The complaint painted the site’s early verification screen as little more than a digital speed bump, arguing, “Minors can simply click almost anywhere on the webpage away from the ‘I’m 18 or older’ button, including the ‘X’ in the top right corner of the message, to dismiss the pop-up message and proceed to the Defendant’s pornographic website … The age verification methods used by the Defendant on its websites cannot be said to verify anything at all.”

The state didn’t start small. Texas initially asked a district court to impose penalties of up to $1.67 million, plus another $10,000 for every day after the filing date—a financial threat large enough to make most companies blink.

The cases stalled for a while as everyone waited for the U.S. Supreme Court to decide whether these types of laws even hold up under the Constitution. The case—FSC v. Paxton, brought by the Free Speech Coalition—became the legal bellwether. In June, the court sided with Texas, declaring the law constitutional and effectively giving other states a green light to move forward with similar efforts. Once that happened, dormant lawsuits snapped back to life.

According to the agreed final order filed Nov. 7, the company made changes quickly. “Promptly after suit was filed, on March 21, 2024, Hammy Media restricted access to its website,” and it has now rolled out the kind of age verification Texas requires. The order also “resolves any and all claims based on the facts alleged in the State’s Petition” and specifies that the settlement isn’t an admission of wrongdoing—just a resolution.

Texas didn’t stop at xHamster. The state filed similar lawsuits in 2024 against Multi Media, the company behind Chaturbate, and Aylo, which operates Pornhub. Chaturbate settled in April; the Aylo case is still moving through the courts.

Read More »