Political Attacks

From Panic to Prohibition: The U.K.’s Crackdown on Rough Sex


Sometimes it feels like the ‘70s never ended—just swap the disco for legislation. The U.K. is moving toward criminalizing depictions of choking during sex, framing it as a push to “protect women and girls.” On paper, that sounds noble. In practice, it’s a blanket ban that sweeps up consensual sexual expression between adults.

The ban wouldn’t just target production or distribution—it would criminalize possession. Even if the images are AI-generated. Even if no one is harmed. Even if the entire thing was created by consenting adults who enjoy that kind of play.

Proponents insist the mere existence of choking porn harms women, regardless of context or consent. Never mind that plenty of women actually enjoy “breath play” — whether that’s light choking, pressure, or controlled suffocation. Exposure itself, they argue, is a threat.

And in the name of “protection,” the government is essentially telling women what they can watch, produce, fantasize about, and do with their own bodies. Nothing says empowerment like a legal guardian you never asked for.

Lawmakers aren’t stopping there. Additional proposals would classify publishing sex-worker ads as “pimping,” and could even criminalize paying for webcam performances.

The Choking Amendment

The amendment was introduced on November 3 as part of the Crime and Policing Bill now moving through Parliament. It already cleared the House of Commons and is sitting with the Lords, with committee sessions scheduled through January 2026. Odds of passage? Pretty high. Beyond the choking ban, the bill also includes provisions like outlawing protests outside judges’ and politicians’ homes.

The amendment, proposed by Labour’s Alison Levitt, would make it illegal to possess a pornographic image (defined as any image “produced solely or principally for the purpose of sexual arousal”) if it “portrays, in an explicit and realistic way, a person strangling or suffocating another person” and if “a reasonable person looking at the image would think that the persons were real.”

Publishing such material would also be illegal, with publishing defined broadly as “giving or making it available to another person by any means.”

So if anyone has downloaded BDSM content over the past decade, they could be prosecuted. Same goes for fans of the Fifty Shades movies. And naturally, the creators and performers who made the work in the first place.

Under the amendment, possession could mean up to two years in prison. Publishing could mean up to five.

There’s a narrow defense for people who film themselves engaging in such acts, but only if they directly participated, and only if “the act did not involve the infliction of any non-consensual harm on any person.”

Platforms would also face new obligations. According to government materials, “the depiction of strangulation in pornography will be designated as a priority offense under the Online Safety Act, meaning platforms…will be required to take proactive steps to prevent users from seeing illegal strangulation and suffocation content. This could include companies using automated systems to pre-emptively detect and hide the images, moderation tools or stricter content policies to prevent abusive content from circulating.”
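To make that “proactive steps” language concrete, here is a minimal, purely hypothetical sketch of the kind of pre-publish moderation gate a platform might bolt onto its upload pipeline. Nothing here comes from the bill or from Ofcom guidance; the classifier, thresholds, and function names are invented placeholders.

```python
# Hypothetical sketch of a pre-publish moderation gate of the sort the
# "proactive steps" language seems to contemplate. The classifier and
# thresholds are placeholders, not anything specified by the bill.
from dataclasses import dataclass

@dataclass
class Upload:
    video_id: str
    frames: list        # decoded frames or thumbnails for the classifier
    uploader_id: str

def strangulation_risk_score(frames: list) -> float:
    """Stand-in for a platform's own ML classifier (returns 0.0 to 1.0)."""
    return 0.0  # dummy value; a real system would run a model here

REVIEW_THRESHOLD = 0.5  # send to human moderation
HIDE_THRESHOLD = 0.9    # hide pre-emptively pending review

def moderate_before_publish(upload: Upload) -> str:
    """Return a publication decision for a single upload."""
    risk = strangulation_risk_score(upload.frames)
    if risk >= HIDE_THRESHOLD:
        return "hidden_pending_review"
    if risk >= REVIEW_THRESHOLD:
        return "queued_for_human_review"
    return "published"

# Example: with the dummy scorer, everything publishes
print(moderate_before_publish(Upload("vid-001", frames=[], uploader_id="u-42")))
```

Even a toy gate like this illustrates the tradeoff the law forces: set the thresholds low and consensual content gets pre-emptively hidden; set them high and the platform risks an enforcement action.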

Who Exactly Is Being Protected?

The government doesn’t hide the goal here: controlling private sexual behavior, including things that many women enjoy. Platforms, it says, “will be held accountable [for] ensuring content does not spread, which can lead to normalizing harmful practices in people’s private lives.”

Lately, sexual choking has become a cultural panic point—feminists, conservatives, anti-porn activists, and politicians all claim it’s a porn-driven plague assaulting unsuspecting women. The narrative paints breath play as a gateway to abuse and misogynistic violence.

But research complicates that narrative. Many women enjoy rough sex. Women are often the ones initiating choking. And most of the time, it doesn’t result in physical harm.

One survey of U.S. college and grad students found choking during sex was consensual 92 percent of the time, and “fewer than 1% of participants reported that their partner had ever lost consciousness due to their choking them.” Women, transgender, and nonbinary students were more likely than men to find choking pleasurable.

That matches a kink poll in which nearly 30 percent of women found choking erotic, compared to under 20 percent of men.

Another poll of young Australians found women were more likely to ask for choking than men. Porn was a common entry point—but not the only one. While 61 percent had seen depictions in porn, significant numbers cited movies, friends, social media, and partners.

A recent study found women were more likely than men to find sexual aggression in porn arousing. “About 69 percent of women in the study said they enjoyed at least some aggressive content, compared to 40 percent of men,” wrote Eric W. Dolan. “Women were also more likely than men to report arousal from ‘harder’ forms of aggression, such as choking or gagging, and were more likely to actively seek out pornographic videos that featured aggression.”

Of course, choking can be risky, dangerous if done wrong, or maybe even inherently unsafe, depending on who you ask. (Some dominatrixes argue there’s no fully safe way to do it.)

But banning depictions doesn’t give people information to assess risk or negotiate safer play. And if anything, outlawing visuals may just make the behavior edgier without giving anyone better tools to stay safe, as writer Ana Valens points out.

If the real goal is protecting women, then education, harm-reduction, and honest conversations would go much further than criminalizing fantasies. Acting like this is some dark patriarchal conspiracy—rather than a thing many women choose and enjoy—turns a real conversation about safety into moral panic cosplay.

Other Porn Amendments in the U.K. Crime Bill

The choking ban is just one piece.

One proposal would allow performers to retroactively revoke consent for published content, requiring platforms to remove videos anytime someone featured asks—no matter prior contracts or payments. The language is messy and doesn’t explain how publishers are supposed to validate consent across already-released material.

The amendment also states that a person “commits an offense if they publish or allow or facilitate the publishing of pornographic content online where it has not been verified that…every individual featured…is an adult.” Taken literally, this could make everyone uploading porn liable if any content on the platform includes someone under 18.

Violators could face two years in prison and fines. Platforms could be fined up to £18 million or 10% of global revenue. The government could also order hosting providers, registrars, or ISPs to cut ties.

Another amendment would make it illegal to create “an indecent photograph” in which an adult pretends to be a minor. That includes fantasy content where no actual minors are involved. It also creates a crime for sharing content—“including text shared on internet forums”—that “advocates or celebrates” adults having sex with minors. That kind of language easily sweeps up fictional narratives, literary analysis, and speech that doesn’t harm anyone.

Another proposal targets software designed to create or alter images of people “in an intimate state.” That sounds like a strike at deepfake and “nudify” tools, but the language is so broad it could criminalize software that creates adult CGI porn—even if used consensually.

In all these cases, the issue isn’t just intent—it’s wording so sweeping that normal sexual content, fictional narratives, and artistic expression could all be caught in the dragnet.

Turning Ad Platforms Into ‘Pimps’ and Webcamming Into Prostitution

The bill also goes after sex work more broadly. An amendment from Mary Goudie would redefine pimping to include any facilitation of prostitution—even when no profit is involved. Publishing ads that facilitate sex work would also count as pimping.

Currently, it’s illegal to cause or control prostitution for financial gain. Under the new language, simply helping someone engage in consensual sex work—letting them borrow a car, offering a ride—could be criminal. Punishment: up to 10 years in prison.

Another amendment would criminalize giving or offering payment in exchange for sexual activity, even when the person being paid is consenting and not coerced. That includes physical contact and situations where someone “touches themselves for the sexual gratification of the other person.” There’s no explicit carve-out for digital performance, which could make paying for cam shows illegal.

Again: up to 10 years in prison.

While the choking ban has sparked headlines, these other proposals have slipped under the radar. Taken together, they paint a picture of lawmakers on all sides eager to police sexuality across the board—online, offline, consensual, fictional, and everything in between.

And it leaves a lingering question: when a government claims to protect us by deciding what we can desire, fantasize about, or consensually do with our own bodies…who is really being protected?


Ofcom Targets More Sites as Users Flee to Non-Verified Platforms


It feels like every time a major platform caves to age checks, smaller sites quietly slip into the vacuum—and regulators eventually notice. That’s what’s happening now in the U.K., where Ofcom has opened investigations into 20 additional adult sites under the Online Safety Act’s age-assurance rules.

In a statement released Thursday, the agency named five providers that collectively operate the 20 sites under scrutiny: HQPorner, Porntrex, Fapello, XXBrits, and Sun Social Media, which runs playvids.com and peekvids.com.

According to Ofcom, these particular platforms weren’t chosen at random. They appear to have gained new users from the wave of traffic migrating away from larger, fully compliant sites—the ones that turned on age verification last summer and took the hit.

“We have prioritized action against these companies based on the risk of harm posed by the services they operate,” the statement reads. “We have taken particular account of their user numbers, including where we have seen significant increases in their user traffic since age-check laws came into force last summer.”

People have been predicting this shift for years: enforce age-verification on big platforms and users will simply move to smaller, noncompliant sites. Now it’s playing out in real time—major sites reporting steep traffic drops while their competitors grow without the same legal burdens.

Suzanne Cater, Ofcom’s Director of Enforcement, didn’t mince words. “The use of highly effective age assurance to protect children from harmful pornographic content is non-negotiable, and we will accept no excuses for failure. Any service that fails to meet its age-check duties under the Online Safety Act can expect to face robust enforcement action, including significant fines.”

Ongoing Investigations

Alongside the announcement of the newly targeted sites, Ofcom also gave an update on a series of ongoing cases.

The agency fined Itai Tech Ltd., operator of the AI “nudification” site Undress.cc, a total of £55,000 for failing to implement required age-assurance measures and for ignoring a statutory information request.

It also issued provisional decisions against 8579 LLC and Kick Online Entertainment for similar issues. “Both providers now have an opportunity to make representations to us before we make our final decisions,” the statement notes.

Meanwhile, investigations into Cyberitic LLC and the operator of xgroovy.com are expanding, as regulators assess whether those companies also failed to properly respond to formal requests for information.

One case did end on a more cooperative note: Ofcom closed its investigation into Trendio Ltd., concluding the provider has “taken steps in good faith towards compliance.”

With these new additions, the total number of platforms Ofcom is actively investigating under the Online Safety Act now sits at 76.


Alabama and North Carolina Laws Spark Bans on Creators and Content Across Adult Platforms


It started with a post on X that felt less like an announcement and more like a warning shot. Krystal Davis shared that one of her platforms would no longer accept adult content tied to Alabama or North Carolina—and suddenly a lot of creators across the U.S. were scrambling to figure out what this meant for them.

Some Adult Platforms Are Banning Content and Creators From Alabama and North Carolina

The notice laid out the new rules in blunt terms. The platform will reject:

Any productions shot in Alabama or North Carolina.

Any productions featuring talent who legally reside in those states.

Any productions featuring talent whose ID documents were issued by those states.

And it’s not just some vague future plan. The policy comes with specific effective dates:

Alabama: applies to content shot on or after October 1, 2024.

North Carolina: applies to content shot on or after December 1, 2025.

Why Adult Platforms Are Banning Content From Alabama and North Carolina

Krystal Davis said her notice came from Adult Empire, and another creator reported getting a similar notice from Adult Time. It wouldn’t be surprising if more platforms quietly follow the same route. In a way, it feels like another one of those “small changes” that’ll end up reshaping the industry before anyone has time to react.

But why this move? Why now?

Both states recently passed sweeping laws regulating adult content online—laws that carry enough legal risk that platforms appear to be choosing exclusion over compliance. Instead of building new legal infrastructure, they’re just geoblocking the problem.

So let’s unpack the laws behind the panic.

Alabama’s HB164: A Strict Age-Verification and Consent Law With Heavy Penalties


HB164 went into effect October 1, 2024, packaged as “consumer protection.” On paper, it reads like safety policy. In practice, it puts massive responsibility on platforms hosting adult content.

1. Mandatory Age Verification for All Adult Sites

Any commercial entity that “knowingly and intentionally publishes or distributes sexual material harmful to minors” must verify users are 18+ using a “reasonable age-verification method.”

And those verification services? They must be designed so they can’t retain user data.

If platforms screw up, they’re exposed to:

Civil lawsuits

Up to $10,000 per violation

Penalties under deceptive trade laws

2. Strict Written-Consent Requirements for All “Private Images”

Before publishing any “private image,” platforms need written, notarized consent from every person depicted—and those records have to be stored for five years.

3. Mandatory Warning Labels on Every Page

Not subtle ones either. We’re talking big, government-scripted warnings like:

“Pornography is potentially biologically addictive…”

“Pornography increases the demand for prostitution, child exploitation, and child pornography.”

4. A 10% Tax on Pornography Produced or Sold in Alabama

Section 10 slaps a 10% gross-receipts tax on memberships, subscriptions, and any material produced or sold in the state.

Why Platforms Are Responding by Blocking Alabama

If you’re a platform, you’re staring at:

High legal liability

Restrictions on data handling

Constant compliance demands

A tax on any content tied to the state

And lawsuit exposure for every alleged violation

At some point, it stops being a legal puzzle and starts being a cost-benefit analysis. And Alabama isn’t worth the math.

North Carolina’s HB805: Extremely Broad “Pornographic Image” Verification Rules


HB805’s adult-content provisions take effect December 1, 2025, and while the bill covers everything from school libraries to “biological sex” definitions, the part that matters to creators is Article 51A.

This isn’t just strict; it’s procedural overkill.

1. Age and Consent Documentation for Every Person in Every Pornographic Image

Before publishing a pornographic image, platforms must verify:

The person was 18 at the time of creation

Written consent for each sex act performed

Written consent specifically for distribution

And, crucially: consent for performance does not equal consent to distribute

Platforms must collect:

A full consent form with personal details

A matching government ID

2. Mandatory Removal System With 72-Hour Deadlines

If a performer requests removal, platforms must comply within 72 hours—even if consent was properly documented.

If consent is questioned, content must be pulled down temporarily.

Re-uploads? Permanently banned.

3. Massive Civil Penalties

The Attorney General can impose:

Up to $10,000 per day per image for failure to remove

Up to $5,000 per day for publishing violations

Performers can also sue for $10,000 per day per image.

Why Platforms Are Banning North Carolina Content

HB805 basically forces platforms to:

Re-document performers from NC

Handle disputes more aggressively

Maintain permanent blocks on re-uploads

Maintain 1:1 traceable consent for every act in every piece of content

That’s not a tweak—it’s an entirely new compliance department.

You may also notice the bans include things like:

Talent living in those states

Talent whose IDs originate from those states

Content filmed in those states

This is because the laws follow the people and the production location—not just where the content is uploaded. That means:

An NC resident filmed in Las Vegas? Still a risk.

A performer who moved out of Alabama but still has an AL ID? Risk.

A scene shot in Alabama and uploaded from New York? Still covered.

The jurisdiction sticks like glue.

Adult platforms aren’t banning performers because they suddenly want to. They’re doing it because Alabama and North Carolina have created legal terrains where one clerical oversight could turn into six-figure penalties.

Alabama’s HB164 demands notarized consent, strict age verification, no data retention, warning labels, and a 10% tax.

North Carolina’s HB805 requires different consents for each act, ID verification, rapid takedowns, and crushing per-day fines.

Faced with that, some companies are choosing the path of least resistance: eliminating content tied to those states entirely. Will others follow? Probably. Not because they want to—because compliance costs more than creators do.

The laws don’t just restrict porn; they quietly redraw who gets to participate in the industry at all.


xHamster Ends Texas AV Lawsuit With $120K Settlement


Sometimes a legal fight doesn’t end with a dramatic ruling—just a quiet deal and a check. That’s what happened in Texas, where Hammy Media, the company behind xHamster, agreed to settle a lawsuit over alleged violations of the state’s age verification law with a $120,000 payment.

Texas Attorney General Ken Paxton launched the suit in 2024. The complaint painted the site’s early verification screen as little more than a digital speed bump, arguing, “Minors can simply click almost anywhere on the webpage away from the ‘I’m 18 or older’ button, including the ‘X’ in the top right corner of the message, to dismiss the pop-up message and proceed to the Defendant’s pornographic website … The age verification methods used by the Defendant on its websites cannot be said to verify anything at all.”

The state didn’t start small. Texas initially asked a district court to impose penalties of up to $1.67 million, plus another $10,000 for every day after the filing date—a financial threat large enough to make most companies blink.

Those cases stalled for a while as everyone waited for the U.S. Supreme Court to decide whether these types of laws even hold up under the Constitution. The case—FSC v. Paxton, brought by the Free Speech Coalition—became the legal bellwether. In June 2025, the court sided with Texas, declaring the law constitutional and effectively giving other states a green light to move forward with similar efforts. Once that happened, dormant lawsuits snapped back to life.

According to the agreed final order filed Nov. 7, the company made changes quickly. “Promptly after suit was filed, on March 21, 2024, Hammy Media restricted access to its website,” and it has now rolled out the kind of age verification Texas requires. The order also “resolves any and all claims based on the facts alleged in the State’s Petition” and specifies that the settlement isn’t an admission of wrongdoing—just a resolution.

Texas didn’t stop at xHamster. The state filed similar lawsuits in 2024 against Multi Media, the company behind Chaturbate, and Aylo, which operates Pornhub. Chaturbate settled in April; the Aylo case is still moving through the courts.


NetChoice Pushes Back, Suing Virginia Over Youth Social-Media Rules


There’s something strangely jarring about a state telling people how long they’re allowed to look at a screen. Maybe it’s because most of us grew up sneaking extra time on whatever device we had — the old family computer, a flip phone under the covers, whatever — and now here we are, watching lawmakers try to ration screen time like emergency supplies during a storm.

That’s the backdrop for a new fight in Virginia, where a major tech trade group just hit the state with a lawsuit over its sweeping new social-media law. NetChoice — the group that represents some of the biggest names in the digital world — filed its challenge in federal court in Alexandria, arguing that the state has stepped way, way over the line with Senate Bill 854. The suit names outgoing attorney general Jason Miyares and doesn’t pull punches about what’s at stake: the First Amendment rights of both adults and minors.

“Virginia must leave the parenting decisions where they belong: with parents,” said Paul Taske, co-director of NetChoice’s Litigation Center. “By asserting that authority for itself, Virginia not only violates its citizens’ rights to free speech but also exposes them to increased risk of privacy and security breaches.”

And then he doubled down: “We look forward to defending Virginians’ First Amendment rights in court.”

The law they’re talking about was signed back in May by Gov. Glenn Youngkin — who won’t be around when it actually kicks in on January 1, 2026. And once it does, things change fast. Anyone in Virginia who wants to access protected speech online has to verify their age. Adults, minors, parents — everyone gets checked at the door. No ID, no entry.

But the age-checks are only half the story. There’s also a hard, government-imposed time limit: one hour per day for anyone under 16. If an adult wants more time? They have to confirm that they’re the ones asking for it. It’s like a digital permission slip from the state.

Taske didn’t mince words about the absurdity he sees in that: “Virginia’s government cannot force you to read a book in one-hour chunks, and it cannot force you to watch a movie or documentary in state-preferred increments. That does not change when the speech in question happens online.”

Then there’s the kicker — the law also bans cellphones and mobile devices in schools. Not just a tweak, not a pilot program, but an across-the-board prohibition framed as a public-health move to protect kids. It’s sweeping, dramatic, and almost guaranteed to reshape daily life for families if it survives the courts.

NetChoice’s membership list reads like a who’s who of the digital universe: Meta Platforms, Netflix, Google, X, Etsy. The people building the platforms most of us touch every day. And they’re staring down a law that tries to regulate not just what people can see online, but how long they’re allowed to see it.

The whole thing feels like one of those moments where technology and policy collide in a way that makes you stop and wonder which part of the future we’re actually building — and who gets to hold the timer.


Porn and Politicians: Still Reliable Clickbait by Stan Q. Brick


And here I thought being outraged by the things politicians say and do had become passé. Apparently not, if your transgressions include (please, those with delicate sensibilities, cover your eyes) following OnlyFans models and escorts on social media.

“‘Devout Christian’ Dem caught following prostitutes, OnlyFans models on social media,” proclaimed the New York Post, which, as a publication that backs Donald Trump, clearly demands a higher standard of social media decorum than following an OnlyFans model. As I’m sure the discerning editors of the Post would tell him, James Talarico should stick to more family-friendly online activities, like sharing videos of well-informed patriots who can fill us in on important, well-established facts, such as the claim that Osama bin Laden is still alive.

Naturally, Talarico’s publicity flacks had to deny any meaningful personal interest in the eyebrow-raising follows on the part of the good would-be Senator. After his timeline became news, they quickly fired off a statement explaining that the campaign’s “social media team – including James – follows back and engages with supporters who have large followings and does not investigate their backgrounds.”

To be fair, that explanation is plausible enough, so far as these things go. I can’t help but wonder, though: is part of the reason some of our elected officials are so inclined to support laws restricting and regulating all manner of sex-related things a need to distance themselves from their own desires? Or, maybe more to the point, a need to be seen as standing against certain “immoral behaviors,” regardless of whether they truly oppose those behaviors?

If you’re James Talarico, I suppose you must put out a statement like the semi-denial offered by his team. Your name isn’t Donald Trump, so you can’t simply say “Those social media follows were planted on my timeline by the Deep State” and expect half the country to believe you.

And I suppose if you’re the New York Post, you can’t go around taking the position that it doesn’t matter if some Democrat from Texas likes and/or follows an OnlyFans model, just because the guy you endorsed for President paid hush money to a porn star. The same can be said for the rest of the media that seized on the story; they all have bills to pay, and sex-related scandals are reliable eyeball magnets.

But would the world (or the truth) truly suffer if we were to give a story like Talarico’s little social media snafu the sort of mundane headline it arguably deserves? Would people miss the momentary rush of self-righteous glee that accompanies such a story if we crowned it with something like: “Semi-Famous Texan Likes Pictures of Attractive Women”? Or how about “Guy Who Believes in God Sometimes Also Thinks About Sex”? Or perhaps “Stunner: Would-Be Senator Has Actual Blood in Veins”?

Either way, maybe James Talarico should look at the bright side: at least he isn’t a politician in the UK who was, say, following someone who made porn that depicts strangulation, or he could have much bigger problems on his hands.


Pornhub, Stripchat Challenge EU’s VLOP Tag, Calling the Data Faulty


There’s something strangely surreal about watching two adult platforms square off against one of the most powerful regulatory bodies in Europe. It’s like seeing the quiet kid in class suddenly challenge the teacher on the grading rubric — bold, a little chaotic, and honestly pretty fascinating. That’s what played out this week in Luxembourg, where attorneys for Pornhub and Stripchat told the EU’s General Court that the European Commission misjudged them based on shaky data.

Both cases — Aylo Freesites v. Commission for Pornhub and Technius v. Commission for Stripchat — orbit around the same issue: whether these sites really belong in the Digital Services Act’s “very large online platform” category, the VLOP bucket reserved for players with at least 45 million monthly EU users. It’s a label that comes with heavy regulatory weight, not to mention a fee structure that can make a CFO sweat.

But on Friday, according to reporting from MLex, something interesting happened. Christopher Thomas, Aylo’s lawyer, basically asked the court the digital equivalent of: Why are you trusting the substitute teacher but not the person who actually runs the classroom? He questioned why the commission brushed off Pornhub’s own methodology as noncompliant with DSA rules — yet readily accepted Similarweb’s data, even though no one seemed to know “what underlying data Similarweb used, or the maths applied.”

Pornhub’s internal numbers reportedly fell below the VLOP threshold. Similarweb’s outside numbers? Above it.

A small difference in theory, a massive difference in regulatory reality.

Thomas drove the point home with a simple comparison:

“If a provider said they had purchased an estimate below 45 million, but didn’t know the data or methodology on which that was based and couldn’t explain it to the commission, it would obviously be unacceptable.”

It’s one of those arguments that’s hard to un-hear once it’s been said.

From the commission’s side, attorney Paul-John Loewenthal offered something that sounded like a quiet confession about the digital universe we all live in. He acknowledged that user counts are, at the end of the day, approximations. There is “no golden method to calculate user numbers; it’s not possible.”

And with that, the case hung in the air — unresolved, unsettled, and oddly revealing.

Stripchat’s situation wasn’t far off. Their VLOP label also came from Similarweb data, which originally pinned the site above the 45-million mark. Then came the twist: Similarweb later revised the estimate downward, dropping Stripchat below the threshold entirely.

On Thursday, Stripchat operator Technius asked the court to go back and undo the European Commission’s 2023 ruling that labeled them a VLOP in the first place, not just reversing it going forward but scrapping it retroactively. And even though the commission already revoked the VLOP status earlier this year, Technius still has reasons to chase a clean erasure.

According to MLex, attorney Tobias Bosch explained that a retroactive annulment could do more than tidy the record. It could help Stripchat recover the supervisory fee it paid. It could shift the jurisdiction over who gets to police them. And it could influence an active investigation into whether the platform violated the DSA in the past.

It’s wild how much hinges on something as mundane-sounding as user-count methodology. But maybe that’s the real tension here — in a world obsessed with data, the people interpreting the numbers often have more power than the numbers themselves.


Open The Age Verification Floodgates!

Ben Suroeste opines on a new age verification service. Here’s a summary:

Brady Mills Agency just rolled out AgeWallet, their shiny new age-verification system they say will help small adult-sector merchants survive the avalanche of new compliance rules. They’re leaning hard on the idea that the Supreme Court’s FSC v. Paxton ruling kicked the industry into fast-forward, and honestly, they’re not wrong—tools like this are going to keep popping up like mushrooms after rain.

What stands out, though, is the quieter warning baked into the announcement: the beginning of a serious crackdown on VPNs. AgeWallet claims it can detect proxies, masked locations, all of it—and force users to re-verify if anything looks off. Great for merchants trying to stay legal. Not so great if you’re living under a government that already decides what you’re allowed to read or watch. For those of us who remember the scrappy, boundary-free internet, this feels less like progress and more like another brick in the wall.

 


Safe Bet: Soon, They’ll Try to Ban VPN Use by Stan Q. Brick


Over on Forbes.com right now, there’s an article making the point that when you read somewhere that traffic from the UK to Pornhub is down 77%, you might want to take that figure with a grain of salt. Or maybe a pillar of the stuff.

Writing for Forbes, Zak Doffman goes further still, suggesting “you can completely ignore” such a claim because “it’s not true.”

“What’s actually happening is that U.K. adults are turning to VPNs to mask their locations,” Doffman writes. “Just as residents of U.S. states affecting bans now pretend to be someplace else. Pornhub makes this as easy as possible.”

The article goes on to cite (perhaps accurately – I’m certainly no expert on VPNs) a variety of reasons why this sudden expansion in VPN use may not be a good thing, including the eye-catching assertion that “VPNs are dangerous.”

“You are trusting all your content to a third-party provider who can see where you are and the websites you visit,” Doffman writes. “At a minimum. There are plenty of reports of rogue VPNs doing much worse than that. In particular, you must avoid free VPNs and Chinese VPNs. Stick to bluechip options.”

Doffman is probably right, and his advice about sticking to name-brand VPNs makes good sense. But as a guy who misses the era of what people call the “open internet,” my concern isn’t so much rogue VPN operators as it is rogue legislators.

As I read Doffman’s piece, I couldn’t help but imagine some elected official somewhere reading the same piece and saying to himself or herself: “OH. MY. GOD. This VPN thing MUST be stopped, whatever it is.” The manner of legislation that follows this sort of epiphany typically tries to solve one problem by creating another. Or maybe several others.

The thing is, it’s not Doffman’s warning about the potential dangers of VPN use that will drive my hypothetical legislator’s concern, nor the security threats, nor the nefarious actors out there offering free VPNs.

No, what will get the legislators all fired up and ready to wield their pens again will be the part about the ease of using VPNs to get around their precious, legally mandated age verification walls.

I don’t expect too many legislators will seek to ban VPN use altogether, although doubtless there will be some bright bulb somewhere who proposes exactly that. More likely, they’ll add something to an existing age-verification statute that prohibits adult sites from “facilitating the use of technology to circumvent state law,” or that mandates adult sites do what a lot of paywalled sites already do for their own reasons: try to detect and defeat VPN use.

As Doffman notes, websites can “look at your browser settings or cellular settings or recognize you from previous visits…. That’s why it’s harder to watch live sports from your usual provider when you’re away from home, their market restrictions try to catch you out. Porn sites do not.”
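For what it’s worth, the detection Doffman describes doesn’t require anything exotic. Here is a minimal, purely illustrative sketch of one common signal: comparing the timezone a visitor’s browser reports against the timezone implied by the connecting IP address. The GeoIP lookup below is a stand-in, not a real library call, and the IP and values are made up for the example.

```python
# Hypothetical sketch of a crude VPN/geolocation mismatch check of the
# kind paywalled sites already run. ip_to_timezone() is a placeholder
# for whatever licensed GeoIP database a site would actually query.

def ip_to_timezone(ip_address: str) -> str:
    """Placeholder GeoIP lookup; returns a fixed value for this sketch."""
    return "Europe/London"

def looks_like_vpn(ip_address: str, browser_timezone: str) -> bool:
    """Flag a session when the IP's apparent region and the timezone the
    browser reports (e.g. via JavaScript) disagree."""
    return ip_to_timezone(ip_address) != browser_timezone

# Example: a UK-geolocated IP paired with a browser set to a US timezone
print(looks_like_vpn("203.0.113.7", "America/New_York"))  # True
```

Real deployments layer several such signals (known data-center IP ranges, repeat-visitor fingerprints, and so on), which is exactly the kind of busywork legislators may eventually demand of adult sites.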

For the sake of adults in the UK and elsewhere who would rather not hand over their sensitive personal information to a third party just to exercise their right to look at sexually explicit images, here’s hoping porn sites aren’t soon forced to do what they’re currently choosing not to do.


AI Porn Triggers Some Very Tricky Debates by Morley Safeword


There’s been a lot of discussion of AI-generated porn lately, particularly in the days since OpenAI announced that starting in December, the firm would allow “mature content” to be generated by ChatGPT users who have verified their age on the platform. Understandably, much of that discussion has centered on consent—or the lack of such—in the context of AI content generation, given the proliferation of “deepfake” content in recent years.

Concern over publicly available images being used to create AI porn without the consent of the people being depicted is also driving legislative bodies everywhere to consider passing new laws that specifically forbid the practice. In South Dakota, for example, Attorney General Marty Jackley wants the legislature to craft a new law making it a felony to create AI-generated porn from an image of a non-consenting adult, which would mirror a law passed in the state last year making it a crime to do so using images of a minor.

You can certainly understand why this sort of law appeals to people, even if there are some potentially tricky First Amendment questions raised by such a prohibition. I don’t think any of us like the idea of someone grabbing our old yearbook photos and creating ‘porn doubles’ of us to be distributed willy-nilly on the internet. But that very understandable and sensible concern doesn’t make the potential First Amendment questions magically disappear.

For one, if it’s not possible to make it illegal to create, say, a painting of a public figure without that person’s permission (and it isn’t), can it be made illegal to use AI to create an image of that same person? If it’s OK to create a non-pornographic image of that person, can a pornographic image of them be illegal only if it is also considered legally “obscene”?

While a lot of the questions around AI porn pertain to its potential for abuse, there’s a flipside to it, as well. For example, if one’s primary objection to the creation of pornography is rooted in its impact on the performers—the risks to their health and safety, the oft-cited potential for human trafficking being involved, etc.—then isn’t it better if the only “actors” involved are entirely digital beings?

On the other hand, if you’re someone who creates adult content, especially in a performing capacity, the prospect of being replaced by a competitor who doesn’t need to travel, sleep, undergo STD screening or pay any bills is a frightening one, I should think—particularly if there’s no legal mechanism preventing unscrupulous third parties from profiting by effectively pirating your very likeness. Getting replaced in a job by anyone sucks; just imagine what it would be like to get replaced by a counterfeit of yourself!

To sort all this out and craft effective legislation and regulation of AI porn is going to take a lot of careful, deliberate, rational thought. Unfortunately, I’m not sure there’s a lot of that to be found within the halls of Congress or any other legislative body. So, in all likelihood, states around the country and countries around the world will continue to struggle to get their heads wrapped around AI porn (and AI more generally) the same way they’ve struggled with the internet itself for the last several decades.

In the meantime, the rest of us will try to muddle through, as best we can. Personally, I have no plans to either create or consume AI porn… but will I even know I’m doing so, if it happens?

Add that to the list of thorny questions, I suppose.
