The War on Porn

Safe Bet: Soon, They’ll Try to Ban VPN Use by Stan Q. Brick


Over on Forbes.com right now, there’s an article making the point that when you read somewhere that traffic from the UK to Pornhub is down 77%, you might want to take that figure with a grain of salt. Or maybe a pillar of the stuff.

Writing for Forbes, Zak Doffman goes further still, suggesting “you can completely ignore” such a claim because “it’s not true.”

“What’s actually happening is that U.K. adults are turning to VPNs to mask their locations,” Doffman writes. “Just as residents of U.S. states affecting bans now pretend to be someplace else. Pornhub makes this as easy as possible.”

The article goes on to cite (perhaps accurately – I’m certainly no expert on VPNs) a variety of reasons why this sudden expansion in VPN use may not be a good thing, including the eye-catching assertion that “VPNs are dangerous.”

“You are trusting all your content to a third-party provider who can see where you are and the websites you visit,” Doffman writes. “At a minimum. There are plenty of reports of rogue VPNs doing much worse than that. In particular, you must avoid free VPNs and Chinese VPNs. Stick to bluechip options.”

Doffman is probably right, and his advice about sticking to the name-brand VPNs probably makes good sense. But as a guy who misses the era of what people call the “open internet,” my concern isn’t so much rogue VPN operators as it is rogue legislators.

As I read Doffman’s piece, I couldn’t help but imagine some elected official somewhere reading the same piece and saying to himself/herself: “OH. MY. GOD. This VPN thing MUST be stopped, whatever it is.” The manner of legislation that follows this sort of epiphany typically tries to solve one problem by creating another. Or maybe several others.

The thing is, it’s not Doffman’s warning about the potential dangers of VPN use that will drive the concern of my hypothetical legislator: not the potential security threat, and not the nefarious actors out there offering free VPNs.

No, what will get the legislators all fired up and ready to wield their pens again will be the part about the ease of using VPNs to get around their precious, legally mandated age verification walls.

I don’t expect too many legislators will seek to ban VPN use altogether, although doubtless there will be some bright bulb somewhere who proposes exactly that. More likely, they’ll add something to an existing age verification statute that prohibits adult sites from “facilitating the use of technology to circumvent state law,” or mandates that adult sites do what a lot of paywalled sites already do for their own reasons: try to detect and defeat VPN use.

As Doffman notes, websites can “look at your browser settings or cellular settings or recognize you from previous visits…. That’s why it’s harder to watch live sports from your usual provider when you’re away from home, their market restrictions try to catch you out. Porn sites do not.”
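To make that concrete, here is a minimal, hypothetical sketch (in Python) of the kind of consistency check Doffman is describing: comparing the country an IP address geolocates to against the timezone the visitor’s browser reports. The lookup table and inputs are invented stand-ins for illustration, not a real geolocation service, and real sites would combine many such signals rather than rely on one.

```python
# A toy VPN-detection heuristic: flag a visit when the browser's
# reported timezone is implausible for the country the IP address
# geolocates to. The table below is an invented stand-in for a real
# geolocation service; a mismatch is a signal, not proof, of VPN use.

COUNTRY_TIMEZONES = {
    "US": {"America/New_York", "America/Chicago",
           "America/Denver", "America/Los_Angeles"},
    "GB": {"Europe/London"},
    "NL": {"Europe/Amsterdam"},  # a common VPN exit location
}

def looks_like_vpn(ip_country: str, browser_timezone: str) -> bool:
    """Return True when the timezone and IP-derived country disagree."""
    plausible = COUNTRY_TIMEZONES.get(ip_country, set())
    return browser_timezone not in plausible

# A UK browser arriving from a Dutch datacenter IP looks suspicious;
# the same browser on a UK IP does not.
print(looks_like_vpn("NL", "Europe/London"))  # True
print(looks_like_vpn("GB", "Europe/London"))  # False
```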

For the sake of adults in the UK and elsewhere who would rather not hand over their sensitive personal information to a third party just to exercise their right to look at sexually explicit images, here’s hoping porn sites aren’t soon forced to do what they’re currently choosing not to do.


AI Porn Triggers Some Very Tricky Debates by Morley Safeword


There’s been a lot of discussion of AI-generated porn lately, particularly in the days since OpenAI announced that starting in December, the firm would allow “mature content” to be generated by ChatGPT users who have verified their age on the platform. Understandably, much of that discussion has centered on consent—or the lack thereof—in the context of AI content generation, given the proliferation of “deepfake” content in recent years.

Concern over publicly available images being used to create AI porn without the consent of the people being depicted is also driving legislative bodies everywhere to consider passing new laws that specifically forbid the practice. In South Dakota, for example, Attorney General Marty Jackley wants the legislature to craft a new law making it a felony to create AI-generated porn from an image of a non-consenting adult, which would mirror a law passed in the state last year making it a crime to do so using images of a minor.

You can certainly understand why this sort of law appeals to people, even if there are some potentially tricky First Amendment questions raised by such a prohibition. I don’t think any of us like the idea of someone grabbing our old yearbook photos and creating ‘porn doubles’ of us to be distributed willy-nilly on the internet. But that very understandable and sensible concern doesn’t make the potential First Amendment questions magically disappear.

For one, if it’s not possible to make it illegal to create, say, a painting of a public figure without that person’s permission (and it isn’t), can it be made illegal to use AI to create an image of that same person? If it’s OK to create a non-pornographic image of that person, can a pornographic image of them be illegal only if it is also considered legally “obscene”?

While a lot of the questions around AI porn pertain to its potential for abuse, there’s a flipside to it, as well. For example, if one’s primary objection to the creation of pornography is rooted in its impact on the performers—the risks to their health and safety, the oft-cited potential for human trafficking being involved, etc.—then isn’t it better if the only “actors” involved are entirely digital beings?

On the other hand, if you’re someone who creates adult content, particularly in a performing capacity, the prospect of being replaced by a competitor who doesn’t need to travel, sleep, undergo STD screening or pay any bills is a frightening one, I should think—particularly if there’s no legal mechanism preventing unscrupulous third parties from profiting by effectively pirating your very likeness. Getting replaced in a job by anyone sucks; just imagine what it would be like to get replaced by a counterfeit of yourself!

Sorting all this out and crafting effective legislation and regulation of AI porn is going to take a lot of careful, deliberate, rational thought. Unfortunately, I’m not sure there’s a lot of that to be found within the halls of Congress or any other legislative body. So, in all likelihood, states around the country and countries around the world will continue to struggle to get their heads wrapped around AI porn (and AI more generally) the same way they’ve struggled with the internet itself for the last several decades.

In the meantime, the rest of us will try to muddle through, as best we can. Personally, I have no plans to either create or consume AI porn… but will I even know I’m doing so, if it happens?

Add that to the list of thorny questions, I suppose.


Mandatory Age Verification Is Creating a New Security Crisis by John Johnson – Cybersecurity Expert


There’s a quiet rule that’s floated around cybersecurity circles for years: don’t hold onto more data than you’re capable of protecting. Simple, elegant, almost parental in its logic — if you can’t safeguard it, don’t collect it.

But the world doesn’t care about that rule anymore.

Laws around identity and age verification are spreading fast, and they’re forcing companies—whether they’re ready or not—to gather and store the most intimate, high-risk documents a person can hand over. Passports. Driver’s licenses. National IDs. All the things you’d rather keep in your own pocket, not scattered across the servers of whoever happens to run the website you’re using.

And then something like the Discord breach happens.

In early October 2025, Discord disclosed a data breach. Not of Discord’s internal systems—of one of the partners handling its customer support. Hackers got access to support-ticket data: names, emails, IP addresses, billing info, conversation logs… the usual mess. But tucked inside that mess was something far more sensitive: government-issued IDs.

These were collected for one reason: to prove a user was old enough to be there when appealing an underage ban. And suddenly, the private documents people reluctantly handed over “just to get their account back” were sitting in someone else’s hands entirely.

The Trap These Laws Create

Discord didn’t wake up one day deciding it wanted a folder full of driver’s licenses. Companies aren’t hungry for that kind of liability. But regulators have been ramping up age-verification mandates, and the penalties for non-compliance are steep enough to make anyone comply.

You can see the logic in the laws. Protect kids. Keep platforms accountable. Reasonable goals.

But look closely at the side effects:

We’ve built a system where organizations must stockpile some of the most breach-sensitive personal data in existence — even when they have no business storing it, no infrastructure built to protect it, and no desire to be holding it at all.

The old rule of “collect as little as possible” dies the moment a legal mandate requires collecting everything.
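For what it’s worth, there is a well-known engineering pattern that tries to honor the old rule even under a verification mandate: check the document, persist only the verdict, and never store the document itself. Here’s a minimal Python sketch of the idea; the parsing step, salt, and record fields are invented placeholders, not any real verification API.

```python
import hashlib
from datetime import datetime, timezone

MINIMUM_AGE = 18

def extract_birth_year(id_document: bytes) -> int:
    """Placeholder for a real document-parsing/attestation step.
    In this toy format, the 'document' is just a year as text."""
    return int(id_document.decode())

def verify_and_discard(user_id: str, id_document: bytes) -> dict:
    """Check the ID, then keep only the outcome -- never the scan."""
    birth_year = extract_birth_year(id_document)
    # Toy age check; a real one would compare full birth dates.
    is_adult = datetime.now(timezone.utc).year - birth_year >= MINIMUM_AGE
    # Persist a salted hash of the user id plus the verdict. If this
    # record ever leaks, there is no passport scan attached to it.
    return {
        "subject": hashlib.sha256(b"per-site-salt" + user_id.encode()).hexdigest(),
        "verified_adult": is_adult,
        "checked_at": datetime.now(timezone.utc).isoformat(),
    }

print(verify_and_discard("user-42", b"1990"))
```

Whether a given statute actually permits discarding the document is a legal question, not an engineering one, which is exactly the trap this article describes.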

One Breach Becomes Everyone’s Problem

And once a company becomes responsible for storing IDs, the risk spreads. Healthcare portals, schools, banks, e-commerce shops, SaaS platforms — anyone providing service to the general public could end up in the same situation.

Every new database of passport scans is a future headline waiting to happen.

And when it happens, the fallout isn’t just personal. It’s financial. Legal. Reputational. You lose customer trust once — and you don’t get it back.

For small companies, one breach can simply end the business.

The MSPs Get Pulled Into the Storm

Managed service providers—MSPs—don’t get to sit this one out. They inherit the problem from every client they support. One MSP breach doesn’t just hit one organization. It hits all of them at the same time.

And the typical MSP environment? It’s a patchwork quilt of tools stitched together over time:

  • One for backups

  • One for endpoint protection

  • Another for vulnerability scanning

  • A different one for patching

  • Another for monitoring

  • And maybe one more to try and tie it all together

Every tool is another doorway. Another password. Another integration that can fail silently. Another shadow corner where data can slip unencrypted or unmonitored.

In an age when MSPs are being asked to guard government IDs, medical files, financial records, and entire networks—you can’t afford those shadows.

The Fix Isn’t “More Tools” — It’s Fewer

The only real path forward is simplification.

Not by removing security controls, but by merging them. Consolidation. Native integration. One platform where backup, protection, monitoring, and recovery exist inside the same ecosystem, speaking the same language, managed from the same place.

When everything runs through a single agent with one control plane:

  • There are fewer gaps.

  • There are fewer weak handoffs.

  • There are fewer places for attackers to slip in unnoticed.

  • And the attack surface shrinks dramatically.

You trade chaos for clarity.

You trade complexity for protection.

The New Reality

That old cybersecurity rule—don’t collect more data than you can protect—wasn’t wrong. It’s just not optional anymore.

The Discord breach isn’t a one-off story. It’s a preview. A warning shot.

Organizations are being legally pushed into storing the exact type of data that attracts attackers the most. And MSPs are being put in charge of securing it at scale.

So the question shifts:

If you no longer get to choose how much data you collect…

you have to be very deliberate about how you protect it.

And that means rethinking the entire structure of how we secure systems—not by addition, but by alignment.

Because now the stakes aren’t abstract. They are literal: your identity, my identity, everyone’s identity.

And someone is always watching for the first loose thread.


How to Stay Legally Protected When Policies Get Outdated

Adult Attorney Corey Silverstein talks about how to stay legally protected as an adult website owner. Here’s a summary of the article:

It feels like the adult industry just hit a hard reset. Age verification laws are no longer theoretical — they’re real, enforced, and expensive to ignore. And because every region wants something slightly different, the once-standard “one policy fits everywhere” approach is basically dead. If a site can’t explain exactly how it keeps minors out, it’s already behind.

At the same time, regulators and payment processors are demanding proof that every bit of content is consensual and monitored. The Aylo case didn’t accuse anyone of new wrongdoing, but it sent a clear message: it’s not enough to say you have safeguards — you need documentation, systems, records, and the ability to show them working. Old blanket model releases aren’t enough anymore. Consent now has to be specific, traceable, and ongoing.

And hanging over all of this is data privacy — the silent threat that can shut a company down overnight. GDPR and CPRA require clear deletion rights, consent controls, and minimal data collection. Most adult sites still haven’t updated. The takeaway is simple: the old shortcuts are now risks. The companies that survive will be the ones who update before they’re forced to — not after.


Big Tech Pushes Back Against New Colorado Rules


There’s something almost surreal about the idea of a phone interrupting a teenager’s late-night scrolling to say, Hey, maybe this isn’t great for your brain. That’s what Colorado’s new law is aiming for: gentle on-screen nudges when minors have been on a platform for more than an hour, or when they’re using it between 10 p.m. and 6 a.m. The warnings would have to reference research on brain development and repeat every thirty minutes. It’s the kind of thing that could become background noise — or maybe it could give someone a moment to pause. Hard to say. The provision is supposed to roll out on January 1.
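Mechanically, the mandate reads almost like a spec. Here is a toy Python sketch of the timing logic as described above for a minor’s session (an hour of continuous use, or the 10 p.m. to 6 a.m. window, repeating every thirty minutes); the function and field names are invented for illustration, since the statute specifies behavior, not an API.

```python
from datetime import datetime, time, timedelta

NIGHT_START, NIGHT_END = time(22, 0), time(6, 0)  # 10 p.m. - 6 a.m.

def nudge_due(session_start: datetime, now: datetime,
              last_nudge: datetime | None) -> bool:
    """Toy model of the warning trigger described in the law."""
    in_night_window = now.time() >= NIGHT_START or now.time() < NIGHT_END
    long_session = now - session_start >= timedelta(hours=1)
    if not (in_night_window or long_session):
        return False
    # Warnings repeat every thirty minutes, not continuously.
    return last_nudge is None or now - last_nudge >= timedelta(minutes=30)

start = datetime(2026, 1, 1, 21, 30)
print(nudge_due(start, datetime(2026, 1, 1, 22, 5), None))   # True: night window
print(nudge_due(start, datetime(2026, 1, 1, 22, 20),
                datetime(2026, 1, 1, 22, 5)))                # False: too soon
```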

But before any pop-ups appear, the law is already tied up in court. NetChoice, a group representing major social media platforms, has filed a lawsuit arguing that Colorado can’t require companies to deliver the government’s message on their own services. Their point is that social media isn’t just a product — it’s a place where communication happens, where speech happens — and the First Amendment doesn’t let the government compel speech, even in the name of public health. They note that other media industries have voluntary systems for guidance — movie ratings, for example — and that forcing platforms to present specific warnings crosses constitutional lines.

Supporters of the law say the goal is simply to help protect kids, especially given widespread concern about youth mental health. They argue that many parents want something in place to help guard against endless scrolling or disrupted sleep. Opponents respond that the science around social media’s effects is still developing — not nonexistent, but varied, with impacts that differ depending on the individual. Some researchers point to small but measurable effects; others note that social media can be a source of support, identity, creativity, or connection. So the debate becomes less about whether social media is “good” or “bad” and more about who gets to decide how platforms communicate risk — the state, the companies, or families themselves.

Similar legal battles are unfolding in other states, and courts haven’t landed on one clear answer yet. Some laws have been paused, others allowed to proceed while the challenges continue. Colorado’s case will likely turn on some very old constitutional questions — about speech, autonomy, and the limits of state power — applied to a very modern situation. And maybe that’s what makes this moment feel so unsettled: the technology is new, the stakes feel personal, and the rules are still being written.


Why is Ofcom trying to censor Americans?

Spiked’s Adam Edwards opines on the Online Safety Act in the UK.

The story centers on U.S. lawyer Preston Byrne, who represents the message board 4chan and is openly defying the UK’s Online Safety Act. When the UK regulator Ofcom issued 4chan a £20,000 fine, Byrne publicly mocked the demand and argued that British law has no legal power over companies and citizens who have no operations or assets in the UK. He views the Online Safety Act as an overreaching censorship regime and says Ofcom is trying to enforce rules outside its jurisdiction by sending threatening letters instead of going through proper international legal channels.

The Online Safety Act requires any online platform accessed by UK users—regardless of where the company is based—to submit risk assessments, reports, and censorship plans, under threat of fines or even jail for executives. While 4chan has refused to comply and likely faces no real consequences because it has no UK presence, larger American companies like Meta and Google do have substantial assets in Britain, making potential enforcement far more serious. This has sparked broader questions about sovereignty, free speech, and whether a foreign government can compel U.S. companies to restrict or monitor content.

To counter the UK’s moves, Byrne has launched both a legal challenge in U.S. federal court and proposed new U.S. legislation called the GRANITE Act, which would allow American companies to sue foreign regulators like Ofcom if they attempt to impose fines or censorship demands. If passed, it could effectively block foreign censorship attempts and even allow U.S. courts to seize foreign government assets in retaliation. Byrne argues that if the UK cannot force U.S. firms to comply, British lawmakers may eventually be forced to reconsider the Online Safety Act altogether.


Appeals Court Clears Tennessee to Begin Enforcing Age Verification Law


There are court decisions that land with a dull thud, and then there are the ones that feel like a door has quietly been locked behind you. This week’s move from the U.S. Sixth Circuit Court of Appeals sits firmly in the second category.

A three-judge panel just wiped away a lower court’s injunction that had been blocking Tennessee from enforcing its age verification law for adult sites. They didn’t do it with loud fanfare. They simply pointed to a recent Supreme Court decision and said, essentially, their hands are tied. And for anyone who works in digital adult spaces — creators, viewers, small platform operators — the shift is significant.

The law in question sits under the Protect Tennessee Minors Act (PTMA), which requires commercial adult platforms to verify the age of anyone trying to access sexual content. Tennessee lawmakers didn’t write this in a vacuum — they modeled it after Texas’ HB 1181, the same law that’s been bouncing through courts for over a year now. And with the Supreme Court’s ruling in Free Speech Coalition et al. v. Paxton, Tennessee suddenly found a legal green light where, a week earlier, there’d been a flashing red.

The Sixth Circuit judges put it plainly: “[The] Supreme Court has upheld a Texas statute that the Court described as ‘materially similar’ to the one at issue here.”

That line is the hinge. The before and after.

Once the highest court signed off on Texas’ law, Tennessee simply adjusted its own definitions to match. The panel noted that “the state has also since amended the PTMA, the Attorney General says, to track ‘the definitional language at issue in Paxton almost word for word.’”

Meaning: Tennessee rewrote its law to be legally bulletproof — or at least bullet-resistant — by mirroring the one that survived at the highest level.

And that’s where things get complicated, even surreal, depending on who you ask.

Earlier versions of the PTMA had been criticized for being vague, especially in how the state defined “obscenity” and “harmful to minors.” Those categories aren’t just slippery — they’re historically weaponized. What counts as obscene in Nashville might pass without a blink in New York, or Berlin, or literally any corner of the internet where adults talk to each other freely. Lawmakers in Tennessee tightened those definitions after the Supreme Court’s decision, lifting language almost directly from the Texas statute that the justices allowed to stand.

The result? A clearer legal blueprint — but one that still treats the adult internet like a gated amusement park, complete with bouncers and ID scanners.

The Sixth Circuit didn’t rule that age verification is harmless or good policy. They just said that, for now, the district court’s injunction can’t stand while appeals continue. In legal speak, the fight isn’t over. In reality, the pressure just shifted to the people who will have to comply immediately.

And that’s where the human story sits:

  • The solo performer who runs her own website.

  • The queer couple documenting intimacy as art.

  • The niche platforms built by two founders and a rented server rack.

  • The teenager who grew up online and suddenly hits a digital checkpoint that assumes danger instead of curiosity.

There’s a narrative underneath these rulings that rarely fits into the court filings:

This isn’t just about sex. It’s about who gets to speak, who gets to control, and who gets to define what is “harmful.”

Tennessee made its move quietly, in the wake of a louder battle in Texas. But the message travels. Other states are already watching, already drafting, already preparing to photocopy the language line by line.

Sometimes the law doesn’t roar.

Sometimes it whispers.

But the effect can be the same.

And the internet is listening.


Free Speech Coalition Rolls Out New Age Verification Toolkit for Adult Platforms


Something’s always shifting in the world of compliance — especially when it comes to how adult sites handle age verification. The Free Speech Coalition (FSC), the legal advocacy nonprofit that often ends up doing the industry’s heavy lifting, just rolled out an updated version of its Compliance With U.S. Age Verification Laws: A Toolkit for Adult Websites.

The group explained on its website that the updates were necessary, given how quickly the legal ground keeps moving. What used to be “best practices” a few months ago can suddenly look outdated once new state laws or attorney general actions land on the table.

“Today, we are releasing an updated edition reflecting new legal developments and feedback from stakeholders who’ve put the toolkit into practice,” the post read.

And it’s not a light revision, either. “The key updates in this version include the final language and analysis of the Missouri age verification regulation taking effect November 30th and inclusion of recently-filed attorney-general actions, regulatory notices, and litigation related to age-verification laws,” FSC added.

That sense of urgency runs through every line of the update. “FSC’s guidance to our members continues to develop as state requirements and enforcement actions evolve,” said Alison Boden, executive director of the Free Speech Coalition. “Staying up-to-date is vital as companies make decisions about compliance.”

For those who actually need to keep their sites out of trouble (and their users protected), the new version of Compliance With U.S. Age Verification Laws: A Toolkit for Adult Websites is available for download at FreeSpeechCoalition.com/toolkit.


France Launches Investigation Into Shein, Temu, and AliExpress Over Youth Porn Exposure


It’s one thing for a brand to go viral for its prices — it’s another to land under government investigation. France’s Finance Minister, Roland Lescure, just put fast-fashion giant Shein on notice after a watchdog found “child-like” sex dolls being sold on the site. Reuters broke the story, and Lescure didn’t mince words: he called the products “illegal.”

He went even further. “For terrorist acts, drug trafficking, and child pornography, the government has the right to request banning access to the French market,” he said. The threat hit just as Shein opened its first permanent retail store — a glitzy shop in central Paris inside the historic BHV department store. It’s the kind of irony that writes itself.

But this isn’t just about one brand. French regulators are now investigating whether Shein — along with Temu, AliExpress, and Wish — has allowed minors to access pornographic content through their platforms. That’s not just scandalous; it’s illegal. The country’s age-verification laws are strict, and these platforms may have crossed the line.

France’s consumer watchdog, the Directorate-General for Competition, Consumer Affairs and Fraud Control, issued an advisory explaining that “the e-commerce site Shein was selling child-like sex dolls.” They didn’t sit on it — the listings were reported to a public prosecutor.

The agency added that “these activities have been reported to ARCOM, the competent regulatory body in this area, and, in agreement with the public prosecutor, a report has been filed with the platform, urging it to implement appropriate measures promptly.” In other words, this is no warning shot — it’s an official escalation.

Quentin Ruffat, Shein’s head of public affairs in France, tried to strike a cooperative tone when speaking to local radio, as reported by Reuters. He said the company was sharing information with investigators, including the names of vendors and buyers.

“We are in the process of sacking all the offending vendors from the platform,” Ruffat said. Meanwhile, Lescure confirmed he’d submitted a report to ARCOM, noting that Shein qualifies as a “very large online platform” under the European Union’s Digital Services Act — meaning, yes, it’s squarely in the regulators’ crosshairs.

AliExpress, another major e-commerce player, isn’t escaping scrutiny either. It’s being investigated for allegedly distributing pornographic images or depictions of minors — a charge that can lead to five years in prison.

It’s worth remembering that these platforms — Shein, AliExpress, Temu — are backed by massive Chinese corporations, while Wish belongs to a Singaporean parent company. They’ve built empires on accessibility and affordability. But as France is reminding them now, there’s a line between disruption and disregard — and crossing it can get very expensive.


Quick Look: The Status of Age Verification Laws Across the U.S. by Morley Safeword


As you’re likely aware, since you’re reading this site, in recent years there’s been a proliferation of new state laws across the United States that require adult websites to verify the age of users before displaying any content that may be deemed “harmful to minors” to those users.

After the recent Supreme Court decision in Free Speech Coalition v. Paxton, in which the court upheld the age verification mandate passed by the Texas legislature, similar laws in other states are now clearly enforceable. With Missouri’s law poised to take effect later this month, it’s a good time to remind ourselves of the states that have (and haven’t, yet) passed similar laws.

The states with active age verification mandates in place include Alabama, Arizona, Arkansas, Florida, Georgia, Idaho, Indiana, Kansas, Kentucky, Louisiana, Mississippi, Montana, Nebraska, North Carolina, North Dakota, Ohio, Oklahoma, South Carolina, South Dakota, Tennessee, Texas, Utah, Virginia and Wyoming. And as mentioned earlier, Missouri will soon join this list.

Quite a few states have not yet passed age verification laws, at least to date. Those states include Alaska, California, Colorado, Connecticut, Delaware, Hawaii, Illinois, Iowa, Maine, Maryland, Massachusetts, Michigan, Minnesota, Nevada, New Hampshire, New Jersey, New Mexico, New York, Oregon, Pennsylvania, Vermont, Washington, West Virginia and Wisconsin.

Several of the states on the list of those that haven’t passed age verification laws have considered such proposals in the past and may do so again in the future, of course. Doubtlessly, there are at least some legislators in every state who favor the measures and are likely to introduce new bills at some point in the future.

States that haven’t passed age verification laws but have debated them at some point include Colorado, Hawaii, Illinois, Iowa, Maryland, Michigan, Minnesota, Nevada, New Mexico, New York, Oregon and West Virginia.

For much more information on age verification laws around the country – and to keep track of new bills that would establish them in additional states – check out the Age Verification section of the Free Speech Coalition website.
