Commentary

The Adult Industry Has Been Through Worse. We Will Survive by Morley Safeword

Anthony Comstock

These are challenging times for the adult entertainment industry, no doubt. Around the globe, governments are passing increasingly strict regulations around age verification and other, more censorious measures putatively designed to “protect minors,” but which legislators and anti-porn crusaders also hope will reduce porn consumption among adults, as well.

If all this is enough to make some folks in the adult industry want to wave the white flag, close up shop, and find something else to do for a living, I can certainly understand why. As the name of this site reflects, people in the industry rightfully feel like they’re under siege, waging a battle against forces with a great deal more wealth and power to enlist as weapons than our side can muster.

As someone who has worked in the adult industry for nearly 30 years (and who has enjoyed its products even longer), take it from me when I tell you none of this is new. Some of the battlefields are new and they are constantly evolving, but the war itself goes back longer than many of us can remember.

In the United States, obscenity laws and other statutes designed to maintain public morals and prevent the corruption of society date back to colonial times. In other words, long before there was an adult entertainment industry against which to wage war, the government was taking aim at sexual expression and conduct.

Fast forward to the 19th century and you arrive at the Comstock Act of 1873, which—among many other things—made it a criminal offense to send obscene materials through the U.S. mail. The Act also made it illegal to use the mail to tell someone where such materials might be found or how to make them, provisions that were thankfully struck down by the courts as overly broad.

To give you an idea of just how much more restrictive the obscenity laws were in the early 20th Century than they are today, you need only look as far as the name of a seminal case from 1933 – United States v. One Book Called Ulysses. Frankly, the contents of James Joyce’s Ulysses wouldn’t even be enough to raise one-half of a would-be censor’s eyebrow these days, yet it was considered positively scandalous in its day.

From an American adult industry perspective, the War on Porn arguably reached its zenith in the 1980s and 1990s, under Presidents Ronald Reagan and George H.W. Bush. According to the Bureau of Justice Statistics, in 1990 alone there were 74 federal obscenity prosecutions targeting adult materials (as opposed to Child Sexual Abuse Materials, which are patently illegal and have no protection under the First Amendment). Contrast that figure with 2009, in which there were a total of six.

Despite the number of prosecutions at the start of the decade, the 1990s were a period of tremendous growth for the adult industry, driven in large part by the advent of the commercial internet and its relatively unregulated environment. What we’re seeing now is what governments might call a “correction” of that laissez-faire approach – and what those of us in the industry might call an overcorrection.

Yes, age-verification laws present a challenge. Like a lot of people in the adult industry, I don’t object to the idea of making people prove they’re adults before consuming porn; what I object to is the means by which we’re required to offer such proof and the way those methods not only compromise our privacy but potentially open us up to extortion, identity theft, and other crimes. I’m also not convinced age verification, at least as currently executed, does much to prevent minors from being exposed to porn.

If you were to ask the people who have been prosecuted for obscenity over the movies they’ve made, books they’ve written, or magazines they’ve published, I think you’d find near unanimity: all would rather pay a financial penalty than serve years in prison on top of a fine, as the likes of Paul Little (AKA “Max Hardcore”) have in the past.

My point here is not that those of us currently working in the adult industry should simply thank our lucky stars we avoided the crackdowns of the past, or that we should accept the current campaign against the adult industry without putting up a fight. My point is simply this: We’ve been under the gun for decades, and we’ve not only survived but expanded considerably as an industry along the way.

The bottom line, whether the anti-porn zealots like it or not, is many humans like sexual expression, whether one calls it “porn,” “erotica,” or “filth.” Neither the desire to consume the products we make nor the desire to make them is going away—and neither are we.


What Would Ethical Age Verification Online Actually Look Like?

age verification

Age-verification laws are spreading fast, and on paper they sound simple enough: if a website hosts explicit content — and sometimes even if it doesn’t — it has to check that visitors are over 18, usually by collecting personal data. Lawmakers say it’s about protecting kids. Full stop.

But scratch the surface and things get messy. Privacy experts keep waving red flags about what happens when sensitive personal data starts piling up on servers. And this year, several studies quietly dropped an uncomfortable truth: these laws don’t actually seem to stop minors from accessing porn at all.

So the uncomfortable question hangs in the air — is age verification, the way it’s currently done, ethical? And if not, what would ethical age-verification even look like? When experts were asked, their answers kept circling back to the same idea: device-level filters.

Current age-verification systems

Right now, most laws — from state-by-state mandates in the U.S. to the UK’s Online Safety Act — put the burden on platforms themselves. Websites are expected to install age checks and sort it out. And, honestly, it hasn’t gone well.

“Age gating, especially the current technology that is available, is ineffective at achieving the goals it seeks to achieve, and minors can circumvent it,” said Cody Venzke, senior policy counsel for the ACLU.

A study published in November showed what happens next. Once these laws go live, searches for VPNs shoot up. That’s usually a sign people are sidestepping location-based restrictions — and succeeding. Searches for porn sites also rise, suggesting people are hunting for platforms that simply don’t comply.

The ethics get even murkier. Mike Stabile, director of public policy at the Free Speech Coalition, didn’t mince words. “In practice, they’ve so far functioned as a form of censorship.”

Fear plays a huge role here. When people worry their IDs might be stored, processed, or leaked — and we’ve already seen IDs exposed, like during October’s Discord hack — they hesitate. Adults back away from legal content. That same November study argued that the cost to adults’ First Amendment rights doesn’t outweigh the limited benefits for minors.

“Unfortunately, we’ve heard many of the advocates behind these laws say that this chilling effect is, in fact, good. They don’t want adults accessing porn,” Stabile said.

And for some lawmakers, that’s not a bug — it’s the feature. Project 2025, the blueprint tied to President Trump’s second term, openly calls for banning porn altogether and imprisoning creators. One of its co-writers, Russell Vought, was reportedly caught on a secret recording in 2024 calling age-verification laws a porn ban through the “back door.”

But there is another path. And it doesn’t start with websites at all.

An ethical age assurance method?

“Storing people’s actual birth dates on company servers is probably not a good way to approach this, especially for minors… you can’t change your birth date if it gets leaked,” said Robbie Torney, senior director of AI programs at Common Sense Media.

“But there are approaches that are privacy-preserving and are already established in the industry that could go a long way towards making it safer for kids to interact across a wide range of digital services.”

It also helps to separate two terms that often get lumped together. Age verification usually means confirming an exact age — showing ID, scanning documents, that sort of thing. Age assurance, Torney explained, is broader. It’s about determining whether someone falls into an age range without demanding precise details.

One real-world example is California’s AB 1043, set to take effect in 2027.

Under that law, operating systems — the software running phones, tablets, and computers — will ask for an age or birthday during setup. The device then creates an age-bracket signal, not an exact age, and sends that signal to apps. If someone’s underage, access is blocked. Simple. And notably, it all happens at the device level.
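To make the mechanism concrete, here is a minimal sketch, in Python, of the kind of device-side logic AB 1043 describes: the operating system keeps the exact birth date, derives a coarse age bracket, and only that bracket is exposed as a signal. The bracket names and boundaries below are illustrative assumptions, not the statute’s text.

```python
from datetime import date

# Assumed brackets for illustration only; AB 1043 does not prescribe
# these exact names or boundaries.
BRACKETS = [
    (0, 12, "under_13"),
    (13, 15, "13_15"),
    (16, 17, "16_17"),
    (18, 200, "18_plus"),
]

def age_bracket(birthdate: date, today: date) -> str:
    """Return a coarse age bracket rather than an exact age."""
    # Subtract one if this year's birthday hasn't happened yet.
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    for low, high, name in BRACKETS:
        if low <= age <= high:
            return name
    raise ValueError("age out of supported range")

def age_signal(birthdate: date, today: date) -> dict:
    # The device retains the birth date; only the bracket leaves it.
    return {"ageBracket": age_bracket(birthdate, today)}
```

The privacy property the article describes falls out of the design: a leaked signal reveals only that a user is, say, an adult, while the unchangeable birth date never reaches the app or website.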

That approach has been recommended for years by free-speech advocates and adult platforms alike.

“Any solution should be easy to use, privacy-preserving, and consumer-friendly. In most cases, that means the verification is going to happen once, on the device,” Stabile said.

Sarah Gardner, founder and CEO of the child-safety nonprofit Heat Initiative, agreed. “Device-level verification is the best way to do age verification because you’re limiting the amount of data that you give to the apps. And many of the devices already know the age of the users,” she said.

Apple already does some of this. Its Communication Safety feature warns children when they send or receive images containing nudity through iMessage and gives them ways to get help. The company recently expanded protections for teens aged 13–17, including broader web content filters.

So yes, the technology exists. And in 2027, at least in California, device makers will have to use it.

But there’s a catch. AB 1043 doesn’t apply to websites — including adult sites. It only covers devices and app stores.

“Frankly, we want AB 1043 to apply to adult sites,” Stabile said. “We want a signal that tells us when someone is a minor. It’s the easiest, most effective way to block minors and doesn’t force adults to submit to biometrics every time they visit a website.”

Last month, Pornhub sent letters to Apple, Google, and Microsoft urging them to enable device-level age assurance for web platforms. Those letters referenced AB 1043 directly.

Venzke said the ACLU is watching these discussions closely, especially when it comes to privacy implications.

Will device-level age assurance catch on?

Whether tech giants will embrace the idea is still an open question. Microsoft declined to comment. Apple pointed to recent updates around under-18 accounts and a child-safety white paper stating, “The right place to address the dangers of age-restricted content online is the limited set of websites and apps that host that kind of content.”

Google struck a similar tone, saying it’s “committed to protecting kids online,” and highlighted new age-assurance tools like its Credential Manager API. At the same time, it made clear that certain high-risk services will always need to invest in their own compliance tools.

Torney thinks the future probably isn’t either-or. A layered system, where both platforms and operating systems share responsibility, may be unavoidable. “This has been a little bit like hot potato,” he said.

No system will ever be perfect. That part’s worth admitting out loud. “But if you’re operating from a vantage point of wanting to reduce harm, to increase appropriateness, and to increase youth wellbeing,” Torney said, “a more robust age assurance system is going to go much farther to keep the majority of teens safe.”

And maybe that’s the real shift here — moving away from blunt tools that scare adults and don’t stop kids, toward something quieter, smarter, and a little more honest about how people actually use the internet.


Oh Good, Warning Labels Are Back Again by Stan Q. Brick

Cigarette warning label

Good news, everyone: The Nanny State is back and coming to a computer screen near you!

In fact, if you live in Washington state or Missouri, the Nanny State is coming to a computer screen very near you indeed, because it will be your own computer’s screen. Or smartphone screen, or smart watch screen, or pretty much any other screen you can connect to the internet.

As you may have read here on The War on Porn or elsewhere, both states are currently considering bills that would not only impose age-verification requirements on adult websites but also require such sites to publish warning notices about their content.

The Washington bill is the murkier of the two, stipulating that the warning labels to come are “to be developed by the department of health.” The Missouri bill, on the other hand, is quite specific indeed.

The legislation being pondered in Missouri would require sites to publish warnings stating that “Pornography is potentially biologically addictive, is proven to harm human brain development, desensitizes brain reward circuits, increases conditioned responses, and weakens brain function;” that “exposure to this content is associated with low self-esteem and body image, eating disorders, impaired brain development, and other emotional and mental illnesses;” and finally that “pornography increases the demand for prostitution, child exploitation, and child pornography.”

To say that these claims are disputed would be to put it mildly. Most of the evidence for these assertions is anecdotal in nature, in part because it’s very difficult to evaluate them without intentionally exposing a group of minors to pornography (which is illegal to do) in the context of a clinical study.

Regardless of their basis in fact (or lack thereof), these labels are what attorneys and Constitutional scholars call “compelled speech,” which is a bit of a no-no under First Amendment jurisprudence and the appropriately named “compelled speech doctrine.”

As explained by David L. Hudson Jr., writing for the Free Speech Center at Middle Tennessee State University, the compelled speech doctrine “sets out the principle that the government cannot force an individual or group to support certain expression.”

“Thus, the First Amendment not only limits the government from punishing a person for his speech, but it also prevents the government from punishing a person for refusing to articulate, advocate, or adhere to the government’s approved messages,” Hudson adds.

The compelled speech doctrine has been invoked by the Chief Justice John G. Roberts-era Supreme Court as recently as Rumsfeld v. Forum for Academic and Institutional Rights.

“Some of this Court’s leading First Amendment precedents have established the principle that freedom of speech prohibits the government from telling people what they must say,” Roberts wrote for the Court in 2006.

When some folks hear about these labels, they doubtless ask themselves something like, “How is this any different from requiring cigarette packages to carry warning labels?” And that would be a good question, if cigarettes were a form of speech that presumptively enjoys protection under the First Amendment.

Beyond that distinction, there’s another obvious difference here. Cigarettes, unlike pornography, have been subjected to extensive clinical study, research which has confirmed that nicotine is addictive, and that tobacco (along with the myriad other substances found in cigarettes) is strongly associated with the development of lung cancer and various cardiopulmonary disorders and diseases.

In short, the analogy between pornography and cigarettes is a terrible one, scientifically and legally.

There was a time when I would very confidently assert that the Supreme Court will eventually reject these warning labels as textbook compelled speech and shoot down at least the labeling requirements in the bills pending in Washington and Missouri. But after their decision in Free Speech Coalition v. Paxton, I’m not so sure.

For those who like the contours of our First Amendment just the way they are, this uncertainty should be even more alarming than the warning labels the Nanny State wants us to start seeing on porn sites.


Missouri Becomes the Latest State to Treat Online Adults Like Children by Stan Q. Brick

Missouri flag

Citizens of Missouri who frequent adult websites will find the internet has changed for them when they wake up this Sunday morning, towards the end of the long Thanksgiving weekend.

Why will the internet be different for citizens of Missouri as of that morning? Because Sunday is November 30, the day the state’s new age-verification mandate begins for websites covered by the “Missouri Merchandising Practices Act.”

Under the law, websites on which a “substantial portion” of the content (33% or more) is deemed “pornographic for minors” must employ a “reasonable age verification method” to assure anyone accessing such content is an adult.

On its face, requiring adult sites to verify the age of their visitors may not seem like such an unreasonable proposition. But, as the saying goes, “the devil is in the details.”

For starters, making adults jump through hoops to enter a brick-and-mortar adult video store, or requiring people to show ID when purchasing a porn mag at a convenience store, is one thing; storing and cross-referencing their personally identifying information is quite another.

When a clerk at an adult shop or any store that sells age-restricted materials checks your ID, they look at it, they look at you, they check the date of birth listed on the ID document and then you both get on with your lives. Minutes later, that same clerk probably couldn’t tell you much about the customer they’d just served, other than “I checked his ID, it looked legit and he’s 55 freaking years old, dude.”

When I scan my ID at the behest of an age-verification provider…who the fuck knows what happens to that data? Sure, some of these state laws prohibit vendors from storing and sharing that data, but do you trust them to follow the law? How many times do we need to haul tech companies before Congress (or watch them get fined by the FTC) for interpreting the law in some “nuanced” way that permits them to hold on to and use our personal data before we get wise to their sneaky ways?

The data collected by age-verification services is valuable to them. They aren’t going to abstain from using it in every profitable way possible, regardless of what the law says. They will find ways to interpret the law such that they can sell, rent out, or permit third-party cross-referencing of the data, mark my words. And some of these companies won’t even be domiciled in the United States – businesses located outside U.S. jurisdiction will give about as big a shit about U.S. law as they ever do.

Of course, none of this will bother the politicians who pass these laws, because this isn’t about protecting kids – and it sure as hell isn’t about protecting the privacy of adults who like to watch porn. This is about a larger antipathy towards adult entertainment and a desire to discourage anyone and everyone from looking at porn, not just minors.

Consider what Missouri Attorney General Catherine Hanaway had to say in September about the new law in her state: “We are holding powerful corporations accountable, respecting women and victims of human trafficking, and helping ensure that minors are shielded from dangerous, sexually explicit material.”

Notice that the bit about “helping ensure that minors are shielded” comes last on the list? That’s not a coincidence.

Someone also needs to explain to me how making people show ID at the door when they watch porn is in any way helping “women and victims of human trafficking.” Let’s assume a person has been trafficked for the purpose of performing in porn (something that truly doesn’t happen often at all, despite a constant stream of political rhetoric to the contrary); how does making viewers confirm they’re old enough to watch legal porn help anyone who has been forced into making illegal porn?

The word “trafficking” doesn’t appear in the text of Missouri’s new law. What does appear there is the claim “nothing in this proposed rule limits the ability of adults to view sexually explicit material online,” which is technically true, so long as one doesn’t consider an age-verification requirement a “limit” to the adults who would prefer not to hand over their personally identifying information to God-knows-who.

When the Supreme Court ruled in favor of Texas in the challenge to that state’s age-verification mandate, Cecillia Wang, the national legal director of the American Civil Liberties Union, said something that strikes me as being just as true with respect to the Missouri law:

“The legislature claims to be protecting children from sexually explicit materials, but the law will do little to block their access and instead deters adults from viewing vast amounts of First Amendment-protected content.”

She’s right – and the list of adults deterred by such laws is only going to get longer as these laws proliferate.

Welcome to the dumbed-down internet. Please be mindful of the language you use herein; some of your readers might be children!


Aylo Pushes Tech Giants to Adopt API-Driven Device Age Verification

Aylo-logo

Something interesting happens when big tech companies get a polite nudge from a company they usually keep at arm’s length. That’s exactly what Aylo — the parent company of Pornhub — just did. The company asked Google, Apple, and Microsoft to open the door to API signals that would let platforms verify a user’s age at the device or operating-system level. The goal? Keeping minors off porn. It’s a request that feels both obvious and strangely overdue, considering how much of the internet already runs through those devices.

Wired revealed last week that Anthony Penhale, Aylo’s chief legal officer, sent separate letters on Nov. 14 to the relevant executives at each company. Those letters were later confirmed by Aylo, whose spokesperson provided them for review.

Aylo has been steadily pushing the idea that age verification should happen at the device level — not slapped awkwardly onto individual sites through clunky pop-ups and ID uploads. It’s a stance that puts the company at odds with most state and regional age-gating laws in the U.S. and E.U., which still rely on site-level verification. Meanwhile, Google, Apple, and Microsoft have been sending mixed signals about how far they’re willing to go with device-based checks.

Most recently, California’s governor, Gavin Newsom, signed a bill requiring age verification in app stores. Google, Meta, and OpenAI endorsed the measure, while major film studios and streaming platforms pushed back, calling the law a step too far.

“We strongly advocate for device-based age assurance, where users’ age is determined once on the device, and the age range can be used to create an age signal sent over an API to websites,” Penhale wrote in his letter to Apple. “Understanding that your Declared Age Range API is designed to ‘help developers obtain users’ age categories’ for apps, we respectfully request that Apple extend this device-based approach to web platforms.”

“We believe this extension is critical to achieving effective age assurance across the entire digital ecosystem and would enable responsible platforms like ours to provide better protection for minors while preserving user privacy,” he added.

Penhale’s letters to Alphabet and Microsoft echoed the same ask: allow website operators — not just app developers — access to the age-related API tools each company already uses within its own ecosystem.

“As a platform operator committed to user safety and regulatory compliance, Aylo would welcome the opportunity to participate in any technical working groups or discussions regarding extending the current age signal functionality to websites,” Penhale wrote in the letter sent to Microsoft.

A Google spokesperson told Wired that Google Play doesn’t “allow adult entertainment apps” and that “certain high-risk services like Aylo will always need to invest in specific tools to meet their own legal and responsibility obligations.” In other words, Google’s not eager to widen the gates.

Developer documentation shows that Apple now turns on content controls by default for new devices registered to under-18 users. Microsoft, for its part, has leaned heavily toward service-level verification — meaning platforms should handle their own age checks rather than relying on the device.

All of this is unfolding while Aylo continues to argue that site-level age verification doesn’t work. The company has pointed to real-world examples of how these systems push users off regulated sites and into murkier, unmonitored corners of the web.

Internal data shows that traffic from the U.K. to Aylo’s platforms dropped more than 77 percent after Ofcom began enforcing new rules under the Online Safety Act. Related documents reviewed privately indicate that users didn’t disappear — they simply migrated to non-compliant, unregulated sites.

At the same time, a court in Germany just offered Aylo a temporary lifeline. On Nov. 19, the Administrative Court of Düsseldorf put a hold on new regulations requiring ISPs to block Pornhub and YouPorn entirely.

The court’s order would have forced ISPs like Deutsche Telekom, Vodafone, and O2 to bar access to the sites over Germany’s age verification laws. For now, those rules are on pause while the High Administrative Court of North Rhine-Westphalia works through appeals on the original network-ban orders.

Interestingly, the Düsseldorf court pointed out that Germany’s enforcement approach under the Youth Media Protection Interstate Treaty contradicts the European Union’s Digital Services Act, which outlines a different vision for age verification.

Aylo is still fighting over its designation as a “very-large online platform” under the DSA — a label that brings intense regulatory scrutiny and a long list of compliance demands. The company’s push for device-based age checks is part of that bigger battle, and it’s hard not to notice the irony: the company everyone expects to resist regulation is the one asking for the kind that might actually work.


Missouri Age-Verification Regulation Takes Effect November 30th

Missouri flag

Missouri’s age-verification regulation, 15 CSR 60-18, kicks in on Sunday, November 30. It arrives quietly, almost like a new rule taped to the front door of the internet—one most people won’t notice until they run into it.

Under Missouri’s rule, any site where at least 33⅓% of content is considered harmful to minors must verify a visitor’s age before letting them in. The state signs off on methods like digital IDs, government-issued identification, or other systems that confirm age through transactional data. If a platform thinks it has a better solution, it can pitch its own—so long as it proves it works just as well.

Violating the rule isn’t just a slap on the wrist. The state treats it as “an unfair, deceptive, fraudulent, or otherwise unlawful practice” under the Missouri Merchandising Practices Act. If regulators decide a violation was done “with the intent to defraud,” it escalates into a Class E felony. Each visit to a non-compliant site counts as a separate offense, with penalties capped at $10,000 per day. There’s no option for private lawsuits; this is the state’s show.

For businesses, the message is simple but unsettling: if you might fall under the rule, read the fine print, understand the liability, and protect yourself. The consequences aren’t theoretical—they’re baked in. And as laws like this multiply, compliance is becoming less about checking a box and more about navigating a moving target with stakes that touch real people and their privacy.

Because once the government decides how adults must prove their age online, the question stops being, Can you follow the rules?

It becomes, What do those rules change about the way we experience the internet at all?


FSC Unveils Updated Toolkit to Help Sites Navigate Age-Verification Laws

Free Speech Coalition logo

Earlier this year, a toolkit dropped from the Free Speech Coalition that was supposed to help adult websites navigate the chaos of U.S. age verification laws. On paper, it was about compliance. In reality, it spoke to something bigger—how to follow the law without sacrificing privacy, free expression, or basic human dignity in the process. The updated version arrives after months of legal whiplash and real-world testing, refined by feedback from the people actually living with these requirements. It’s not just a rulebook; it’s a survival guide for an industry being legislated into a corner.

And honestly, it couldn’t have come at a better time.

Laws regulating sexual content online aren’t slowing down. They’re spreading. States are experimenting with different enforcement mechanisms like they’re swapping cocktail recipes—ID uploads here, age-estimation scans there, endless demands for personal data everywhere. What counts as compliance in one state can trigger fines in another. Platforms are stuck either bending to every new rule or blocking entire populations just to avoid liability.

Some people call that safety. Others see it as the invention of a digital checkpoint system where adulthood must be proven over and over again.

The updated toolkit tries to offer a middle path: protect minors without building a surveillance state. That means emphasizing privacy-preserving verification methods, data minimization, and safeguards against turning porn sites into honeypots for identity theft. When your sexual curiosity can be cross-referenced with a government database, it’s not hard to imagine how badly that could go.

But this isn’t just about porn. It’s about how much of yourself you should have to reveal simply to access legal content. If a state can require ID to watch an adult video, why couldn’t it do the same for BDSM forums, queer education sites, or reproductive health information? The slope may not be slippery—it might already be greased.

There’s also the uncomfortable truth that “protecting kids” has become a political Swiss Army knife. Behind the moral language are groups who openly want to make adult content inaccessible altogether, not just to minors. Age verification becomes the first domino rather than the final safeguard. When lawmakers start treating porn the way others treat fentanyl, it’s worth asking who gets to define harm — and who gets punished in the process.

Meanwhile, the people enforcing these laws rarely understand how the internet works. The burden falls on smaller platforms, independent creators, and marginalized workers who already operate under scrutiny. Sex workers were dealing with censorship long before age-verification laws existed. Now, they’re being folded into legislation written by people who’ve never considered how someone pays rent by selling a video clip.

The irony? The more governments tighten restrictions, the faster users migrate to unregulated foreign sites where consent and safety checks don’t exist at all. The “protection” ends up exposing people to worse content, not preventing it.

If lawmakers truly cared about reducing harm, they would fund education, promote ethical production standards, and support platforms that actually moderate content responsibly. Instead, the system encourages the exact opposite: drive traffic to the shadows, then blame the shadows for being dark.

The toolkit is trying to hold the line—compliance without capitulation. It’s a reminder that safety and privacy don’t have to be adversaries. They can coexist, but only if laws are written by people who understand what’s at stake for users and creators.

Because asking adults to prove who they are before they can access legal sexual content isn’t just a technical requirement. It’s a worldview. One where the state sits in the bedroom doorway holding a clipboard, deciding who gets to come inside.

And once that door closes, it rarely opens back up.

Yoti Cashes In as New Online Safety Rules Kick In

Yoti’s revenue didn’t just rise this year—it exploded, conveniently right as age verification rules tightened under the Online Safety Act. The company reported a 55 percent jump in turnover, hitting £20.3 million for the year ending in March. Funny how the moment the government demands IDs to access half the internet, the firms selling ID checks start printing money.

“Regulatory issues are central to the business and Yoti is expecting to benefit from significant regulatory changes, in both identity and age, both in domestic and overseas markets,” the company said.

Translation: more laws that treat adults like children mean more profit.

“Anticipated regulatory changes in the United Kingdom, France and Australia, in particular, are expected to support the company’s growth.”

And that growth isn’t just about protecting anyone—it’s about building infrastructure where logging into a website slowly starts to resemble crossing a border. The more regulation expands, the more companies like Yoti become gatekeepers to everyday life online.

“Rapid development in the sophistication of both online fraud, deep fakes and the technology to prevent this, means that the market is constantly developing and growing.”

Sure, deepfakes and fraud are real concerns. But when the solution is “show your papers” to browse adult content—or eventually anything deemed “sensitive”—it’s worth asking who really benefits. Right now, it looks less like safety and more like a booming business model built on surveillance.

Age Checks Didn’t Stop Porn Use—They Just Pushed Men Toward Harder, Unregulated Content

There’s a strange moment that happens when you quit porn: suddenly sex feels… quieter. That’s what happened to Ray*, who stopped watching in July after age verification rules kicked in. At first, sex felt more vanilla. “I have been a little less creative in bed as I’m not trying anything I’ve recently seen [online] with my girlfriend,” he admitted. But it also felt more grounded. “It does feel healthier […] and it’s made ‘normal’ sex a little more exciting.”

If you somehow missed the memo, the UK’s Online Safety Act (OSA) began enforcing age checks on 18+ sites starting July 25. Imagine a bouncer suddenly stationed at the entrance of every porn site in the country—minus the velvet rope and questionable cologne. The idea was to keep minors from stumbling into explicit content. But for adults like Ray, who relied on free tube sites, the price of entry became handing over personal identification. “It’s reduced my porn usage by 99%,” said the 36-year-old workplace trainer. Others haven’t quit—they’ve just changed how they watch… and in some cases, doubled down.

This shift among adults isn’t the stated goal. The law was designed to protect children, especially given that 79% of young people in the UK say they’ve encountered violent porn before turning 18. With 80% public support, legislators believed ID checks were the best way to stop minors from accessing explicit material.

But critics worry the law is a gateway to broader online censorship. Sex workers are already feeling the consequences—forced to sanitize profiles and content across platforms, making it harder to market themselves safely or honestly. And while everyone keeps shouting about protecting kids, almost no one is talking about actual sex education or porn literacy, both of which are crucial for teens learning how to navigate desire and boundaries, and for adults struggling with compulsive habits.

Three months in, the OSA isn’t delivering on its most basic promise. “It will almost certainly reduce the most casual, accidental access to porn for under-18s, but if the question is whether it will stop young people altogether, the honest answer is no,” said Professor Clarissa Smith, co-editor of Porn Studies. “Teenagers are extraordinarily adept at routing around whatever adults prohibit, and they often end up in the less safe corners of the internet to do it.”

Over on Discord, teens have already discovered a workaround using screenshots of video game characters to bypass age checks. “It’s essential that we recognize that many young people will find ways to get around age-gating,” said Paula Hall, founder of The Laurel Centre, which focuses on sex and porn addiction. Government officials haven’t exactly been eager to discuss the results.

But like any prohibition, the law is changing behavior—especially among adults who, legally speaking, should be able to watch whatever consenting adults create. The age checks feel invasive to many. Who wants their kinks attached to a verified identity in a database that may or may not leak?

“Verification is wrong to me,” Ray said. “It’s like telling the fire brigade you’re about to burn your house down. It’s an inconvenience to everyone.” He doesn’t trust verification vendors. “I don’t believe the data is deleted, I don’t believe they are secure, and I don’t trust the location of these services either.”

David*, a film-industry craftsman, felt the same way. “There was no chance of me entering my personal details onto a porn site — and equally, getting a VPN for the purposes of watching porn seemed a bit desperate, so it was quite easy for me to disengage entirely.”

For others, it’s less about paranoia and more about effort. “It’s instant gratification, so I wouldn’t put that kind of effort in,” said Luke*, who works in patient enrollment. “I would never scan my face or hand over my ID. I’m not that concerned about personal security, it’s just [that age verification] is an effort and not worth it for the end result.”

In that sense, the OSA is functioning like a convenience tax on horny impulses. Think of it like the plain-packaging laws for cigarettes—just instead of dull tobacco branding, it’s dulling access to bukkake. “For many, the additional time and effort it takes helps to reduce impulsive viewing and may encourage developing wider interests,” Hall noted.

“For some men, having to verify their age adds just enough friction [making an undesirable action more difficult to perform] that they delay or skip a viewing session,” added Smith. “It doesn’t mean they had a compulsive relationship with porn, just that spontaneity is sensitive to obstacles. Friction always changes behavior at the margins.”

Of course, whether porn counts as an addiction is another battle entirely. Hall believes “the language of addiction fits the lived experience.” Others argue it’s closer to a compulsive habit fueled by shame, not dopamine. But either way, a lot of people want to cut back: 80% of regular porn-watching men aged 18–29 in the UK say they’re concerned about their consumption.

For some, the OSA became the unexpected catalyst they needed. “I’ve found a renewed focus on having real fun (not just in the bedroom) and my relationships more generally seem to have blossomed,” said David. Friends have noticed a shift in his mood—“calm positivity,” he called it. “It’s pretty much ended my casual consumption; it was too easy to fall down that rabbit hole during moments of boredom or stress.”

But most men aren’t quitting—they’re adapting. “The larger pattern is displacement […] the desire doesn’t disappear; it detours,” said Smith. “The OSA increases the distance between a moment of wanting and a moment of accessing, but people bridge that distance in creative ways. Men are simply reorganizing their porn habits around the new architectures.”

And in true internet fashion, Reddit’s response was to make jokes about “moving to Norway.” Translation: VPNs are doing numbers. Proton VPN saw an 1,800% spike in sign-ups right after verification launched.

This also explains why headlines celebrating a supposed 77% drop in UK Pornhub traffic miss the point—traffic is just being rerouted through “other countries.” Coincidentally, 77% of Gen Z say they watch porn regularly. The math speaks for itself.

Some men are using the moment to pay creators directly. “I do have a kink, so I found reconnecting to adult performers who lean into that world much more rewarding,” Ray said. “I buy a clip a month now, and I like the fact that I’m now paying the entertainers for their work.”

Most, however, are not pulling out their credit cards. Free sites still dwarf paid platforms by a massive margin—hundreds of millions of UK visits vs. a fraction of that on subscription-based services.

Edward* chose a different path: erotic literature. “I’d say the largest behavioral shift would be that I am now more open to the concept of readable erotica,” he said. He’s using fantasy instead of autoplay algorithms. “I definitely have been trying to curate my inner sex life using my own fantasies rather than taking the easy option of porn.”

Maybe retro porn will return—DVDs tucked on dusty shelves like vinyl revival for genitals. That’s what one shop owner predicted. But nostalgia rarely wins against pixels and convenience.

More realistically, people are heading to sketchier sites with no age checks and much worse moderation. Ofcom helpfully publishes a running list of these platforms, effectively handing adults a menu of unfiltered content. “The OSA is likely driving adults into a more fragmented, less regulated ecosystem, largely because people are uneasy about handing over ID for sexual content,” Smith said. “We’re probably not going to see less engagement with porn, we’ll see different routes to it — many of them far outside the spaces the law was written for.”

Ray tried it. “I ended up on some unfiltered Eastern European sites, but finding what I wanted was difficult and unfulfilling.”

For others, the shift is darker. “I watch more hardcore stuff on unregulated sites now,” said one anonymous user.

The ripple effects aren’t subtle. Young men already drifting toward misogynistic corners of the internet now have more reason to end up there. “I grew up in an era of porn playing cards being traded in school,” Edward said. “It was pretty tame stuff compared to the likes of choking, degradation and the weird step-sibling shite that makes up large portions of porn sites. This trend and attitudes towards women in general I do find seriously concerning.”

That doesn’t mean porn itself is inherently harmful. Like anything pleasurable, it’s about intention, context, and consent. Performed ethically, and consumed by choice—not compulsion—it can enhance sex lives rather than replace them.

For some men, the OSA nudged them toward healthier patterns. A smaller group discovered new, ethical models of consumption. But for the vast majority, those digital bouncers are just a minor obstacle. They’ll keep flashing IDs—real or borrowed—and spending another night on the tubes.

Congress vs. VPNs: Bold Moves From People Who Don’t Know What a Server Is

There was a time when the worst thing a lawmaker could do was force you to flash an ID just to look at something mildly sexual online. Turns out that was just the warm-up act. Now, politicians in Wisconsin, Michigan, and a few very enthusiastic copycats have decided the real enemy isn’t porn—it’s privacy itself.

And the new target? VPNs.

Yes, seriously.

Wisconsin’s A.B. 105/S.B. 130 demands that any website hosting content that could possibly be considered “sexual” must implement age verification and block access for anyone using a VPN. The bill also inflates the definition of what counts as material “harmful to minors,” sweeping in everything from discussions of anatomy to basic information about sexuality and reproduction.

It’s part of a trend: conservative lawmakers expanding “harmful to minors” far beyond what courts have historically allowed, pulling in sex education, LGBTQ+ health resources, art, memoirs, medical info—basically anything that makes them clutch their pearls.

Wisconsin’s bill has already passed the State Assembly and is crawling its way through the Senate. If it passes, it could become the first law in the country to effectively criminalize accessing certain content while using a VPN. Michigan tried something similar—requiring ISPs to identify and block VPN connections—but it stalled. Meanwhile, officials in the U.K. are calling VPNs “a loophole that needs closing.”

This isn’t abstract. It’s happening.

And if legislators get their way, it’s going to wreck far more than porn access.

Here’s Why This Is a Terrible Idea

VPNs hide your real location by routing traffic through another server. The site you visit sees the VPN’s IP, not yours. Think of it like using a P.O. box so someone doesn’t know your home address.

So when Wisconsin demands that websites “block VPN users from Wisconsin,” it’s essentially asking them to perform sorcery. A site can geolocate the VPN server it sees, but it has no way to tell whether the person behind that server is in Milwaukee or Mumbai. The tech doesn’t work that way.
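The mechanics above reduce to a toy model. This is an illustrative sketch only, not a real network stack, and the addresses are reserved documentation IPs rather than real hosts: each hop in a connection sees only the address of the hop that contacted it, so the destination logs the exit node, never the origin.

```python
# Toy model of connection routing: the destination sees only the
# address of the last hop that connected to it, not the originator.
def visible_address(path):
    """Return the address the destination observes for a connection
    routed through `path` (client first, destination last)."""
    *hops, destination = path
    return hops[-1]  # only the immediately preceding hop is visible

# RFC 5737 documentation addresses, used purely for illustration.
direct = ["203.0.113.7", "example-site"]
via_vpn = ["203.0.113.7", "198.51.100.42", "example-site"]

print(visible_address(direct))   # the client's own IP
print(visible_address(via_vpn))  # the VPN exit node's IP
```

The site's logs look identical whether the exit node sits next door or on another continent, which is exactly why "block VPN users from Wisconsin" is not an implementable instruction.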

Faced with legal risk, websites will either pull out of Wisconsin entirely or block all VPN users everywhere. One poorly drafted state law could break private browsing for the entire internet.

The collateral damage outweighs any hypothetical benefit.

Almost Everyone Uses VPNs

And it’s not just people trying to avoid showing their driver’s license before watching porn.

Businesses rely on VPNs. Remote workers need them. People checking email in a hotel lobby need them. Companies use VPNs to protect employee data, internal communications, and client files.

Students rely on VPNs because universities require them to access academic databases, class materials, and research tools. At the University of Wisconsin-Madison, WiscVPN “allows UW–Madison faculty, staff and students to access University resources even when they are using a commercial Internet Service Provider (ISP).”

Vulnerable people rely on VPNs, too. Domestic abuse survivors use them to hide their location. Journalists use them to protect sources. Activists use them to organize without surveillance. LGBTQ+ people in hostile regions rely on VPNs to access medical guidance and community support. In censorship-heavy countries, VPNs aren’t optional—they’re lifelines.

And then there are regular people who just don’t want to be tracked, profiled, and monitored by corporations and ISPs. That shouldn’t require a moral justification.

It’s a Privacy Nightmare

Block VPNs and suddenly everyone needs to verify their identity with government IDs, credit cards, or biometric data just to access perfectly legal content.

And we all know how that ends: corporate databases leaking browsing histories tied to real names, real IDs, and real consequences.

This has already happened. It will happen again. It’s not a question of if—just when.

Turning mandatory surveillance into law isn’t moral; it’s just invasive.

“Harmful to Minors” Is Not a Blank Check

Under longstanding legal standards, governments can restrict minors’ access to sexual content only when it “appeals to prurient interests” and lacks serious value for minors.

Wisconsin’s bill bulldozes that definition. It applies to material that simply describes sex or depicts anatomy. That could encompass literature, film, music, medical content, sex-ed resources, LGBTQ+ health information—basically anything human bodies do.

It gets worse. The bill applies to any website where more than one-third of the “material” meets that definition. Suddenly, most social platforms could be considered age-restricted simply for hosting sexuality-related conversations.

And when governments get to decide what topics are “harmful,” the first groups punished are always marginalized ones.

It Won’t Even Work

Let’s imagine this law passes. Here’s what happens:

People bypass it. Instantly.

They’ll switch to homemade VPNs, private proxies, Cloudflare tunnels, virtual machines, or rented servers for a few dollars. The internet routes around censorship like water finding cracks in concrete.

Even if commercial VPNs disappeared overnight, people could just create their own encrypted tunnels. It takes five minutes and a $5 cloud server.
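For the curious, here is roughly what that five-minute tunnel looks like. This sketch assumes only SSH access to some server you control (the hostname and username below are placeholders), and uses OpenSSH's built-in SOCKS proxy rather than any VPN product:

```shell
# A DIY encrypted tunnel with plain OpenSSH — no commercial VPN involved.
# -N: don't run a remote command; -D 1080: open a local SOCKS5 proxy.
ssh -N -D 1080 user@your-cloud-server.example

# Then point the browser's SOCKS5 proxy at localhost:1080.
# All traffic is encrypted to the server and exits from its IP, not yours.
```

Ban every commercial VPN tomorrow and this still works, because it is indistinguishable from ordinary SSH administration traffic.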

Meanwhile, students, workers, journalists, abuse survivors, and everyone else gets stuck without privacy or access.

The law solves nothing and breaks everything.

VPNs shouldn’t be required to access legal speech—but they also shouldn’t be criminalized. The real issue is the age verification regime itself: it’s invasive, ineffective, and trivial to circumvent. It harms far more than it protects.

A False Dilemma

People didn’t flock to VPNs because they were trying to commit crimes. They did it because governments tried to force identity verification onto everyday browsing. Instead of asking why millions of people don’t want to hand over their IDs to random websites, lawmakers decided the problem is the tools protecting them.

The whole premise is backwards.

The question isn’t “How do we keep kids safe online by destroying privacy for adults?”

It’s “Why is surveillance the only solution anyone in power can imagine?”

If the real goal is protecting young people, lawmakers could strengthen digital literacy, offer better parental tools, support education, or address genuine online harms.

Instead, they’re trying to criminalize privacy itself.

VPN bans aren’t about safety. They’re about control. And the irony is almost poetic: the people writing these laws don’t even understand the technology they’re trying to outlaw.
