Commentary

Age Verification Push Sends Millions of Britons to Unregulated Porn Sites, Charity Says


Something strange has been happening since age verification checks quietly became part of everyday internet life in the UK. You’d think stricter rules would close doors. Instead, a lot of people seem to be wandering down darker hallways.

New research from the Lucy Faithfull Foundation suggests that nearly 45 per cent of UK porn users have visited websites without age verification checks since the rules came into force last summer under the Online Safety Act.

The poll, which surveyed more than 3,700 people across Britain, found that 39 per cent of those who visited these sites ended up watching content that made them feel uncomfortable. Even more telling, 40 per cent said what they saw was enough to put them off returning altogether.

The charity, which focuses on preventing online child sexual abuse, has warned that these unregulated spaces can quietly increase the risk of people stumbling into harmful — and illegal — material.

Vicky Young, who leads the Stop It Now UK and Ireland anonymous helpline, said these sites can become a dangerous stepping stone toward indecent images of children.

“We work with people who have looked at indecent images of children to try and address that behaviour, to help support them to change their behaviour,” she said.

“One of the things that people say to us frequently is that they started looking at legal adult pornography and that their behaviour escalated. In part they were spending maybe longer online, but also the sort of content that they were looking at became more extreme and often started getting younger, and that’s when they then crossed into illegal behaviour, so looking at indecent images of children.”

That pattern — the slow creep from curiosity to something far more serious — is what worries the charity most.

“Because of that pathway and that coming out constantly in conversations we have with people, it concerns us that if people are accessing sites where there is this riskier content, that actually they are putting themselves at a higher chance of accessing indecent images,” she said.

“Sometimes that might not be intentional in the beginning, but what people tell us is that actually, if they come across those images as part of their other pornography, that then sparks curiosity. There’s something that kind of adds to the excitement around the risk, and they don’t necessarily stop at one image. They actually then start looking for more images.”

The survey also revealed a quieter anxiety bubbling under the surface. Nearly 30 per cent of respondents said they were worried about how much pornography they consume. That concern was highest among young men aged 18 to 24, with more than half admitting it’s something that troubles them — a group the Foundation describes as particularly vulnerable.

At the same time, the rules appear to be forcing a moment of self-reflection for many. Almost 47 per cent said they’ve reduced how much pornography they watch since age checks were introduced, while 55 per cent said the changes made them stop and think about their habits.

Recent data backs that up. Enforcement of highly effective age assurance led to an immediate drop in traffic to major porn sites from late July. But as one door closed, another cracked open: VPN use surged as people looked for ways around the new barriers.

The UK’s most visited adult site recorded a drop of around 1.5 million viewers year on year, falling from 11.3 million in August 2024 to 9.8 million this August.

Meanwhile, VPN usage more than doubled after the rules came in, jumping from roughly 650,000 daily users to a peak of over 1.4 million in mid-August 2025. Although that number has since dipped, it still hovered around 900,000 by November.

In other words, fewer people are walking through the front door — but plenty are still trying the side entrance.

Dr Alexandra Bailey, head of psychology at the Foundation and an associate professor at the University of Roehampton, said the intention behind age verification is sound, but the consequences are more complicated.

“Age verification is vital to protect children, and we fully support it,” she said. “But we also need to recognise that some adults are choosing riskier sites to avoid age checks. These sites can expose people to harmful material, including illegal content depicting child sexual abuse. Even if you’re not looking for it, you could encounter it — and that can have serious life-changing consequences.”

She added that the rules have created a pause many people probably needed, but not everyone is responding in a safe way.

“Age verification is also prompting adults to reflect on their online behaviour, which can be a good thing for people worried about their porn use. But we need to address the risks for those who are turning to sites that avoid the new regulations.

“Every day, our advisors speak to people whose pornography use has spiralled into something much more harmful. We know embarrassment can stop people from reaching out, but confidential help is available. If you’re worried about your own behaviour or someone else’s, contact Stop It Now before it’s too late.”

Sometimes the most dangerous part isn’t the rule itself — it’s what people do when they decide to dodge it.


2025: The Year Tighter Regulation Came to Town for the Online Adult Industry by Morley Safeword


When I got my start in the online sector of the adult entertainment business, back in the mid-nineties, there was no video streaming. Individual photos often took web users several minutes to download. And you hardly heard a peep from anyone suggesting that the fledgling industry needed to be reined in.

To be fair, many people were only vaguely aware of what was available on the internet at the time, let alone worried about what their kids might be looking for on there – and frankly, the web was so slow, using it exceeded the patience of a lot of kids, anyway.

Oh, how things have changed.

What evolved fastest, of course, was the technology underpinning the internet. As high-speed connectivity became the norm rather than the exception and video streaming capabilities increased year over year, online porn went from something enjoyed by a small subset of early adopters to a massive, multibillion dollar industry. Along with those changes in technology came ever louder calls for the online adult industry to be more tightly regulated – or regulated at all, in the still-early-internet days of the mid-nineties.

In the United States, Congress began cooking up proposals to prevent minors from accessing online porn. While these proposals enjoyed broad bipartisan support (within the legislature, at least), what they didn’t get was much support from the courts.

Early attempts to impose things like age verification requirements were slapped down by the courts, most notably in cases like Reno v. ACLU, decided in 1997. In Reno, the Supreme Court held that certain provisions of the Communications Decency Act of 1996 (“CDA”) violated the First Amendment. Specifically, the court found that the CDA’s “indecent transmission” and “patently offensive display” provisions trod upon the freedom of speech protected by the First Amendment.

What changed in 2025, as the Supreme Court again considered an age verification proposal, this time a state law passed in Texas (“HB 1181”), was in part the continued forward march of technology. But more crucially, what changed was the court’s disposition as to which “standard of review” ought to be applied.

In previous cases involving online age verification proposals, the court has applied “strict scrutiny,” a high bar that requires the government to show its actions (and laws) are “narrowly tailored” to further a “compelling government interest” and are the “least restrictive means” to further that interest.

In the case Free Speech Coalition v. Paxton, which the Supreme Court decided in June, the district court had applied strict scrutiny and found that HB 1181 failed to satisfy the standard. When the case reached the Supreme Court, however, the majority held that strict scrutiny was the wrong standard and that the correct one was “intermediate scrutiny,” which sets the bar much lower for the government.

Writing for the majority, Justice Clarence Thomas asserted that HB 1181 has “only an incidental effect on protected speech.”

“The First Amendment leaves undisturbed States’ traditional power to prevent minors from accessing speech that is obscene from their perspective,” Thomas wrote. “That power includes the power to require proof of age before an individual can access such speech. It follows that no person – adult or child – has a First Amendment right to access such speech without first submitting proof of age.”

Since the law “simply requires adults to verify their age before they can access speech that is obscene to children,” Thomas found that HB 1181 “is therefore subject only to intermediate scrutiny, which it readily survives.”

The three justices who dissented from the majority’s position didn’t see things quite the same way, naturally. In her dissent, Justice Elena Kagan criticized the majority’s holding as “confused” and highlighted the ways in which it departed from the court’s previous rulings in similar cases.

“Cases raising that question have reached this Court on no fewer than four prior occasions – and we have given the same answer, consistent with general free speech principles, each and every time,” Kagan observed. “Under those principles, we apply strict scrutiny, a highly rigorous but not fatal form of constitutional review, to laws regulating protected speech based on its content. And laws like H. B. 1181 fit that description: They impede adults from viewing a class of speech protected for them (even though not for children) and defined by its content. So, when we have confronted those laws before, we have always asked the strict scrutiny question: Is the law the least restrictive means of achieving a compelling state interest? There is no reason to change course.”

Whether there was reason to change course or not, surely now the course has been changed. Make no mistake, laws like HB 1181 are here to stay – and they will be followed by other measures designed to restrict access to sexually explicit materials online, as well as regulation which goes much further and sweeps in an even broader range of controversial content.

The old cliché about the “canary in the coal mine” has often been applied to pornography in the context of free speech discussions. Even those who don’t like or approve of porn have often warned that crackdowns on sexually explicit expression can presage attempts at regulating other forms of speech.

If indeed those of us who work in the adult industry are part of a sentinel species, the warning to our peers in the broader world of entertainment and self-expression could not be more clear, as we look out to 2026 and beyond: Here in the world of online porn canaries, we’re choking on this new regulatory push – and most likely, some of you other birds are going to be feeling short of breath too, soon enough.


The Adult Industry Has Been Through Worse. We Will Survive by Morley Safeword


These are challenging times for the adult entertainment industry, no doubt. Around the globe, governments are passing increasingly strict regulations around age verification and other, more censorious measures putatively designed to “protect minors,” but which legislators and anti-porn crusaders also hope will reduce porn consumption among adults, as well.

If all this is enough to make some folks in the adult industry want to wave the white flag, close up shop, and find something else to do for a living, I can certainly understand why. As the name of this site reflects, people in the industry rightfully feel like they’re under siege, waging a battle against forces with a great deal more wealth and power to enlist as weapons than our side has.

As someone who has worked in the adult industry for nearly 30 years (and who has enjoyed its products even longer), take it from me when I tell you none of this is new. Some of the battlefields are new and they are constantly evolving, but the war itself goes back longer than many of us can remember.

In the United States, obscenity laws and other statutes designed to maintain public morals and prevent the corruption of society date back to colonial times. In other words, long before there was an adult entertainment industry against which to wage war, the government was taking aim at sexual expression and conduct.

Fast forward to the 19th Century and there was the passage of the Comstock Act of 1873, which—among many other things—made it a criminal offense to send obscene materials through the U.S. mail. The Act also made it illegal to use the mail to tell someone where such materials might be found or how to make them, provisions that were later struck down by the courts as overly broad, thankfully.

To give you an idea of just how much more restrictive the obscenity laws were in the early 20th Century than they are today, you need only look as far as the name of a seminal case from 1933 – United States v. One Book Called Ulysses. Frankly, the contents of James Joyce’s Ulysses wouldn’t even be enough to raise one-half of a would-be censor’s eyebrow these days, yet it was considered positively scandalous in its day.

From an American adult industry perspective, the War on Porn arguably reached its zenith in the 1980s and 1990s, under Presidents Ronald Reagan and George H.W. Bush. According to the Bureau of Justice Statistics, in 1990 alone there were 74 federal obscenity prosecutions targeting adult materials (as opposed to Child Sexual Abuse Materials, which are patently illegal and have no protection under the First Amendment). Contrast that figure with 2009, in which there were a total of six.

Despite the number of prosecutions at the start of the decade, the 1990s were a period of tremendous growth for the adult industry, driven in large part by the advent of the commercial internet and its relatively unregulated environment. What we’re seeing now is what governments might call a “correction” of that laissez faire approach – and what those of us in the industry might call an overcorrection.

Yes, age-verification laws present a challenge. Like a lot of people in the adult industry, I don’t object to the idea of making people prove they’re adults before consuming porn; what I object to is the means by which we’re required to offer such proof and the way those methods compromise not only our privacy, but potentially open us up to extortion, identity theft and other crimes. I’m also not convinced age verification, at least as currently executed, does much to prevent minors from being exposed to porn.

If you were to ask any of the people who have been prosecuted for obscenity over the movies they’ve made, books they’ve written, or magazines they’ve published, I think you’d find near unanimity: they’d much rather pay a financial penalty than serve years in prison on top of being fined, as the likes of Paul Little (AKA “Max Hardcore”) have in the past.

My point here is not that those of us currently working in the adult industry should simply thank our lucky stars we avoided the crackdowns of the past, or meekly accept the current campaign against the adult industry without putting up a fight. My point is simply this: We’ve been under the gun for decades and we’ve not only survived but expanded considerably as an industry along the way.

The bottom line, whether the anti-porn zealots like it or not, is many humans like sexual expression, whether one calls it “porn,” “erotica,” or “filth.” Neither the desire to consume the products we make nor the desire to make them is going away—and neither are we.


What Would Ethical Age Verification Online Actually Look Like?


Age-verification laws are spreading fast, and on paper they sound simple enough: if a website hosts explicit content — and sometimes even if it doesn’t — it has to check that visitors are over 18, usually by collecting personal data. Lawmakers say it’s about protecting kids. Full stop.

But scratch the surface and things get messy. Privacy experts keep waving red flags about what happens when sensitive personal data starts piling up on servers. And this year, several studies quietly dropped an uncomfortable truth: these laws don’t actually seem to stop minors from accessing porn at all.

So the uncomfortable question hangs in the air — is age verification, the way it’s currently done, ethical? And if not, what would ethical age-verification even look like? When experts were asked, their answers kept circling back to the same idea: device-level filters.

Current age-verification systems

Right now, most laws — from state-by-state mandates in the U.S. to the UK’s Online Safety Act — put the burden on platforms themselves. Websites are expected to install age checks and sort it out. And, honestly, it hasn’t gone well.

“Age gating, especially the current technology that is available, is ineffective at achieving the goals it seeks to achieve, and minors can circumvent it,” said Cody Venzke, senior policy counsel for the ACLU.

A study published in November showed what happens next. Once these laws go live, searches for VPNs shoot up. That’s usually a sign people are sidestepping location-based restrictions — and succeeding. Searches for porn sites also rise, suggesting people are hunting for platforms that simply don’t comply.

The ethics get even murkier. Mike Stabile, director of public policy at the Free Speech Coalition, didn’t mince words. “In practice, they’ve so far functioned as a form of censorship.”

Fear plays a huge role here. When people worry their IDs might be stored, processed, or leaked — and we’ve already seen IDs exposed, like during October’s Discord hack — they hesitate. Adults back away from legal content. That same November study argued that the cost to adults’ First Amendment rights doesn’t outweigh the limited benefits for minors.

“Unfortunately, we’ve heard many of the advocates behind these laws say that this chilling effect is, in fact, good. They don’t want adults accessing porn,” Stabile said.

And for some lawmakers, that’s not a bug — it’s the feature. Project 2025, the blueprint tied to President Trump’s second term, openly calls for banning porn altogether and imprisoning creators. One of its co-writers, Russell Vought, was reportedly caught on a secret recording in 2024 calling age-verification laws a porn ban through the “back door.”

But there is another path. And it doesn’t start with websites at all.

An ethical age assurance method?

“Storing people’s actual birth dates on company servers is probably not a good way to approach this, especially for minors… you can’t change your birth date if it gets leaked,” said Robbie Torney, senior director of AI programs at Common Sense Media.

“But there are approaches that are privacy-preserving and are already established in the industry that could go a long way towards making it safer for kids to interact across a wide range of digital services.”

It also helps to separate two terms that often get lumped together. Age verification usually means confirming an exact age — showing ID, scanning documents, that sort of thing. Age assurance, Torney explained, is broader. It’s about determining whether someone falls into an age range without demanding precise details.

One real-world example is California’s AB 1043, set to take effect in 2027.

Under that law, operating systems — the software running phones, tablets, and computers — will ask for an age or birthday during setup. The device then creates an age-bracket signal, not an exact age, and sends that signal to apps. If someone’s underage, access is blocked. Simple. And notably, it all happens at the device level.
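AB 1043 doesn’t define a public API, but the mechanism the article describes, collect a birthday once during device setup and expose only a coarse bracket to apps, is simple enough to sketch. Everything in the snippet below (the function names, the enum, the exact bracket boundaries) is a hypothetical illustration of the idea, not the statute’s actual interface:

```python
# Hypothetical sketch only: AB 1043 specifies no API, and these names,
# brackets, and boundaries are illustrative assumptions.
from datetime import date
from enum import Enum

class AgeBracket(Enum):
    UNDER_13 = "under_13"
    AGE_13_TO_15 = "13_15"
    AGE_16_TO_17 = "16_17"
    ADULT = "18_plus"

def bracket_for(birth_date: date, today: date) -> AgeBracket:
    """Reduce an exact birth date to the coarse bracket the device would
    share with apps; the birth date itself never leaves the OS."""
    years = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    if years < 13:
        return AgeBracket.UNDER_13
    if years < 16:
        return AgeBracket.AGE_13_TO_15
    if years < 18:
        return AgeBracket.AGE_16_TO_17
    return AgeBracket.ADULT

# An app receives only the bracket signal, never the birthday:
signal = bracket_for(date(2010, 6, 1), date(2026, 1, 15))
print(signal.value)  # "13_15" -- a 15-year-old
```

The privacy win is in what the signal omits: an app can block or tailor access based on the bracket, but a breach of the app exposes no birth date, because the app never held one.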

That approach has been recommended for years by free-speech advocates and adult platforms alike.

“Any solution should be easy to use, privacy-preserving, and consumer-friendly. In most cases, that means the verification is going to happen once, on the device,” Stabile said.

Sarah Gardner, founder and CEO of the child-safety nonprofit Heat Initiative, agreed. “Device-level verification is the best way to do age verification because you’re limiting the amount of data that you give to the apps. And many of the devices already know the age of the users,” she said.

Apple already does some of this. Its Communication Safety feature warns children when they send or receive images containing nudity through iMessage and gives them ways to get help. The company recently expanded protections for teens aged 13–17, including broader web content filters.

So yes, the technology exists. And in 2027, at least in California, device makers will have to use it.

But there’s a catch. AB 1043 doesn’t apply to websites — including adult sites. It only covers devices and app stores.

“Frankly, we want AB 1043 to apply to adult sites,” Stabile said. “We want a signal that tells us when someone is a minor. It’s the easiest, most effective way to block minors and doesn’t force adults to submit to biometrics every time they visit a website.”

Last month, Pornhub sent letters to Apple, Google, and Microsoft urging them to enable device-level age assurance for web platforms. Those letters referenced AB 1043 directly.

Venzke said the ACLU is watching these discussions closely, especially when it comes to privacy implications.

Will device-level age assurance catch on?

Whether tech giants will embrace the idea is still an open question. Microsoft declined to comment. Apple pointed to recent updates around under-18 accounts and a child-safety white paper stating, “The right place to address the dangers of age-restricted content online is the limited set of websites and apps that host that kind of content.”

Google struck a similar tone, saying it’s “committed to protecting kids online,” and highlighted new age-assurance tools like its Credential Manager API. At the same time, it made clear that certain high-risk services will always need to invest in their own compliance tools.

Torney thinks the future probably isn’t either-or. A layered system, where both platforms and operating systems share responsibility, may be unavoidable. “This has been a little bit like hot potato,” he said.

No system will ever be perfect. That part’s worth admitting out loud. “But if you’re operating from a vantage point of wanting to reduce harm, to increase appropriateness, and to increase youth wellbeing,” Torney said, “a more robust age assurance system is going to go much farther to keep the majority of teens safe.”

And maybe that’s the real shift here — moving away from blunt tools that scare adults and don’t stop kids, toward something quieter, smarter, and a little more honest about how people actually use the internet.


Oh Good, Warning Labels Are Back Again by Stan Q. Brick


Good news, everyone: The Nanny State is back and coming to a computer screen near you!

In fact, if you live in Washington state or Missouri, the Nanny State is coming to a computer screen very near you indeed, because it will be your own computer’s screen. Or smartphone screen, or smart watch screen, or pretty much any other screen you can connect to the internet.

As you may have read here on The War on Porn or elsewhere, both states currently are considering bills which would not only impose age verification requirements on adult websites but would require such sites to publish warning notices about their content, as well.

The Washington bill is the murkier of the two, stipulating that the warning labels to come are “to be developed by the department of health.” The Missouri bill, on the other hand, is quite specific indeed.

The legislation being pondered in Missouri would require sites to publish warnings stating that “Pornography is potentially biologically addictive, is proven to harm human brain development, desensitizes brain reward circuits, increases conditioned responses, and weakens brain function;” that “exposure to this content is associated with low self-esteem and body image, eating disorders, impaired brain development, and other emotional and mental illnesses;” and finally that “pornography increases the demand for prostitution, child exploitation, and child pornography.”

To say that these claims are disputed would be to put it mildly. Most of the evidence for these assertions is anecdotal in nature, in part because it’s very difficult to evaluate them without intentionally exposing a group of minors to pornography (which is illegal to do) in the context of clinical study.

Regardless of their basis in fact (or lack thereof), these labels are what attorneys and constitutional scholars call “compelled speech,” something which is a bit of a no-no under First Amendment jurisprudence and the appropriately named “compelled speech doctrine.”

As explained by David L. Hudson Jr., writing for the Free Speech Center at Middle Tennessee State University, the compelled speech doctrine “sets out the principle that the government cannot force an individual or group to support certain expression.”

“Thus, the First Amendment not only limits the government from punishing a person for his speech, but it also prevents the government from punishing a person for refusing to articulate, advocate, or adhere to the government’s approved messages,” Hudson adds.

The compelled speech doctrine has been invoked by the Chief Justice John G. Roberts-era Supreme Court as recently as the case Rumsfeld v. Forum for Academic and Institutional Rights.

“Some of this Court’s leading First Amendment precedents have established the principle that freedom of speech prohibits the government from telling people what they must say,” Roberts wrote for the Court in 2006.

When some folks hear about these labels, doubtlessly they say to themselves something like “How is this any different from requiring cigarette packages to carry warning labels?” And that would be a good question, if cigarettes were a form of speech that presumptively enjoys protection under the First Amendment.

Beyond that distinction, there’s another obvious difference here. Cigarettes, unlike pornography, have been subjected to extensive clinical study, research which has confirmed that nicotine is addictive, and that tobacco (along with the myriad other substances found in cigarettes) is strongly associated with the development of lung cancer and various cardiopulmonary disorders and diseases.

In short, the analogy between pornography and cigarettes is a terrible one, scientifically and legally.

There was a time when I would very confidently assert that the Supreme Court will eventually reject these warning labels as textbook compelled speech and shoot down at least the labeling requirements in the bills pending in Washington and Missouri. But after their decision in Free Speech Coalition v. Paxton, I’m not so sure.

For those who like the contours of our First Amendment just the way they are, this uncertainty should be even more alarming than the warning labels the Nanny State wants us to start seeing on porn sites.


Missouri Becomes the Latest State to Treat Online Adults Like Children by Stan Q. Brick


Citizens of Missouri who frequent adult websites will find the internet has changed for them when they wake up this Sunday morning, towards the end of the long Thanksgiving weekend.

Why will the internet be different for citizens of Missouri as of that morning? Because Sunday is November 30, the day the state’s new age-verification mandate begins for websites covered by the “Missouri Merchandising Practices Act.”

Under the law, websites on which a “substantial portion” of the content (33% or more) is deemed “pornographic for minors” must employ a “reasonable age verification method” to assure anyone accessing such content is an adult.

On its face, requiring adult sites to verify the age of their visitors may not seem like such an unreasonable proposition. But, as the saying goes, “the devil is in the details.”

For starters, making adults jump through hoops to enter a brick-and-mortar adult video store, or requiring people to show ID when purchasing a porn mag at a convenience store, is one thing; storing and cross-referencing their personally identifying information is quite another.

When a clerk at an adult shop or any store that sells age-restricted materials checks your ID, they look at it, they look at you, they check the date of birth listed on the ID document and then you both get on with your lives. Minutes later, that same clerk probably couldn’t tell you much about the customer they’d just served, other than “I checked his ID, it looked legit and he’s 55 freaking years old, dude.”

When I scan my ID at the behest of an age-verification provider…who the fuck knows what happens to that data? Sure, some of these state laws prohibit vendors from storing and sharing that data, but do you trust them to follow the law? How many times do we need to haul tech companies before Congress (or watch them get fined by the FTC) and hear them admit they interpreted the law in some “nuanced” way that permits them to hold on to and use our personal data before we get wise to their sneaky ways?

The data collected by age-verification services is valuable to them. They aren’t going to abstain from using it in every profitable way possible, regardless of what the law says. They will find ways to interpret the law such that they can sell, rent out or permit third-party cross-referencing of the data, mark my words. Some of these companies won’t be domiciled in the United States – and they will give just about as big a shit about U.S. law as any other business located outside the jurisdiction of the U.S. does, accordingly.

Of course, none of this will bother the politicians who pass these laws, because this isn’t about protecting kids – and it sure as hell isn’t about protecting the privacy of adults who like to watch porn. This is about a larger antipathy towards adult entertainment and a desire to discourage anyone and everyone from looking at porn, not just minors.

Consider what Missouri Attorney General Catherine Hanaway had to say in September about the new law in her state: “We are holding powerful corporations accountable, respecting women and victims of human trafficking, and helping ensure that minors are shielded from dangerous, sexually explicit material.”

Notice that the bit about “helping ensure that minors are shielded” comes last on the list? That’s not a coincidence.

Someone also needs to explain to me how making people show ID at the door when they watch porn is in any way helping “women and victims of human trafficking.” Let’s assume a person has been trafficked for the purpose of performing in porn (something that truly doesn’t happen often at all, despite a constant stream of political rhetoric to the contrary); how does making viewers confirm they’re old enough to watch legal porn help anyone who has been forced into making illegal porn?

The word “trafficking” doesn’t appear in the text of Missouri’s new law. What does appear there is the claim that “nothing in this proposed rule limits the ability of adults to view sexually explicit material online,” which is technically true, so long as one doesn’t consider an age-verification requirement a “limit” to adults who would prefer not to hand over their personally identifying information to God-knows-who.

When the Supreme Court ruled in favor of Texas in the challenge to that state’s age-verification mandate, Cecillia Wang, the national legal director of the American Civil Liberties Union, said something that strikes me as being just as true with respect to the Missouri law:

“The legislature claims to be protecting children from sexually explicit materials, but the law will do little to block their access and instead deters adults from viewing vast amounts of First Amendment-protected content.”

She’s right – and the list of adults deterred by such laws is only going to get longer as these laws proliferate.

Welcome to the dumbed-down internet. Please be mindful of the language you use herein; some of your readers might be children!


Aylo Pushes Tech Giants to Adopt API-Driven Device Age Verification


Something interesting happens when big tech companies get a polite nudge from a company they usually keep at arm’s length. That’s exactly what Aylo — the parent company of Pornhub — just did. The company asked Google, Apple, and Microsoft to open the door to API signals that would let platforms verify a user’s age at the device or operating-system level. The goal? Keeping minors off porn. It’s a request that feels both obvious and strangely overdue, considering how much of the internet already runs through those devices.

Wired revealed last week that Anthony Penhale, Aylo’s chief legal officer, sent separate letters on Nov. 14 to the relevant executives at each company. Those letters were later confirmed by Aylo, whose spokesperson provided them for review.

Aylo has been steadily pushing the idea that age verification should happen at the device level — not slapped awkwardly onto individual sites through clunky pop-ups and ID uploads. It’s a stance that puts the company at odds with most state and regional age-gating laws in the U.S. and E.U., which still rely on site-level verification. Meanwhile, Google, Apple, and Microsoft have been sending mixed signals about how far they’re willing to go with device-based checks.

Most recently, California’s governor, Gavin Newsom, signed a bill requiring age verification in app stores. Google, Meta, and OpenAI endorsed the measure, while major film studios and streaming platforms pushed back, calling the law a step too far.

“We strongly advocate for device-based age assurance, where users’ age is determined once on the device, and the age range can be used to create an age signal sent over an API to websites,” Penhale wrote in his letter to Apple. “Understanding that your Declared Age Range API is designed to ‘help developers obtain users’ age categories’ for apps, we respectfully request that Apple extend this device-based approach to web platforms.”

“We believe this extension is critical to achieving effective age assurance across the entire digital ecosystem and would enable responsible platforms like ours to provide better protection for minors while preserving user privacy,” he added.

Penhale’s letters to Alphabet and Microsoft echoed the same ask: allow website operators — not just app developers — access to the age-related API tools each company already uses within its own ecosystem.
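Neither Aylo nor the platform companies have published a web-facing specification for this, but the idea Penhale describes — age determined once on the device, with only a coarse age range forwarded to websites over an API — could look something like the sketch below. Everything in it (the range labels, the function names, the fail-closed fallback) is a hypothetical illustration of the concept, not any vendor’s actual API.

```python
# Hypothetical sketch of a site-side check for a device-provided age signal.
# The range labels and function names are invented for illustration; no
# platform currently exposes such an API to websites.

from typing import Optional

# Coarse buckets only: the device attests a range, never a birthdate,
# which is what makes this more privacy-preserving than an ID upload.
KNOWN_RANGES = {"under13", "13-15", "16-17", "18+"}

def parse_age_signal(header_value: Optional[str]) -> Optional[str]:
    """Return the declared age range, or None if absent or unrecognized."""
    if header_value is None:
        return None
    value = header_value.strip().lower()
    return value if value in KNOWN_RANGES else None

def may_serve_adult_content(header_value: Optional[str]) -> bool:
    """Gate adult content on the device's declared range.

    Absent or malformed signals fail closed: a real site would fall back
    to its own site-level verification rather than assume the visitor
    is an adult.
    """
    return parse_age_signal(header_value) == "18+"

if __name__ == "__main__":
    print(may_serve_adult_content("18+"))    # True
    print(may_serve_adult_content("16-17"))  # False
    print(may_serve_adult_content(None))     # False
```

The privacy argument for this design is that the website learns one bit of information — adult or not — while the identity document, if one was ever checked, stays on the device.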

“As a platform operator committed to user safety and regulatory compliance, Aylo would welcome the opportunity to participate in any technical working groups or discussions regarding extending the current age signal functionality to websites,” Penhale wrote in the letter sent to Microsoft.

A Google spokesperson told Wired that Google Play doesn’t “allow adult entertainment apps” and that “certain high-risk services like Aylo will always need to invest in specific tools to meet their own legal and responsibility obligations.” In other words, Google’s not eager to widen the gates.

Developer documentation shows that Apple now turns on content controls by default for new devices registered to under-18 users. Microsoft, for its part, has leaned heavily toward service-level verification — meaning platforms should handle their own age checks rather than relying on the device.

All of this is unfolding while Aylo continues to argue that site-level age verification doesn’t work. The company has pointed to real-world examples of how these systems push users off regulated sites and into murkier, unmonitored corners of the web.

Internal data shows that traffic from the U.K. to Aylo’s platforms dropped more than 77 percent after Ofcom began enforcing new rules under the Online Safety Act. Related documents reviewed privately indicate that users didn’t disappear — they simply migrated to non-compliant, unregulated sites.

At the same time, a court in Germany just offered Aylo a temporary lifeline. On Nov. 19, the Administrative Court of Düsseldorf put a hold on new regulations requiring ISPs to block Pornhub and YouPorn entirely.

The orders would have forced ISPs like Deutsche Telekom, Vodafone, and O2 to bar access to the sites over Germany’s age verification laws. For now, enforcement is on pause while the High Administrative Court of North Rhine-Westphalia works through appeals on the original network-ban orders.

Interestingly, the Düsseldorf court pointed out that Germany’s enforcement approach under the Youth Media Protection Interstate Treaty contradicts the European Union’s Digital Services Act, which outlines a different vision for age verification.

Aylo is still fighting its designation as a “very large online platform” under the DSA — a label that brings intense regulatory scrutiny and a long list of compliance demands. The company’s push for device-based age checks is part of that bigger battle, and it’s hard not to notice the irony: the company everyone expects to resist regulation is the one asking for the kind that might actually work.


Missouri Age-Verification Regulation Takes Effect November 30th


Missouri’s age-verification regulation, 15 CSR 60-18, kicks in on Sunday, November 30. It arrives quietly, almost like a new rule taped to the front door of the internet—one most people won’t notice until they run into it.

Under Missouri’s rule, any site where at least 33⅓% of content is considered harmful to minors must verify a visitor’s age before letting them in. The state signs off on methods like digital IDs, government-issued identification, or other systems that confirm age through transactional data. If a platform thinks it has a better solution, it can pitch its own—so long as it proves it works just as well.

Violating the rule isn’t just a slap on the wrist. The state treats it as “an unfair, deceptive, fraudulent, or otherwise unlawful practice” under the Missouri Merchandising Practices Act. If regulators decide a violation was done “with the intent to defraud,” it escalates into a Class E felony. Each visit to a non-compliant site counts as a separate offense, with penalties capped at $10,000 per day. There’s no option for private lawsuits; this is the state’s show.

For businesses, the message is simple but unsettling: if you might fall under the rule, read the fine print, understand the liability, and protect yourself. The consequences aren’t theoretical—they’re baked in. And as laws like this multiply, compliance is becoming less about checking a box and more about navigating a moving target with stakes that touch real people and their privacy.

Because once the government decides how adults must prove their age online, the question stops being, Can you follow the rules?

It becomes, What do those rules change about the way we experience the internet at all?


FSC Unveils Updated Toolkit to Help Sites Navigate Age-Verification Laws


Earlier this year, a toolkit dropped from the Free Speech Coalition that was supposed to help adult websites navigate the chaos of U.S. age verification laws. On paper, it was about compliance. In reality, it spoke to something bigger—how to follow the law without sacrificing privacy, free expression, or basic human dignity in the process. The updated version arrives after months of legal whiplash and real-world testing, refined by feedback from the people actually living with these requirements. It’s not just a rulebook; it’s a survival guide for an industry being legislated into a corner.

And honestly, it couldn’t have come at a better time.

Laws regulating sexual content online aren’t slowing down. They’re spreading. States are experimenting with different enforcement mechanisms like they’re swapping cocktail recipes—ID uploads here, age-estimation scans there, endless demands for personal data everywhere. What counts as compliance in one state can trigger fines in another. Platforms are stuck either bending to every new rule or blocking entire populations just to avoid liability.

Some people call that safety. Others see it as the invention of a digital checkpoint system where adulthood must be proven over and over again.

The updated toolkit tries to offer a middle path: protect minors without building a surveillance state. That means emphasizing privacy-preserving verification methods, data minimization, and safeguards against turning porn sites into honeypots for identity theft. When your sexual curiosity can be cross-referenced with a government database, it’s not hard to imagine how badly that could go.

But this isn’t just about porn. It’s about how much of yourself you should have to reveal simply to access legal content. If a state can require ID to watch an adult video, why couldn’t it do the same for BDSM forums, queer education sites, or reproductive health information? The slope may not be slippery—it might already be greased.

There’s also the uncomfortable truth that “protecting kids” has become a political Swiss Army knife. Behind the moral language are groups who openly want to make adult content inaccessible altogether, not just to minors. Age verification becomes the first domino rather than the final safeguard. When lawmakers start treating porn the way others treat fentanyl, it’s worth asking who gets to define harm — and who gets punished in the process.

Meanwhile, the people enforcing these laws rarely understand how the internet works. The burden falls on smaller platforms, independent creators, and marginalized workers who already operate under scrutiny. Sex workers were dealing with censorship long before age-verification laws existed. Now, they’re being folded into legislation written by people who’ve never considered how someone pays rent by selling a video clip.

The irony? The more governments tighten restrictions, the faster users migrate to unregulated foreign sites where consent and safety checks don’t exist at all. The “protection” ends up exposing people to worse content, not shielding them from it.

If lawmakers truly cared about reducing harm, they would fund education, promote ethical production standards, and support platforms that actually moderate content responsibly. Instead, the system encourages the exact opposite: drive traffic to the shadows, then blame the shadows for being dark.

The toolkit is trying to hold the line—compliance without capitulation. It’s a reminder that safety and privacy don’t have to be adversaries. They can coexist, but only if laws are written by people who understand what’s at stake for users and creators.

Because asking adults to prove who they are before they can access legal sexual content isn’t just a technical requirement. It’s a worldview. One where the state sits in the bedroom doorway holding a clipboard, deciding who gets to come inside.

And once that door closes, it rarely opens back up.


Yoti Cashes In as New Online Safety Rules Kick In


Yoti’s revenue didn’t just rise this year—it exploded, conveniently right as age verification rules tightened under the Online Safety Act. The company reported a 55 percent jump in turnover, hitting £20.3 million for the year ending in March. Funny how the moment the government demands IDs to access half the internet, the firms selling ID checks start printing money.

“Regulatory issues are central to the business and Yoti is expecting to benefit from significant regulatory changes, in both identity and age, both in domestic and overseas markets,” the company said.

Translation: more laws that treat adults like children mean more profit.

“Anticipated regulatory changes in the United Kingdom, France and Australia, in particular, are expected to support the company’s growth.”

And that growth isn’t just about protecting anyone—it’s about building infrastructure where logging into a website slowly starts to resemble crossing a border. The more regulation expands, the more companies like Yoti become gatekeepers to everyday life online.

“Rapid development in the sophistication of both online fraud, deep fakes and the technology to prevent this, means that the market is constantly developing and growing.”

Sure, deepfakes and fraud are real concerns. But when the solution is “show your papers” to browse adult content—or eventually anything deemed “sensitive”—it’s worth asking who really benefits. Right now, it looks less like safety and more like a booming business model built on surveillance.
