Age-verification laws are spreading fast, and on paper they sound simple enough: if a website hosts explicit content — and sometimes even if it doesn’t — it has to check that visitors are over 18, usually by collecting personal data. Lawmakers say it’s about protecting kids. Full stop.
But scratch the surface and things get messy. Privacy experts keep waving red flags about what happens when sensitive personal data starts piling up on servers. And this year, several studies quietly dropped an uncomfortable truth: these laws don’t actually seem to stop minors from accessing porn at all.
So the uncomfortable question hangs in the air — is age verification, the way it’s currently done, ethical? And if not, what would ethical age verification even look like? When experts were asked, their answers kept circling back to the same idea: device-level filters.
Current age-verification systems
Right now, most laws — from state-by-state mandates in the U.S. to the UK’s Online Safety Act — put the burden on platforms themselves. Websites are expected to install age checks and sort it out. And, honestly, it hasn’t gone well.
“Age gating, especially the current technology that is available, is ineffective at achieving the goals it seeks to achieve, and minors can circumvent it,” said Cody Venzke, senior policy counsel for the ACLU.
A study published in November showed what happens next. Once these laws go live, searches for VPNs shoot up. That’s usually a sign people are sidestepping location-based restrictions — and succeeding. Searches for porn sites also rise, suggesting people are hunting for platforms that simply don’t comply.
The ethics get even murkier. Mike Stabile, director of public policy at the Free Speech Coalition, didn’t mince words. “In practice, they’ve so far functioned as a form of censorship.”
Fear plays a huge role here. When people worry their IDs might be stored, processed, or leaked — and we’ve already seen IDs exposed, like during October’s Discord hack — they hesitate. Adults back away from legal content. That same November study argued that the limited benefits for minors don’t justify the cost to adults’ First Amendment rights.
“Unfortunately, we’ve heard many of the advocates behind these laws say that this chilling effect is, in fact, good. They don’t want adults accessing porn,” Stabile said.
And for some lawmakers, that’s not a bug — it’s the feature. Project 2025, the blueprint tied to President Trump’s second term, openly calls for banning porn altogether and imprisoning its creators. One of its co-authors, Russell Vought, was reportedly caught on a secret recording in 2024 calling age-verification laws a porn ban through the “back door.”
But there is another path. And it doesn’t start with websites at all.
An ethical age assurance method?
“Storing people’s actual birth dates on company servers is probably not a good way to approach this, especially for minors… you can’t change your birth date if it gets leaked,” said Robbie Torney, senior director of AI programs at Common Sense Media.
“But there are approaches that are privacy-preserving and are already established in the industry that could go a long way towards making it safer for kids to interact across a wide range of digital services.”
It also helps to separate two terms that often get lumped together. Age verification usually means confirming an exact age — showing ID, scanning documents, that sort of thing. Age assurance, Torney explained, is broader. It’s about determining whether someone falls into an age range without demanding precise details.
One real-world example is California’s AB 1043, set to take effect in 2027.
Under that law, operating systems — the software running phones, tablets, and computers — will ask for an age or birthday during setup. The device then creates an age-bracket signal, not an exact age, and sends that signal to apps. If someone’s underage, access is blocked. Simple. And notably, it all happens at the device level.
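To make the mechanism concrete, here is a minimal sketch in Kotlin of what a device-level age-bracket signal could look like. Everything in it is illustrative: the AgeBracket names, the deriveAgeBracket function, and the exact thresholds are assumptions for this sketch, not AB 1043’s statutory text or any vendor’s actual API. The point is the data flow: the exact birth date stays on the device, and only a coarse bracket ever reaches an app.

```kotlin
import java.time.LocalDate
import java.time.Period

// Illustrative age brackets in the spirit of AB 1043's under-13 /
// 13-15 / 16-17 / adult split. Treat the names and thresholds as
// assumptions for this sketch, not the statute's definitions.
enum class AgeBracket { UNDER_13, AGE_13_TO_15, AGE_16_TO_17, ADULT }

// Derive a coarse bracket from the birth date entered at device setup.
// Only the bracket leaves this function; the birth date itself is
// never stored on a server or shared with any app.
fun deriveAgeBracket(
    birthDate: LocalDate,
    today: LocalDate = LocalDate.now()
): AgeBracket {
    val age = Period.between(birthDate, today).years
    return when {
        age < 13 -> AgeBracket.UNDER_13
        age < 16 -> AgeBracket.AGE_13_TO_15
        age < 18 -> AgeBracket.AGE_16_TO_17
        else -> AgeBracket.ADULT
    }
}

fun main() {
    // What an app or website would actually receive: the signal only.
    val bracket = deriveAgeBracket(LocalDate.of(2010, 6, 1))
    println("Signal sent to apps: $bracket") // e.g. AGE_13_TO_15
}
```

A site receiving such a signal learns only “adult” or “minor,” which is the property Stabile describes below: no ID upload, no biometrics, no birth date sitting on a server waiting to be leaked.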
That approach has been recommended for years by free-speech advocates and adult platforms alike.
“Any solution should be easy to use, privacy-preserving, and consumer-friendly. In most cases, that means the verification is going to happen once, on the device,” Stabile said.
Sarah Gardner, founder and CEO of the child-safety nonprofit Heat Initiative, agreed. “Device-level verification is the best way to do age verification because you’re limiting the amount of data that you give to the apps. And many of the devices already know the age of the users,” she said.
Apple already does some of this. Its Communication Safety feature warns children when they send or receive images containing nudity through iMessage and gives them ways to get help. The company recently expanded protections for teens aged 13–17, including broader web content filters.
So yes, the technology exists. And in 2027, at least in California, device makers will have to use it.
But there’s a catch. AB 1043 doesn’t apply to websites — including adult sites. It only covers devices and app stores.
“Frankly, we want AB 1043 to apply to adult sites,” Stabile said. “We want a signal that tells us when someone is a minor. It’s the easiest, most effective way to block minors and doesn’t force adults to submit to biometrics every time they visit a website.”
Last month, Pornhub sent letters to Apple, Google, and Microsoft urging them to enable device-level age assurance for web platforms. Those letters referenced AB 1043 directly.
Venzke said the ACLU is watching these discussions closely, especially when it comes to privacy implications.
Will device-level age assurance catch on?
Whether tech giants will embrace the idea is still an open question. Microsoft declined to comment. Apple pointed to recent updates around under-18 accounts and a child-safety white paper stating, “The right place to address the dangers of age-restricted content online is the limited set of websites and apps that host that kind of content.”
Google struck a similar tone, saying it’s “committed to protecting kids online,” and highlighted new age-assurance tools like its Credential Manager API. At the same time, it made clear that certain high-risk services will always need to invest in their own compliance tools.
Torney thinks the future probably isn’t either-or. A layered system, where both platforms and operating systems share responsibility, may be unavoidable. “This has been a little bit like hot potato,” he said.
No system will ever be perfect. That part’s worth admitting out loud. “But if you’re operating from a vantage point of wanting to reduce harm, to increase appropriateness, and to increase youth wellbeing,” Torney said, “a more robust age assurance system is going to go much farther to keep the majority of teens safe.”
And maybe that’s the real shift here — moving away from blunt tools that scare adults and don’t stop kids, toward something quieter, smarter, and a little more honest about how people actually use the internet.