Legal Attacks

Utah Senator Floats Porn Tax to Pay for Age-Verification Enforcement


SALT LAKE CITY—There are some ideas that arrive quietly and others that walk in like they own the place. This one does the latter. At the opening of Utah’s new legislative session, a Republican lawmaker dropped a bill that would tax online adult content, funneling the money toward age-verification enforcement and teen mental health programs.

Sen. Calvin R. Musselman, who represents the small town of West Haven, is the driving force behind Senate Bill (SB) 73. The proposal introduces what it calls a “material harmful to minors tax,” set at seven percent of the “gross receipts” from sales of content classified under that label.

SB 73 has been formally introduced but hasn’t yet landed in a committee. Even so, the odds of it clearing the legislature are widely considered high.

The bill defines “gross receipts” as “the total amount of consideration received for a transaction […] without deduction for the cost of materials, labor, service, or other expenses.” In other words, it’s the top line, not the leftovers.
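By way of a purely hypothetical illustration: a producer with $1 million in qualifying sales would owe $70,000 under the seven percent levy, even if $900,000 of that revenue went straight back out the door in production costs.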

And the reach is… expansive. The tax would apply to “the gross receipts of all sales, distributions, memberships, subscriptions, performances, and content, amounting to material harmful to minors that is: (a) produced in this state; (b) sold in this state; (c) filmed in this state; (d) generated in this state; or (e) otherwise based in this state.” That’s a wide net, and it’s not subtle about it.

Because of that scope, the tax wouldn’t just hit one corner of the industry. Producers, creators, platforms—anyone touching qualifying content—would likely feel it. And it wouldn’t exist in a vacuum. The levy would stack on top of existing obligations, including Utah’s digital sales tax and other state fees.

Revenue from the tax would flow into a newly created government account, earmarked for teen mental health treatment through the state Department of Health and Human Services. It’s worth noting that Utah is among the states that formally frame pornography consumption as a public health crisis, a position tied to the still-contested concept of “pornography addiction.”

The bill doesn’t stop at taxation. It also introduces a $500 annual fee, paid into accounts overseen by the Division of Consumer Protection. This so-called “notification fee” would apply to companies producing content deemed “harmful to minors” and is tied directly to age-verification compliance.

Those funds would be used by the Division to monitor compliance in a system modeled after the United Kingdom’s Ofcom framework. Companies would need to file the notification annually. Miss that step, and penalties accrue at $1,000 per day until the paperwork—and compliance—are in order.

Utah, of course, has already been down this road. It was one of the first states to pass a statewide age-verification law structured as a “bounty law,” allowing private individuals to sue on the state’s behalf over noncompliance. That approach famously led Aylo, the owner of Pornhub, to block Utah IP addresses, just as it has done in other states with similar laws.

Utah wouldn’t be alone in adding a porn-specific tax to the mix. Alabama already has one on the books, imposing a ten percent levy on top of existing digital goods and sales taxes.

And the idea is still spreading. In Pennsylvania, a bipartisan pair of state senators recently announced plans to propose a measure that would tax online pornography subscriptions in that state’s digital marketplace.


2025: The Year Tighter Regulation Came to Town for the Online Adult Industry by Morley Safeword


When I got my start in the online sector of the adult entertainment business, back in the mid-nineties, there was no video streaming. Individual photos often took web users several minutes to download. And you hardly heard a peep from anyone suggesting that the fledgling industry needed to be reined in.

To be fair, many people were only vaguely aware of what was available on the internet at the time, let alone worried about what their kids might be looking at on there – and frankly, the web was so slow that using it exceeded the patience of a lot of kids, anyway.

Oh, how things have changed.

What evolved fastest, of course, was the technology underpinning the internet. As high-speed connectivity became the norm rather than the exception and video streaming capabilities increased year over year, online porn went from something enjoyed by a small subset of early adopters to a massive, multibillion-dollar industry. Along with those changes in technology came ever louder calls for the online adult industry to be more tightly regulated – calls that, in the still-early-internet days of the mid-nineties, had amounted to demands that it be regulated at all.

In the United States, Congress began cooking up proposals to prevent minors from accessing online porn. While these proposals enjoyed broad bipartisan support (within the legislature, at least), what they didn’t get was much support from the courts.

Early attempts to impose things like age verification requirements were slapped down by the courts, most notably in cases like Reno v. ACLU, decided in 1997. In Reno, the Supreme Court held that certain provisions of the Communications Decency Act of 1996 (“CDA”) violated the First Amendment. Specifically, the court found that the CDA’s “indecent transmission” and “patently offensive display” provisions trod upon the freedom of speech protected by the First Amendment.

What changed in 2025, as the Supreme Court again considered an age verification proposal, this time a state law passed in Texas (“HB 1181”), was in part the continued forward march of technology. But more crucially, what changed was the court’s disposition as to which “standard of review” ought to be applied.

In previous cases involving online age verification proposals, the court had applied “strict scrutiny,” a high bar that requires the government to show its actions (and laws) are “narrowly tailored” to further a “compelling government interest” and are the “least restrictive means” of furthering that interest.

In the case Free Speech Coalition v. Paxton, which the Supreme Court decided in June, the district court had applied strict scrutiny and found that HB 1181 failed to satisfy the standard. When the case reached the Supreme Court, however, the majority decided the lower court had erred in applying strict scrutiny and that the correct standard was “intermediate scrutiny,” which sets the bar much lower for the government.

Writing for the majority, Justice Clarence Thomas asserted that HB 1181 has “only an incidental effect on protected speech.”

“The First Amendment leaves undisturbed States’ traditional power to prevent minors from accessing speech that is obscene from their perspective,” Thomas wrote. “That power includes the power to require proof of age before an individual can access such speech. It follows that no person – adult or child – has a First Amendment right to access such speech without first submitting proof of age.”

Since the law “simply requires adults to verify their age before they can access speech that is obscene to children,” Thomas found that HB 1181 “is therefore subject only to intermediate scrutiny, which it readily survives.”

The three justices who dissented from the majority’s position didn’t see things quite the same way, naturally. In her dissent, Justice Elena Kagan criticized the majority’s holding as “confused” and highlighted the ways in which it departed from the court’s previous rulings in similar cases.

“Cases raising that question have reached this Court on no fewer than four prior occasions – and we have given the same answer, consistent with general free speech principles, each and every time,” Kagan observed. “Under those principles, we apply strict scrutiny, a highly rigorous but not fatal form of constitutional review, to laws regulating protected speech based on its content. And laws like H. B. 1181 fit that description: They impede adults from viewing a class of speech protected for them (even though not for children) and defined by its content. So, when we have confronted those laws before, we have always asked the strict scrutiny question: Is the law the least restrictive means of achieving a compelling state interest? There is no reason to change course.”

Whether there was reason to change course or not, the course has surely now been changed. Make no mistake, laws like HB 1181 are here to stay – and they will be followed by other measures designed to restrict access to sexually explicit materials online, as well as by regulation that goes much further and sweeps in an even broader range of controversial content.

The old cliché about the “canary in the coal mine” has often been applied to pornography in the context of free speech discussions. Even those who don’t like or approve of porn have often warned that crackdowns on sexually explicit expression can presage attempts at regulating other forms of speech.

If indeed those of us who work in the adult industry are part of a sentinel species, the warning to our peers in the broader world of entertainment and self-expression could not be more clear, as we look out to 2026 and beyond: Here in the world of online porn canaries, we’re choking on this new regulatory push – and most likely, some of you other birds are going to be feeling short of breath too, soon enough.


Commentary: Age Verification Trounced Free Speech in 2025

Here’s a commentary on age verification from Michael McGrady of AVN.


Congressional Push to Amend – or Simply End – Section 230 Safe Harbor Continues by Morley Safeword


For years now, legislators at both the state and federal level have been calling for reform of the “safe harbor” provision of Section 230 of the Communications Decency Act of 1996, which has long protected providers (and to an extent, users) of “interactive computer services” from liability stemming from the actions of third parties.

There are several proposals floating around the U.S. House of Representatives and the U.S. Senate currently, some of which are far broader than others in terms of their impact on Section 230 safe harbor and the entities that rely on it. The most extreme of these proposals is one that would simply eliminate Section 230 altogether after December 31, 2026.

The need for Section 230 reform, according to the legislators pushing for it, is rooted in the belief that changes and advances in communications technology have outpaced the law – and have turned Section 230 into too large a shield, in effect, for the technology companies it protects.

“Changes in technology have created new opportunities for criminals to harass, exploit, intimidate and harm American children,” said Senator Chuck Grassley (R-Iowa) in a statement about the Section 230 reform bills he sponsors or supports. “These horrific crimes – often committed by violent online groups who take advantage of our nation’s outdated laws – have gone unchecked for far too long.”

Senator Dick Durbin (D-Ill.) has joined Grassley in his effort to amend Section 230—and echoed Grassley’s sentiments in the same statement.

“Because of modern technology, child predators from anywhere in the world can target American kids online,” Durbin said. “As technology has evolved, so have online child exploiters. Today, offenders are engaging in sadistic online exploitation and coercing kids to take their own lives. Big Tech continues to fail our most vulnerable because they refuse to incorporate safety-by-design measures into their platforms or make meaningful efforts to detect the increasingly violent and depraved sexual exploitation of children on their services.”

The most extreme Section 230 reform idea being bandied about in Congress right now is the “Sunset To Reform Section 230 Act,” a very short bill that would simply append the following text to the law: “(g) Sunset.—This section shall have no force or effect after December 31, 2026.” The effect of this Act, should it pass, would be a complete repeal of Section 230, as opposed to a reform of the law.

While it’s perfectly understandable for people to want to do more to protect children who use the internet and other communications technologies and platforms, eliminating Section 230 would have far-reaching implications, some of which I get the feeling Congress has not fully considered.

Publication of user-generated content (UGC) is not limited to the likes of adult tube sites or major social media platforms. It’s one thing to approach Section 230 reform ‘surgically,’ by limiting the scope of its protections or by requiring more of the largest and best-funded platforms when it comes to policing the content their users upload. It’s another thing entirely to repeal Section 230 outright, which would create a flood of lawsuits potentially directed at any site or platform that enables users to publish content.

It’s not hard to imagine the chaos that could ensue, even for legislators themselves. If a representative or senator has a website of their own that allows readers and users to post comments, does the legislator in question want to face liability for anything untoward or illicit those users might post? This is the sort of hypothetical I’m not sure the likes of Grassley and Durbin have fully taken on board.

Reasonable people can disagree on whether the scope of Section 230 immunity, particularly as it has been interpreted by the courts, is too broad. But when it comes to reforming the safe harbor, outright elimination of Section 230 would create far more problems than it would solve.


French Regulator Gets Its Way as Targeted Adult Sites Add Age Verification


PARIS — There’s a particular hush that falls right before the hammer drops. Five high-traffic adult websites based outside of France have now put age-verification systems in place under the country’s Security and Regulation of the Digital Space (SREN) law, after receiving pointed warnings from media regulator Arcom. It’s one of those moments where the room goes quiet and everyone waits to see who blinks first.

Back in August, Arcom sent enforcement notices to xHamster, Xvideos, XNXX, xHamsterLive, and TNAFlix, giving them a tight three-week window to comply or brace for delisting and blocking proceedings. Not exactly a friendly nudge—more like a stopwatch set on the table.

According to the regulator, all five sites now have age-verification solutions in place, and for the moment, that’s been enough to halt further action. No public victory laps, no dramatic announcements—just a sense that compliance, at least for now, has won the day.

That hasn’t stopped the arguments, though. From the start, there’s been real tension over whether France even has the authority to regulate companies based in other EU member states, and how that authority would work in practice. Arcom asked media regulators in Cyprus and the Czech Republic to help enforce its rules against the warned sites, but those agencies declined, saying they simply don’t have the legal tools to enforce French age-verification law within their own borders.

Then came a shift in September. In a case involving WebGroup Czech Republic, which operates XVideos.com, and NKL Associates, which operates XNXX.com, an advocate general of the European Union’s Court of Justice advised that France may, in fact, require pornographic websites based in other EU states to implement age verification in line with French law. It wasn’t a ruling—more like a legal compass—but it pointed in a very clear direction.

The opinion isn’t binding, but if the EU Court of Justice follows it, the ripple effects could be enormous. It would set precedent for other member states wrestling with the same jurisdiction questions, especially as similar litigation plays out in Germany over whether national laws or the EU’s Digital Services Act ultimately take precedence. This is the slow, grinding part of policymaking—courts, counsels, and contradictions, all trying to decide who gets the final word.

And this likely isn’t the end of it. Arcom has made clear that its next move will be to widen enforcement to include smaller adult sites. The message feels unmistakable now: this isn’t a one-off crackdown—it’s a line being drawn, and the rest of the industry is standing just behind it, watching to see how hard it holds.


Florida AG Drops Age-Verification Case Against Segpay


TALLAHASSEE, Fla. — Sometimes legal battles don’t end with a bang, but with a quiet agreement and a collective exhale. On Monday, the Florida attorney general’s office agreed to drop its claims against payment processor Segpay in a lawsuit tied to alleged noncompliance with the state’s age-verification law.

Back in September, Florida Attorney General James Uthmeier brought lawsuits against both Segpay and Aylo in the 12th Judicial Circuit Court of Florida. The accusations centered on alleged violations of HB3, the state’s age-verification statute — a law with real teeth, carrying potential fines of up to $50,000 for each individual violation. The stakes were never abstract; they were painfully concrete.

Then, on Monday, the temperature shifted. The Office of the Attorney General and Segpay jointly filed a stipulation of voluntary dismissal, effectively closing that chapter of the case. No dramatic courtroom showdown. Just a line drawn under it.

Attorney Corey Silverstein, who represented Segpay alongside fellow industry attorney Lawrence Walters, said he and his clients were relieved by how the matter ultimately played out. Anyone who’s spent time in regulatory trench warfare knows that resolution — especially a fair one — can feel like a small miracle.

“We are very appreciative that the Florida AG’s office worked with us to get a clear understanding of the real facts involved here,” Silverstein said.

The lawsuit against Aylo, however, is still moving forward, a reminder that while one door has quietly closed, others remain very much open.


The Adult Industry Has Been Through Worse. We Will Survive by Morley Safeword


These are challenging times for the adult entertainment industry, no doubt. Around the globe, governments are passing increasingly strict age-verification requirements and other, more censorious measures putatively designed to “protect minors” – measures that legislators and anti-porn crusaders also hope will reduce porn consumption among adults.

If all this is enough to make some folks in the adult industry want to wave the white flag, close up shop, and find something else to do for a living, I can certainly understand why. As the name of this site reflects, people in the industry rightfully feel like they’re under siege, waging a battle against forces that can enlist far more wealth and power as weapons than our side can.

I’ve worked in the adult industry for nearly 30 years (and enjoyed its products even longer), so take it from me: none of this is new. Some of the battlefields are new, and they are constantly evolving, but the war itself goes back longer than many of us can remember.

In the United States, obscenity laws and other statutes designed to maintain public morals and prevent the corruption of society date back to colonial times. In other words, long before there was an adult entertainment industry against which to wage war, the government was taking aim at sexual expression and conduct.

Fast forward to the 19th Century and there was the Comstock Act of 1873, which—among many other things—made it a criminal offense to send obscene materials through the U.S. mail. The Act also made it illegal to use the mail to tell someone where such materials might be found, or how to make them – provisions which were later struck down by the courts as overly broad, thankfully.

To give you an idea of just how much more restrictive the obscenity laws were in the early 20th Century than they are today, you need only look as far as the name of a seminal case from 1933 – United States v. One Book Called Ulysses. Frankly, the contents of James Joyce’s Ulysses wouldn’t even be enough to raise one-half of a would-be censor’s eyebrow these days, yet it was considered positively scandalous in its day.

From an American adult industry perspective, the War on Porn arguably reached its zenith in the 1980s and 1990s, under Presidents Ronald Reagan and George H.W. Bush. According to the Bureau of Justice Statistics, in 1990 alone there were 74 federal obscenity prosecutions targeting adult materials (as opposed to Child Sexual Abuse Materials, which are patently illegal and have no protection under the First Amendment). Contrast that figure with 2009, in which there were a total of six.

Despite the number of prosecutions at the start of the decade, the 1990s were a period of tremendous growth for the adult industry, driven in large part by the advent of the commercial internet and its relatively unregulated environment. What we’re seeing now is what governments might call a “correction” of that laissez-faire approach – and what those of us in the industry might call an overcorrection.

Yes, age-verification laws present a challenge. Like a lot of people in the adult industry, I don’t object to the idea of making people prove they’re adults before consuming porn; what I object to is the means by which we’re required to offer such proof – methods that not only compromise our privacy but potentially open us up to extortion, identity theft and other crimes. I’m also not convinced age verification, at least as currently executed, does much to prevent minors from being exposed to porn.

If you were to ask the people who have been prosecuted for obscenity over the movies they’ve made, books they’ve written, or magazines they’ve published, I think you’d find near unanimity: they would all rather pay a financial penalty than serve years in prison on top of being fined, as the likes of Paul Little (AKA “Max Hardcore”) have in the past.

My point here is not that those of us currently working in the adult industry should thank our lucky stars we avoided the crackdowns of the past, or that we should accept the current campaign against the adult industry without putting up a fight. My point is simply this: we’ve been under the gun for decades, and we’ve not only survived but expanded considerably as an industry along the way.

The bottom line, whether the anti-porn zealots like it or not, is many humans like sexual expression, whether one calls it “porn,” “erotica,” or “filth.” Neither the desire to consume the products we make nor the desire to make them is going away—and neither are we.


FTC Revisits ‘Click to Cancel’ Subscription Rules


There’s a familiar hum starting up again in Washington — that low, bureaucratic buzz that usually means a rule thought to be dead isn’t quite finished yet. The Federal Trade Commission has opened the door for public comment on a petition that would revive trade regulation rulemaking around negative option plans, following a federal court decision that knocked out the agency’s earlier “click-to-cancel” rule meant to simplify subscription cancellations.

Earlier this month, the FTC received and published a petition for rulemaking submitted by the Consumer Federation of America and the American Economic Liberties Project. The clock is now ticking, with the public comment period set to run through Jan. 2.

This isn’t the commission’s first time around this particular block. After announcing proposed changes back in March 2023, the FTC was flooded with feedback — more than 16,000 comments poured in from consumers, government agencies, advocacy groups, and trade associations. That kind of response tends to linger, even when the rules themselves get stalled.

Then came the judicial roadblock. In July, the U.S. Court of Appeals for the 8th Circuit vacated the FTC’s updated Negative Option Rule while further review was pending. Critics had argued that the agency overstepped its authority and cut corners procedurally, particularly by failing to issue a preliminary regulatory analysis. The court agreed enough to hit pause.

The irony is that the Negative Option Rule itself isn’t new or radical. It dates back to the 1970s, when it was designed to stop consumers from being quietly signed up for subscriptions they never agreed to. The proposed updates would have dramatically expanded its reach, applying to most negative option programs — from automatic renewals to free trials that roll into paid plans. For many website operators, that would’ve meant rethinking how sign-ups work, how cancellations happen, and how friction gets engineered into the process.

This new petition may be the first real sign that the “further review” ordered by the appeals court is officially underway. It opens the possibility that the FTC could come back with the same ideas — or close cousins — this time wrapped in tighter procedure and cleaner paperwork. Whether that leads to meaningful consumer protection or just another round of regulatory whiplash remains to be seen. But one thing feels clear: the click-to-cancel fight isn’t over. It just took a breath before getting back up.


New Federal Bills Target Repeal of Section 230


Something old — and foundational — is back on the chopping block in Washington. This week, members of Congress introduced two separate bills that would dismantle Section 230 of the Communications Decency Act, the legal backbone that protects interactive computer services — including adult platforms — from being held liable for user-generated content.

On Tuesday, Rep. Harriet Hageman of Wyoming introduced HR 6746, known as the Sunset to Reform Section 230 Act. The proposal doesn’t tinker or trim around the edges. It simply adds one stark sentence to the law: “This section shall have no force or effect after December 31, 2026.”

A day later, Sen. Lindsey Graham of South Carolina followed with his own bill, S 3546, which would also repeal Section 230 — this time, two years after enactment.

An Attempt to Gain Leverage

The delayed timelines aren’t accidental. Rather than calling for an immediate repeal, lawmakers appear to be using the threat itself as leverage — a ticking clock meant to force reluctant stakeholders to the negotiating table.

On the right, critics argue that platforms hide behind Section 230 while censoring conservative speech. Their goal is to limit platforms’ ability to moderate content as they see fit. On the left — and sometimes crossing party lines — lawmakers rail against “Big Tech” for profiting from illegal or harmful material, pushing for stricter moderation by making platforms legally responsible for what users post.

In a statement, Hageman warned that “outside interests” would try to block reform efforts.

“We must therefore find a way to force the issue through the reauthorization process,” she said.

Sen. Richard Blumenthal of Connecticut, one of several Democratic co-sponsors of Graham’s bill, echoed that sentiment. Supporting S 3546, he framed it as a pressure tactic designed to corner tech companies. The bill, he said, would “force Big Tech to the table with a bold demand: either negotiate sensible reforms now or lose your absolute immunity forever.”

Others backing the legislation — Graham included — spoke less about leverage and more about repeal as an end goal. Either way, the ripple effects would land hard on the adult industry.

Potential Consequences

Once Section 230 is opened up, even slightly, it becomes easy to imagine targeted carve-outs — the same way FOSTA/SESTA stripped protections from sites accused of “unlawfully promoting and facilitating” prostitution or sex trafficking.

Industry attorney Lawrence Walters didn’t mince words. “The modern adult industry is largely dependent on Section 230, which allows for operation of fan sites, cam sites and adult platforms,” he said. “If this bill is passed, or an adult industry carve-out is adopted, these business models are threatened. This frontal assault on Section 230 immunity should be a source of great concern to the adult industry and online freedom, generally.”

Those concerns aren’t new. Back in 2024, Free Speech Coalition Executive Director Alison Boden warned that altering or repealing Section 230 would unleash chaos.

“I think that it would cause a further crackdown on sexual content,” Boden said. “If there was a carve-out of Section 230 for ‘obscenity,’ the same way that FOSTA/SESTA carved out ‘human trafficking,’ that would have serious implications.”

And the odds of an adult-specific carve-out feel higher now than ever, given the broader political climate.

The Supreme Court has already signaled a shift, ruling in Free Speech Coalition v. Paxton that laws restricting access to adult content may be subject to a less rigorous standard of review. During his first term, President Trump attempted — unsuccessfully — to repeal Section 230 through an amendment to an unrelated bill. His FCC chair pick, Brendan Carr, has openly called for gutting Section 230 protections and previously helped author Project 2025’s “Mandate for Leadership,” which controversially asserted that pornography “has no claim to First Amendment protection.” Graham’s bill is also bipartisan, with co-sponsors that include influential Democrats like Dick Durbin and Amy Klobuchar.

A carve-out aimed at adult platforms would function as a de facto repeal of Section 230 for the industry. Any site hosting user-generated content would suddenly be exposed to legal liability — and a flood of civil lawsuits would almost certainly follow.

While the First Amendment protects legal speech, Section 230 has long acted as a shield against attempts to suppress that speech through litigation. Without it, unpopular expression — and adult content sits squarely in that category — becomes an easy target.

Industry attorney Corey Silverstein put it bluntly: losing Section 230 would be “catastrophic.”

“It would mean that internet service providers, search engines, and every interactive website could be left responsible for the actions of its users,” Silverstein said. “That is simply untenable, and these businesses would not be able to exist out of fear of being sued out of existence.”

And that’s the quiet reality behind the rhetoric. Strip away the politics, the soundbites, the threats meant to scare platforms into compliance — and what’s left is a question no one seems eager to answer: what happens to online speech when the shield disappears?


What Would Ethical Age Verification Online Actually Look Like?


Age-verification laws are spreading fast, and on paper they sound simple enough: if a website hosts explicit content — and sometimes even if it doesn’t — it has to check that visitors are over 18, usually by collecting personal data. Lawmakers say it’s about protecting kids. Full stop.

But scratch the surface and things get messy. Privacy experts keep waving red flags about what happens when sensitive personal data starts piling up on servers. And this year, several studies quietly dropped an uncomfortable truth: these laws don’t actually seem to stop minors from accessing porn at all.

So the uncomfortable question hangs in the air — is age verification, the way it’s currently done, ethical? And if not, what would ethical age-verification even look like? When experts were asked, their answers kept circling back to the same idea: device-level filters.

Current age-verification systems

Right now, most laws — from state-by-state mandates in the U.S. to the UK’s Online Safety Act — put the burden on platforms themselves. Websites are expected to install age checks and sort it out. And, honestly, it hasn’t gone well.

“Age gating, especially the current technology that is available, is ineffective at achieving the goals it seeks to achieve, and minors can circumvent it,” said Cody Venzke, senior policy counsel for the ACLU.

A study published in November showed what happens next. Once these laws go live, searches for VPNs shoot up. That’s usually a sign people are sidestepping location-based restrictions — and succeeding. Searches for porn sites also rise, suggesting people are hunting for platforms that simply don’t comply.

The ethics get even murkier. Mike Stabile, director of public policy at the Free Speech Coalition, didn’t mince words. “In practice, they’ve so far functioned as a form of censorship.”

Fear plays a huge role here. When people worry their IDs might be stored, processed, or leaked — and we’ve already seen IDs exposed, like during October’s Discord hack — they hesitate. Adults back away from legal content. That same November study argued that the cost to adults’ First Amendment rights doesn’t outweigh the limited benefits for minors.

“Unfortunately, we’ve heard many of the advocates behind these laws say that this chilling effect is, in fact, good. They don’t want adults accessing porn,” Stabile said.

And for some lawmakers, that’s not a bug — it’s the feature. Project 2025, the blueprint tied to President Trump’s second term, openly calls for banning porn altogether and imprisoning creators. One of its co-writers, Russell Vought, was reportedly caught on a secret recording in 2024 calling age-verification laws a porn ban through the “back door.”

But there is another path. And it doesn’t start with websites at all.

An ethical age assurance method?

“Storing people’s actual birth dates on company servers is probably not a good way to approach this, especially for minors… you can’t change your birth date if it gets leaked,” said Robbie Torney, senior director of AI programs at Common Sense Media.

“But there are approaches that are privacy-preserving and are already established in the industry that could go a long way towards making it safer for kids to interact across a wide range of digital services.”

It also helps to separate two terms that often get lumped together. Age verification usually means confirming an exact age — showing ID, scanning documents, that sort of thing. Age assurance, Torney explained, is broader. It’s about determining whether someone falls into an age range without demanding precise details.

One real-world example is California’s AB 1043, set to take effect in 2027.

Under that law, operating systems — the software running phones, tablets, and computers — will ask for an age or birthday during setup. The device then creates an age-bracket signal, not an exact age, and sends that signal to apps. If someone’s underage, access is blocked. Simple. And notably, it all happens at the device level.
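To make that flow concrete, here is a minimal sketch of the “bracket, not birthday” idea. AB 1043 doesn’t prescribe the code below, and none of these names (AgeBracket, bracketFromBirthYear, mayShowAdultContent) belong to any real OS API; they are hypothetical, meant only to illustrate how a device could keep the raw birth date to itself and expose nothing but a coarse signal:

```typescript
// Hypothetical sketch of a device-level age-assurance flow.
// These names are illustrative only; they are not AB 1043's
// specification or any shipping operating system's API.

type AgeBracket = "under13" | "13to15" | "16to17" | "18plus";

// At device setup, the OS asks for a birth year once, then stores
// only the derived bracket. (Year-only math is an approximation;
// a real implementation would use the full birth date.)
function bracketFromBirthYear(birthYear: number, now: Date = new Date()): AgeBracket {
  const age = now.getFullYear() - birthYear;
  if (age < 13) return "under13";
  if (age < 16) return "13to15";
  if (age < 18) return "16to17";
  return "18plus";
}

// What an app or website would receive: the bracket signal alone.
function mayShowAdultContent(signal: AgeBracket): boolean {
  return signal === "18plus";
}

const signal = bracketFromBirthYear(2012);
console.log(signal, mayShowAdultContent(signal)); // the bracket, and a yes/no gate
```

The design choice that matters sits in the middle: the app or site on the receiving end never sees a birthdate, an ID scan, or a face, only a coarse bracket it can act on but cannot repurpose.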

That approach has been recommended for years by free-speech advocates and adult platforms alike.

“Any solution should be easy to use, privacy-preserving, and consumer-friendly. In most cases, that means the verification is going to happen once, on the device,” Stabile said.

Sarah Gardner, founder and CEO of the child-safety nonprofit Heat Initiative, agreed. “Device-level verification is the best way to do age verification because you’re limiting the amount of data that you give to the apps. And many of the devices already know the age of the users,” she said.

Apple already does some of this. Its Communication Safety feature warns children when they send or receive images containing nudity through iMessage and gives them ways to get help. The company recently expanded protections for teens aged 13–17, including broader web content filters.

So yes, the technology exists. And in 2027, at least in California, device makers will have to use it.

But there’s a catch. AB 1043 doesn’t apply to websites — including adult sites. It only covers devices and app stores.

“Frankly, we want AB 1043 to apply to adult sites,” Stabile said. “We want a signal that tells us when someone is a minor. It’s the easiest, most effective way to block minors and doesn’t force adults to submit to biometrics every time they visit a website.”

Last month, Pornhub sent letters to Apple, Google, and Microsoft urging them to enable device-level age assurance for web platforms. Those letters referenced AB 1043 directly.

Venzke said the ACLU is watching these discussions closely, especially when it comes to privacy implications.

Will device-level age assurance catch on?

Whether tech giants will embrace the idea is still an open question. Microsoft declined to comment. Apple pointed to recent updates around under-18 accounts and a child-safety white paper stating, “The right place to address the dangers of age-restricted content online is the limited set of websites and apps that host that kind of content.”

Google struck a similar tone, saying it’s “committed to protecting kids online,” and highlighted new age-assurance tools like its Credential Manager API. At the same time, it made clear that certain high-risk services will always need to invest in their own compliance tools.

Torney thinks the future probably isn’t either-or. A layered system, where both platforms and operating systems share responsibility, may be unavoidable. “This has been a little bit like hot potato,” he said.

No system will ever be perfect. That part’s worth admitting out loud. “But if you’re operating from a vantage point of wanting to reduce harm, to increase appropriateness, and to increase youth wellbeing,” Torney said, “a more robust age assurance system is going to go much farther to keep the majority of teens safe.”

And maybe that’s the real shift here — moving away from blunt tools that scare adults and don’t stop kids, toward something quieter, smarter, and a little more honest about how people actually use the internet.
