Legal Attacks

Age Verification Push Sends Millions of Britons to Unregulated Porn Sites, Charity Says


Something strange has been happening since age verification checks quietly became part of everyday internet life in the UK. You’d think stricter rules would close doors. Instead, a lot of people seem to be wandering down darker hallways.

New research from the Lucy Faithfull Foundation suggests that nearly 45 per cent of UK porn users have visited websites without age verification checks since the rules came into force last summer under the Online Safety Act.

The poll, which surveyed more than 3,700 people across Britain, found that 39 per cent of those who visited these sites ended up watching content that made them feel uncomfortable. Even more telling, 40 per cent said what they saw was enough to put them off returning altogether.

The charity, which focuses on preventing online child sexual abuse, has warned that these unregulated spaces can quietly increase the risk of people stumbling into harmful — and illegal — material.

Vicky Young, who leads the Stop It Now UK and Ireland anonymous helpline, said these sites can become a dangerous stepping stone toward indecent images of children.

“We work with people who have looked at indecent images of children to try and address that behaviour, to help support them to change their behaviour,” she said.

“One of the things that people say to us frequently is that they started looking at legal adult pornography and that their behaviour escalated. In part they were spending maybe longer online, but also the sort of content that they were looking at became more extreme and often started getting younger, and that’s when they then crossed into illegal behaviour, so looking at indecent images of children.”

That pattern — the slow creep from curiosity to something far more serious — is what worries the charity most.

“Because of that pathway and that coming out constantly in conversations we have with people, it concerns us that if people are accessing sites where there is this riskier content, that actually they are putting themselves at a higher chance of accessing indecent images,” she said.

“Sometimes that might not be intentional in the beginning, but what people tell us is that actually, if they come across those images as part of their other pornography, that then sparks curiosity. There’s something that kind of adds to the excitement around the risk, and they don’t necessarily stop at one image. They actually then start looking for more images.”

The survey also revealed a quieter anxiety bubbling under the surface. Nearly 30 per cent of respondents said they were worried about how much pornography they consume. That concern was highest among young men aged 18 to 24, with more than half admitting it’s something that troubles them — a group the Foundation describes as particularly vulnerable.

At the same time, the rules appear to be forcing a moment of self-reflection for many. Almost 47 per cent said they’ve reduced how much pornography they watch since age checks were introduced, while 55 per cent said the changes made them stop and think about their habits.

Recent data backs that up. Enforcement of the Online Safety Act’s “highly effective age assurance” requirements led to an immediate drop in traffic to major porn sites from late July. But as one door closed, another cracked open: VPN use surged as people looked for ways around the new barriers.

The UK’s most visited adult site recorded a drop of around 1.5 million viewers year on year, falling from 11.3 million in August 2024 to 9.8 million this August.

Meanwhile, VPN usage more than doubled after the rules came in, jumping from roughly 650,000 daily users to a peak of over 1.4 million in mid-August 2025. Although that number has since dipped, it still hovered around 900,000 by November.

In other words, fewer people are walking through the front door — but plenty are still trying the side entrance.

Dr Alexandra Bailey, head of psychology at the Foundation and an associate professor at the University of Roehampton, said the intention behind age verification is sound, but the consequences are more complicated.

“Age verification is vital to protect children, and we fully support it,” she said. “But we also need to recognise that some adults are choosing riskier sites to avoid age checks. These sites can expose people to harmful material, including illegal content depicting child sexual abuse. Even if you’re not looking for it, you could encounter it — and that can have serious life-changing consequences.”

She added that the rules have created a pause many people probably needed, but not everyone is responding in a safe way.

“Age verification is also prompting adults to reflect on their online behaviour, which can be a good thing for people worried about their porn use. But we need to address the risks for those who are turning to sites that avoid the new regulations.

“Every day, our advisors speak to people whose pornography use has spiralled into something much more harmful. We know embarrassment can stop people from reaching out, but confidential help is available. If you’re worried about your own behaviour or someone else’s, contact Stop It Now before it’s too late.”

Sometimes the most dangerous part isn’t the rule itself — it’s what people do when they decide to dodge it.


When HBO’s Industry Meets the Age Verification Reckoning


WHEN THEY DECIDED to take on age verification in their latest season, Industry cocreators Konrad Kay and Mickey Down didn’t expect to wander straight into a political minefield. It probably felt, at first, like one more sharp storyline—edgy, timely, a little dangerous in the way good TV often is. But sometimes a writers’ room accidentally opens a door to something bigger. And once it’s open, there’s no quietly closing it again.

“It was in the ether of British politics, but it wasn’t front and center when we started writing the scripts or shooting it, and then it really flared up as a kind of front-page-of-BBC topic of conversation,” Kay says.

Season 4 of HBO’s sexy, darkly funny financial drama—premiering Sunday—pushes Industry even further beyond the blood-slick trading floors that first defined it. This time, the story spills into tech, porn, age verification, and the uncomfortable politics sitting between them. Early in the season, tensions rise inside Tender, a fintech firm fresh off its IPO, as executives debate whether to keep processing payments for Siren, an adult platform in the OnlyFans mold. Siren—and other porn and gambling businesses—account for a sizable slice of Tender’s revenue. But looming threats of new age-verification laws and a rising tide of anti-porn rhetoric from the UK’s Labour Party have some leaders wondering if reputational cleanup might be more profitable than cashing controversial checks. It’s boardroom fear dressed up as moral clarity, the kind that tends to surface right before regulators do.

In the real world, the UK’s Online Safety Act—requiring age verification to access porn and other restricted content—didn’t take effect until July 2025, long after Kay and Down had mapped out this season’s arc. Still, the parallels are hard to ignore. Platforms like Pornhub saw UK traffic plunge by nearly 80 percent after the rules kicked in, and similar pressures are mounting in the U.S., where roughly half of all states now enforce some form of age-verification law. Even Capitol Hill is circling the issue: in December alone, lawmakers considered 19 bills aimed at protecting minors online. Critics, meanwhile, argue that several of those proposals stray into unconstitutional territory. It’s messy, unresolved, and very much still unfolding.

“It’s kind of shown how fragile free speech absolutism is,” says Down, pointing to the “wildly different” reactions the issue has provoked—from puritan instincts cropping up in liberal circles to a more blunt, censor-first “shut everything down” posture on the conservative side. And that tension, hanging in the air, feels like the real cliffhanger. Not who wins the argument—but what gets lost while everyone’s busy shouting.


Utah Senator Floats Porn Tax to Pay for Age-Verification Enforcement


SALT LAKE CITY—There are some ideas that arrive quietly and others that walk in like they own the place. This one does the latter. At the opening of Utah’s new legislative session, a Republican lawmaker dropped a bill that would tax online adult content, funneling the money toward age-verification enforcement and teen mental health programs.

Sen. Calvin R. Musselman, who represents the small town of West Haven, is the driving force behind Senate Bill (SB) 73. The proposal introduces what it calls a “material harmful to minors tax,” set at seven percent of the “gross receipts” from sales of content classified under that label.

SB 73 has been formally introduced but hasn’t yet landed in a committee. Even so, the odds of it clearing the legislature are widely considered high.

The bill defines “gross receipts” as “the total amount of consideration received for a transaction […] without deduction for the cost of materials, labor, service, or other expenses.” In other words, it’s the top line, not the leftovers.

And the reach is… expansive. The tax would apply to “the gross receipts of all sales, distributions, memberships, subscriptions, performances, and content, amounting to material harmful to minors that is: (a) produced in this state; (b) sold in this state; (c) filmed in this state; (d) generated in this state; or (e) otherwise based in this state.” That’s a wide net, and it’s not subtle about it.

Because of that scope, the tax wouldn’t just hit one corner of the industry. Producers, creators, platforms—anyone touching qualifying content—would likely feel it. And it wouldn’t exist in a vacuum. The levy would stack on top of existing obligations, including Utah’s digital sales tax and other state fees.

Revenue from the tax would flow into a newly created government account, earmarked for teen mental health treatment through the state Department of Health and Human Services. It’s worth noting that Utah is among the states that formally frame pornography consumption as a public health crisis, a position tied to the still-contested concept of “pornography addiction.”

The bill doesn’t stop at taxation. It also introduces a $500 annual recurring fee, paid into accounts overseen by the Division of Consumer Protection. This so-called “notification fee” would apply to companies producing content deemed “harmful to minors” and is tied directly to age-verification compliance.

Those funds would be used by the Division to monitor compliance in a system modeled after the United Kingdom’s Ofcom framework. Companies would need to file the notification annually. Miss that step, and the penalty jumps to $1,000 per day until the paperwork—and compliance—are in order.
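
To make the arithmetic concrete, here is a minimal sketch of how those stacked obligations would add up for a covered business. The seven percent rate, the $500 notification fee, and the $1,000-per-day late penalty come from the bill as described above; the gross-receipts figure and the Python helper itself are purely illustrative, and existing levies such as Utah’s digital sales tax are not modeled.

# Illustrative sketch only (not taken from the bill's text): a rough annual cost
# under Utah SB 73 as described above. The 7% rate, the $500 notification fee,
# and the $1,000-per-day penalty come from the reporting; the example
# gross-receipts figure is hypothetical, and existing taxes are ignored.

HARMFUL_MATERIAL_TAX_RATE = 0.07  # seven percent of gross receipts, no deductions
NOTIFICATION_FEE = 500            # annual fee paid to the Division of Consumer Protection
LATE_PENALTY_PER_DAY = 1_000      # accrues for each day the annual notification is missed


def sb73_annual_cost(gross_receipts: float, days_late: int = 0) -> float:
    """Estimate a covered business's annual burden under SB 73 as described above.

    gross_receipts: total consideration received, with no deduction for costs.
    days_late: number of days the annual notification went unfiled, if any.
    """
    tax = gross_receipts * HARMFUL_MATERIAL_TAX_RATE
    penalty = days_late * LATE_PENALTY_PER_DAY
    return tax + NOTIFICATION_FEE + penalty


if __name__ == "__main__":
    # Hypothetical example: $1,000,000 in gross receipts, notification filed 10 days late.
    print(f"${sb73_annual_cost(1_000_000, days_late=10):,.2f}")  # prints $80,500.00

The design choice that matters most is taxing the top line rather than profit: the seven percent applies whether or not the business actually made any money.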

Utah, of course, has already been down this road. It was one of the first states to pass a statewide age-verification law structured as a “bounty law,” with enforcement left to private lawsuits rather than to state officials. That approach famously led Aylo, the owner of Pornhub, to block Utah IP addresses, just as it has done in other states with similar laws.

Utah wouldn’t be alone in adding a porn-specific tax to the mix. Alabama already has one on the books, imposing a ten percent levy on top of existing digital goods and sales taxes.

And the idea is still spreading. In Pennsylvania, a bipartisan pair of state senators recently announced plans to propose a measure that would tax online pornography subscriptions in that state’s digital marketplace.


2025: The Year Tighter Regulation Came to Town for the Online Adult Industry by Morley Safeword


When I got my start in the online sector of the adult entertainment business, back in the mid-nineties, there was no video streaming. Individual photos often took web users several minutes to download. And you hardly heard a peep from anyone suggesting that the fledgling industry needed to be reined in.

To be fair, many people were only vaguely aware of what was available on the internet at the time, let alone worried about what their kids might be looking for on there – and frankly, the web was so slow, using it exceeded the patience of a lot of kids, anyway.

Oh, how things have changed.

What evolved fastest, of course, was the technology underpinning the internet. As high-speed connectivity became the norm rather than the exception and video streaming capabilities increased year over year, online porn went from something enjoyed by a small subset of early adopters to a massive, multibillion dollar industry. Along with those changes in technology came ever louder calls for the online adult industry to be more tightly regulated – or regulated at all, in the still-early-internet days of the mid-nineties.

In the United States, Congress began cooking up proposals to prevent minors from accessing online porn. While these proposals enjoyed broad bipartisan support (within the legislature, at least), what they didn’t get was much support from the courts.

Early attempts to impose things like age verification requirements were slapped down by the courts, most notably in cases like Reno v. ACLU, decided in 1997. In Reno, the Supreme Court held that certain provisions of the Communications Decency Act of 1996 (“CDA”) violated the First Amendment. Specifically, the court found that the CDA’s “indecent transmission” and “patently offensive display” provisions trod upon the freedom of speech protected by the First Amendment.

What changed in 2025, as the Supreme Court again considered an age verification proposal, this time a state law passed in Texas (“HB 1181”), was in part the continued forward march of technology. But more crucially, what changed was the court’s disposition as to which “standard of review” ought to be applied.

In previous cases involving online age verification proposals, the court has applied “strict scrutiny,” a high bar that requires the government to show its actions (and laws) are “narrowly tailored” to further a “compelling government interest” and are the “least restrictive means” to further that interest.

In the case Free Speech Coalition v. Paxton, which the Supreme Court decided in June, the district court had applied strict scrutiny and found that HB 1181 failed to satisfy the standard. When the case reached the Supreme Court, however, the majority decided the lower court had erred in applying strict scrutiny and that the correct standard to apply was “intermediate scrutiny,” which sets the bar much lower for the government.

Writing for the majority, Justice Clarence Thomas asserted that HB 1181 has “only an incidental effect on protected speech.”

“The First Amendment leaves undisturbed States’ traditional power to prevent minors from accessing speech that is obscene from their perspective,” Thomas wrote. “That power includes the power to require proof of age before an individual can access such speech. It follows that no person – adult or child – has a First Amendment right to access such speech without first submitting proof of age.”

Since the law “simply requires adults to verify their age before they can access speech that is obscene to children,” Thomas found that HB 1181 “is therefore subject only to intermediate scrutiny, which it readily survives.”

The three justices who dissented from the majority’s position didn’t see things quite the same way, naturally. In her dissent, Justice Elena Kagan criticized the majority’s holding as “confused” and highlighted the ways in which it departed from the court’s previous rulings in similar cases.

“Cases raising that question have reached this Court on no fewer than four prior occasions – and we have given the same answer, consistent with general free speech principles, each and every time,” Kagan observed. “Under those principles, we apply strict scrutiny, a highly rigorous but not fatal form of constitutional review, to laws regulating protected speech based on its content. And laws like H. B. 1181 fit that description: They impede adults from viewing a class of speech protected for them (even though not for children) and defined by its content. So, when we have confronted those laws before, we have always asked the strict scrutiny question: Is the law the least restrictive means of achieving a compelling state interest? There is no reason to change course.”

Whether or not there was reason to change course, the course has now been changed. Make no mistake, laws like HB 1181 are here to stay – and they will be followed by other measures designed to restrict access to sexually explicit materials online, as well as regulation that goes much further and sweeps in an even broader range of controversial content.

The old cliché about the “canary in the coal mine” has often been applied to pornography in the context of free speech discussions. Even those who don’t like or approve of porn have often warned that crackdowns on sexually explicit expression can presage attempts at regulating other forms of speech.

If indeed those of us who work in the adult industry are part of a sentinel species, the warning to our peers in the broader world of entertainment and self-expression could not be more clear, as we look out to 2026 and beyond: Here in the world of online porn canaries, we’re choking on this new regulatory push – and most likely, some of you other birds are going to be feeling short of breath too, soon enough.


Commentary: Age Verification Trounced Free Speech in 2025

Here’s a commentary on age verification from Michael McGrady of AVN.


Congressional Push to Amend – or Simply End – Section 230 Safe Harbor Continues by Morley Safeword


For years now, legislators at both the state and federal level have been calling for reform of the “safe harbor” provisions of Section 230 of the Communications Decency Act of 1996, which have long protected providers (and, to an extent, users) of “interactive computer services” from liability for content posted by third parties.

There are several proposals floating around the U.S. House of Representatives and the U.S. Senate currently, some of which are far broader than others in terms of their impact on Section 230 safe harbor and the entities that rely on it. The most extreme of these proposals is one that would simply eliminate Section 230 altogether after December 31, 2026.

The need for reforms to Section 230, according to the legislators pushing for them, is rooted in the belief that changes and advances in communications technology have outpaced the law – and have turned Section 230, in effect, into too large a shield for the technology companies it protects.

“Changes in technology have created new opportunities for criminals to harass, exploit, intimidate and harm American children,” said Senator Chuck Grassley (R-Iowa) in a statement about the Section 230 reform bills he sponsors or supports. “These horrific crimes – often committed by violent online groups who take advantage of our nation’s outdated laws – have gone unchecked for far too long.”

Senator Dick Durbin (D-Ill.) has joined Grassley in his effort to amend Section 230—and echoed Grassley’s sentiments in the same statement.

“Because of modern technology, child predators from anywhere in the world can target American kids online,” Durbin said. “As technology has evolved, so have online child exploiters. Today, offenders are engaging in sadistic online exploitation and coercing kids to take their own lives. Big Tech continues to fail our most vulnerable because they refuse to incorporate safety-by-design measures into their platforms or make meaningful efforts to detect the increasingly violent and depraved sexual exploitation of children on their services.”

That most extreme proposal takes the form of the “Sunset To Reform Section 230 Act,” a very short bill that would simply append the following text to the law: “(g) Sunset.—This section shall have no force or effect after December 31, 2026.” The effect, should it pass, would be a complete repeal of Section 230 rather than a reform of the law.

While it’s perfectly understandable for people to want to do more to protect children who use the internet and other communications technologies and platforms, eliminating Section 230 would have far-reaching implications, some of which I get the feeling Congress has not fully considered.

Publication of user-generated content (UGC) is not limited to the likes of adult tube sites or major social media platforms. It’s one thing to approach Section 230 reform ‘surgically,’ by limiting the scope of its protections or requiring more of the largest and best-funded platforms when it comes to policing the content their users upload. Repealing Section 230 entirely is another matter: it would create a flood of lawsuits, potentially directed at any site or platform that enables users to publish content.

It’s not hard to imagine the chaos that could ensue, even for legislators themselves. If a representative or senator has a website of their own that allows readers and users to post comments, does the legislator in question want to face liability for anything untoward or illicit those users might post? This is the sort of hypothetical I’m not sure the likes of Grassley and Durbin have fully taken on board.

Reasonable people can disagree on whether the scope of Section 230 immunity, particularly as it has been interpreted by the courts, is too broad. But when it comes to reforming the safe harbor, outright elimination of Section 230 would create far more problems than it would solve.


French Regulator Gets Its Way as Targeted Adult Sites Add Age Verification


PARIS — There’s a particular hush that falls right before the hammer drops. Five high-traffic adult websites based outside of France have now put age-verification systems in place under the country’s Security and Regulation of the Digital Space (SREN) law, after receiving pointed warnings from media regulator Arcom. It’s one of those moments where the room goes quiet and everyone waits to see who blinks first.

Back in August, Arcom sent enforcement notices to xHamster, Xvideos, XNXX, xHamsterLive, and TNAFlix, giving them a tight three-week window to comply or brace for delisting and blocking proceedings. Not exactly a friendly nudge—more like a stopwatch set on the table.

According to the regulator, all five sites now have age-verification solutions in place, and for the moment, that’s been enough to halt further action. No public victory laps, no dramatic announcements—just a sense that compliance, at least for now, has won the day.

That hasn’t stopped the arguments, though. From the start, there’s been real tension over whether France even has the authority to regulate companies based in other EU member states, and how that authority would work in practice. Arcom asked media regulators in Cyprus and the Czech Republic to help enforce its rules against the warned sites, but those agencies declined, saying they simply don’t have the legal tools to enforce French age-verification law within their own borders.

Then came a shift in September. In a case involving WebGroup Czech Republic, which operates XVideos.com, and NKL Associates, which operates XNXX.com, an advocate general of the European Union’s Court of Justice advised that France may, in fact, require pornographic websites based in other EU states to implement age verification in line with French law. It wasn’t a ruling—more like a legal compass—but it pointed in a very clear direction.

The opinion isn’t binding, but if the EU Court of Justice follows it, the ripple effects could be enormous. It would set precedent for other member states wrestling with the same jurisdiction questions, especially as similar litigation plays out in Germany over whether national laws or the EU’s Digital Services Act ultimately take precedence. This is the slow, grinding part of policymaking—courts, counsels, and contradictions, all trying to decide who gets the final word.

And this likely isn’t the end of it. Arcom has made clear that its next move will be to widen enforcement to include smaller adult sites. The message feels unmistakable now: this isn’t a one-off crackdown—it’s a line being drawn, and the rest of the industry is standing just behind it, watching to see how hard it holds.


Florida AG Drops Age-Verification Case Against Segpay


TALLAHASSEE, Fla. — Sometimes legal battles don’t end with a bang, but with a quiet agreement and a collective exhale. On Monday, the Florida attorney general’s office agreed to drop its claims against payment processor Segpay in a lawsuit tied to alleged noncompliance with the state’s age-verification law.

Back in September, Florida Attorney General James Uthmeier brought lawsuits against both Segpay and Aylo in the 12th Judicial Circuit Court of Florida. The accusations centered on alleged violations of HB3, the state’s age-verification statute — a law with real teeth, carrying potential fines of up to $50,000 for each individual violation. The stakes were never abstract; they were painfully concrete.

Then, on Monday, the temperature shifted. The Office of the Attorney General and Segpay jointly filed a stipulation of voluntary dismissal, effectively closing that chapter of the case. No dramatic courtroom showdown. Just a line drawn under it.

Attorney Corey Silverstein, who represented Segpay alongside fellow industry attorney Lawrence Walters, said he and his clients were relieved by how the matter ultimately played out. Anyone who’s spent time in regulatory trench warfare knows that resolution — especially a fair one — can feel like a small miracle.

“We are very appreciative that the Florida AG’s office worked with us to get a clear understanding of the real facts involved here,” Silverstein said.

The lawsuit against Aylo, however, is still moving forward, a reminder that while one door has quietly closed, others remain very much open.


The Adult Industry Has Been Through Worse. We Will Survive by Morley Safeword


These are challenging times for the adult entertainment industry, no doubt. Around the globe, governments are passing increasingly strict age-verification requirements and other, more censorious measures putatively designed to “protect minors,” but which legislators and anti-porn crusaders also hope will reduce porn consumption among adults.

If all this is enough to make some folks in the adult industry want to wave the white flag, close up shop, and find something else to do for a living, I can certainly understand why. As the name of this site reflects, people in the industry rightfully feel like they’re under siege, waging a battle against forces that can enlist far more wealth and power as weapons than our side can.

As someone who has worked in the adult industry for nearly 30 years (and who has enjoyed its products even longer), take it from me when I tell you none of this is new. Some of the battlefields are new and they are constantly evolving, but the war itself goes back longer than many of us can remember.

In the United States, obscenity laws and other statutes designed to maintain public morals and prevent the corruption of society date back to colonial times. In other words, long before there was an adult entertainment industry against which to wage war, the government was taking aim at sexual expression and conduct.

Fast forward to the 19th Century and you arrive at the Comstock Act of 1873, which—among many other things—made it a criminal offense to send obscene materials through the U.S. mail. The Act also made it illegal to use the mail to tell someone where such materials might be found or how to make them, provisions that were later struck down by the courts as overly broad, thankfully.

To give you an idea of just how much more restrictive the obscenity laws were in the early 20th Century than they are today, you need only look as far as the name of a seminal case from 1933 – United States v. One Book Called Ulysses. Frankly, the contents of James Joyce’s Ulysses wouldn’t even be enough to raise one-half of a would-be censor’s eyebrow these days, yet it was considered positively scandalous in its day.

From an American adult industry perspective, the War on Porn arguably reached its zenith in the 1980s and 1990s, under Presidents Ronald Reagan and George H.W. Bush. According to the Bureau of Justice Statistics, in 1990 alone there were 74 federal obscenity prosecutions targeting adult materials (as opposed to Child Sexual Abuse Materials, which are patently illegal and have no protection under the First Amendment). Contrast that figure with 2009, in which there were a total of six.

Despite the number of prosecutions at the start of the decade, the 1990s were a period of tremendous growth for the adult industry, driven in large part by the advent of the commercial internet and its relatively unregulated environment. What we’re seeing now is what governments might call a “correction” of that laissez faire approach – and what those of us in the industry might call an overcorrection.

Yes, age-verification laws present a challenge. Like a lot of people in the adult industry, I don’t object to the idea of making people prove they’re adults before consuming porn; what I object to is the means by which we’re required to offer such proof, and the way those methods not only compromise our privacy but potentially open us up to extortion, identity theft and other crimes. I’m also not convinced age verification, at least as currently executed, does much to prevent minors from being exposed to porn.

If you were to ask any of the people who have been prosecuted for obscenity over the movies they’ve made, books they’ve written, or magazines they’ve published, I think you’d find near unanimity: they’d much rather pay a financial penalty than serve years in prison on top of being fined, as the likes of Paul Little (AKA “Max Hardcore”) have done in the past.

My point here is not that those of us currently working in the adult industry should thank our lucky stars we avoided the crackdowns of the past, or that we should accept the current campaign against the industry without putting up a fight. My point is simply this: We’ve been under the gun for decades, and we’ve not only survived but expanded considerably as an industry along the way.

The bottom line, whether the anti-porn zealots like it or not, is that many humans like sexual expression, whether one calls it “porn,” “erotica,” or “filth.” Neither the desire to consume the products we make nor the desire to make them is going away—and neither are we.


FTC Revisits ‘Click to Cancel’ Subscription Rules


There’s a familiar hum starting up again in Washington — that low, bureaucratic buzz that usually means a rule thought to be dead isn’t quite finished yet. The Federal Trade Commission has opened the door for public comment on a petition that would revive trade regulation rulemaking around negative option plans, following a federal court decision that knocked out the agency’s earlier “click-to-cancel” rule meant to simplify subscription cancellations.

Earlier this month, the FTC received and published a petition for rulemaking submitted by the Consumer Federation of America and the American Economic Liberties Project. The clock is now ticking, with the public comment period set to run through Jan. 2.

This isn’t the commission’s first time around this particular block. After announcing proposed changes back in March 2023, the FTC was flooded with feedback — more than 16,000 comments poured in from consumers, government agencies, advocacy groups, and trade associations. That kind of response tends to linger, even when the rules themselves get stalled.

Then came the judicial roadblock. In July, the U.S. Court of Appeals for the 8th Circuit vacated the FTC’s updated Negative Option Rule while further review was pending. Critics had argued that the agency overstepped its authority and cut corners procedurally, particularly by failing to issue a preliminary regulatory analysis. The court agreed enough to hit pause.

The irony is that the Negative Option Rule itself isn’t new or radical. It dates back to the 1970s, when it was designed to stop consumers from being quietly signed up for subscriptions they never agreed to. The proposed updates would have dramatically expanded its reach, applying to most negative option programs — from automatic renewals to free trials that roll into paid plans. For many website operators, that would’ve meant rethinking how sign-ups work, how cancellations happen, and how friction gets engineered into the process.

This new petition may be the first real sign that the “further review” ordered by the appeals court is officially underway. It opens the possibility that the FTC could come back with the same ideas — or close cousins — this time wrapped in tighter procedure and cleaner paperwork. Whether that leads to meaningful consumer protection or just another round of regulatory whiplash remains to be seen. But one thing feels clear: the click-to-cancel fight isn’t over. It just took a breath before getting back up.
