The War on Porn

FTC Inches Closer to a New ‘Click to Cancel’ Subscription Rule

It always seems to start the same way: you notice a charge you don’t recognize, scroll through your bank app, and realize—again—that you’re paying for something you thought you canceled months ago. That frustration is the backdrop as the Federal Trade Commission once more steps into the messy, bureaucratic maze of subscription rules, trying to revive its long-stalled effort to rein in negative option plans after a federal court knocked down its last attempt.

In a statement released Friday, the FTC said it has submitted a draft Advance Notice of Proposed Rulemaking, or ANPRM, on its Negative Option Rule to the Office of Information and Regulatory Affairs. OIRA, which sits inside the Office of Management and Budget, now gets to scrutinize the proposal before the FTC can publish it in the Federal Register. Only then does the public get a say—one more round of comments, one more chance for consumers to vent about subscriptions that refuse to die.

The commission’s vote to approve sending the draft to OIRA was unanimous, though that unanimity comes with an asterisk. The FTC currently has only two sitting commissioners, leaving three seats empty. One of those two, Chairman Andrew N. Ferguson, had previously voted against the updated Negative Option Rule when it narrowly passed in October 2024. It’s a strange kind of consensus, the sort you get when the room is half-empty.

That earlier rule didn’t survive long anyway. The U.S. Court of Appeals for the 8th Circuit vacated it pending further review, siding with critics who argued the agency had overstepped its authority and skipped required procedural steps by failing to issue a preliminary regulatory analysis. In regulatory terms, it was less a slap on the wrist and more a reminder that process still matters—even when intentions are good.

Back in December 2025, the FTC also posted a petition for rulemaking from the Consumer Federation of America and the American Economic Liberties Project. The public comment window on that petition closed Jan. 2, quietly adding another layer of pressure and paperwork to an already complicated path forward.

The Negative Option Rule itself isn’t new. It dates back to the 1970s, born in an era of mail-order clubs and surprise shipments, designed to stop consumers from being signed up—and billed—without clear consent. The 2024 amendments would have dramatically expanded its reach, covering nearly all negative option programs, from auto-renewing subscriptions to “free trial” offers that quietly flip into paid plans. For many websites, that would have meant rethinking how sign-ups work and, more importantly, how easy it is to cancel.

Now, with the process restarted yet again, the FTC could circle back with the same ideas, or something close to them. Whether this time leads to real change—or just another loop through regulatory limbo—remains the open question hanging over every “Cancel subscription” button that somehow never quite does what it promises.

Conservative Push for Porn Taxes Sparks Constitutional Backlash

It feels like the walls are closing in a little more every week. As age-verification laws continue to reshape—and in some cases dismantle—the adult industry, a Utah lawmaker has now stepped forward with a bill that would slap a new tax on porn sites operating in the state. It’s the kind of proposal that makes you pause, reread the headline, and wonder how we got here so fast.

Introduced by Republican state senator Calvin Musselman, the bill would impose a 7 percent tax on total receipts “from sales, distributions, memberships, subscriptions, performances, and content amounting to material harmful to minors that is produced, sold, filmed, generated, or otherwise based” in Utah. If passed, it would take effect in May and require adult sites to pay an additional $500 annual fee to the State Tax Commission. According to the legislation, revenue from the tax would be directed to Utah’s Department of Health and Human Services to expand mental health support for teens.

A new strain of American conservatism is asserting itself more boldly, and lawmakers across the US are calling for tighter restrictions on adult content. In September, Alabama became the first state to introduce a porn tax—10 percent on adult entertainment companies—after passing age-verification mandates that require users to upload ID or other personal documentation before accessing explicit material. Pennsylvania lawmakers are also exploring a proposal that would tack an extra 10 percent tax onto subscriptions and one-time purchases from online adult platforms, despite already charging a 6 percent sales and use tax on digital products, two state senators wrote in an October memo. Other states have flirted with similar ideas before. In 2019, Arizona state senator Gail Griffin, a Republican, proposed taxing adult content distributors to help fund a border wall during Donald Trump’s first term. To date, 25 US states have enacted some form of age verification.

Efforts to criminalize sex workers and regulate the industry have been unfolding for years, accelerating alongside increased online surveillance and censorship. Yet targeted taxes have repeatedly stalled, in part because the legality of such measures remains deeply contested.

“This kind of porn tax is blatantly unconstitutional,” says Evelyn Douek, an associate professor of law at Stanford Law School. “It singles out a particular type of protected speech for disfavored treatment, purely because the legislature doesn’t like it—that’s exactly what the First Amendment is designed to protect against. Utah may not like porn, but as the Supreme Court affirmed only last year, adults have a fully protected right to access it.”

Utah, Alabama, and Pennsylvania are among 16 states that have adopted resolutions declaring pornography a public health crisis. “We realize this is a bold assertion not everyone will agree on, but it’s the full-fledged truth,” Utah governor Gary Herbert tweeted in 2016 after signing the resolution. Utah’s early response to the spread of adult content dates back to 2001, when it became the first state to establish an office focused on sexually explicit material by appointing an obscenity and pornography complaints ombudsman. The role—often referred to as the “porn czar”—was eliminated in 2017.

“Age restriction is a very complex subject that brings with it data privacy concerns and the potential for uneven and inconsistent application for different digital platforms,” Alex Kekesi, vice president of brand and community at Pornhub, said previously. In November, the company urged Google, Microsoft, and Apple to adopt device-based verification across app stores and operating systems. “We have seen several states and countries try to impose platform-level age verification requirements, and they have all failed to adequately protect children.” To comply with existing mandates, Pornhub has blocked access to users in 23 states.

Critics argue that age verification has never truly been about protecting children, but about quietly scrubbing porn from the internet. In 2024, a leaked video from the Centre for Climate Reporting showed Russell Vought, a Trump ally and Project 2025 coauthor, describing age-verification laws as a “back door” to a federal porn ban.

Platforms like OnlyFans and Pornhub have pushed sex work further into the mainstream, but they’ve also made it easier to monitor and police both performers and audiences. As states consider new taxes and penalties, it’s creators who are most likely to absorb the shock.

The cultural conservatism taking shape under Trump 2.0 is driven by a desire to punish sexual expression, says Mike Stabile, director of public policy at the Free Speech Coalition, the US adult industry’s trade association. “When we talk about free speech, we generally mean the freedom to speak, the ability to speak freely without government interference. But in this case, free also means not having to pay for the right to do so. A government tax on speech limits that right to those who can afford it.”

OnlyFans says it complies with all tax requirements in the jurisdictions where it operates, while creators remain responsible for their own tax obligations. Pornhub, which is currently blocked in Utah and Alabama, did not respond to a request for comment.

Douek points out that while states can regulate minors’ access to explicit material following the Supreme Court’s decision upholding Texas’ age-verification law, “a porn tax does nothing to limit minors’ access to this speech—it simply makes it more expensive to provide this content to adults.” A 2022 report from Common Sense Media found that 73 percent of teens aged 13 to 17 had viewed adult content online. Today, much of that exposure happens through social media platforms like X and Snap. A recent survey from the UK’s Office of the Children’s Commissioner found that 59 percent of minors encounter porn accidentally—up from 38 percent the year before—mostly via social feeds.

In Alabama, as would be the case in Utah, revenue from the porn tax is earmarked for behavioral health services, including prevention, treatment, and recovery programs for young people.

Last year, Alabama state representative Ben Robbins, the Republican sponsor of the bill, said adult content was “a driver in causing mental health issues” in the state. It’s a familiar claim among lawmakers advocating for a nationwide porn ban. While some studies suggest adolescent exposure to porn may correlate with depression, low self-esteem, or normalized violence, medical experts have never reached a clear consensus.

As lawmakers increasingly frame the issue around harm to minors, Stabile says it’s crucial to remember that adult content is not a special category outside the bounds of free expression. Courts have repeatedly struck down content-specific taxes as unconstitutional censorship.

“What if a state decided that Covid misinformation was straining state health resources and taxed newsletters who promoted it? What if the federal government decided to require a costly license to start a podcast? What if a state decided to tax a certain newspaper it didn’t like?” he says. “Porn isn’t some magical category of speech separate from movies, streaming services, or other forms of entertainment. Adult businesses already pay taxes on the income they earn, just as every other business does. Taxing them because of imagined harms isn’t just dangerous for our industry—it’s a dangerous expansion of government power.”

‘An Embarrassment’: Critics Slam UK’s Proposed VPN Age Checks

It started the way these things always seem to start lately—with a vote that felt small on paper and enormous everywhere else. Politicians, technologists, and civil society groups reacted with visible dismay after the House of Lords backed a move that would ban children from using VPNs and force providers to roll out age verification.

The backlash was swift. Wikipedia co-founder Jimmy Wales blasted the decision on X, calling the UK’s position an embarrassment. Windscribe CEO Yegor Sak had already summed up the idea as the “dumbest possible fix,” warning that forcing age checks on VPNs would set a deeply troubling precedent for digital privacy.

By Tuesday morning, the argument had spilled fully into the open. Online debate surged, with X logging more than 20,000 posts on the issue in just 24 hours—one of those moments where you can almost hear the internet arguing with itself.

Labour, Lords & VPN laws

Last week, the House of Lords voted in favor of an amendment to the Children’s Wellbeing and Schools Bill that would, in effect, bar anyone under 18 from using VPNs.

The proposal would require commercial VPN providers to deploy mandatory age assurance technology, specifically to stop minors from using VPNs to bypass online safety measures. It sounds tidy in theory. In reality, it opens a can of worms no one seems eager to fully acknowledge.

Notably, the government itself opposed the amendment. Instead, it has opened a three-month consultation on children’s social media use, which includes a broader look at VPNs and how—or whether—they should be addressed.

Political pushback

Even though the House of Lords has shown its hand, the proposal now heads to the House of Commons, where it’s expected to hit serious resistance from the Labour government.

If the Commons throws it out, as many expect, the Lords will have to decide whether to dig in and trigger a round of parliamentary “ping-pong” or quietly step aside.

Labour’s Lord Knight of Weymouth, who voted against the amendment, suggested there’s little appetite for a drawn-out fight. He told TechRadar that it’s unlikely politicians will “die in a ditch” over banning VPNs.

In his view, many lawmakers are chasing “something iconic” on child safety—something headline-friendly—rather than wading into the technical swamp that regulating VPNs would require.

That said, Knight didn’t dismiss the broader concern. He argued that regulator Ofcom “needs to do better” at enforcing existing safety laws and agreed that more should be done to protect children online, provided it’s handled “carefully.” That word—carefully—does a lot of work here.

Civil society’s response

Regardless of whether this particular amendment survives, one thing is clear: VPNs are under a brighter spotlight than ever, and not just in the UK.

In the United States, lawmakers in Wisconsin are pushing a bill that would require adult websites to block access from users connected via a VPN. In Michigan, legislators have floated ideas around ISP-level blocking of circumvention tools. Different routes, same destination.

Evan Greer, director of the US-based group Fight for the Future, warned that policies designed to discourage or ban VPN use will “put human rights activists, journalists, abuse survivors and other vulnerable people in immediate danger.”

Fight for the Future is running a campaign that lets users contact lawmakers directly, arguing in an open letter that the ability to use the internet safely and privately is a fundamental human right.

Back in the UK, a public petition is urging the government to reject any plan that would effectively ban VPNs for children.

The Open Rights Group has also been vocal, pointing out that detecting or banning VPN use isn’t realistically possible without resorting to what it calls an “extreme level of digital authoritarianism.”


Alabama Adds Notary Requirement to Adult Content Consent Forms

Something shifted in the room last week, and you could feel it before anyone said a word. The kind of tension that creeps up your spine when people realize a line has been crossed. Alarm is spreading across the adult industry over new record-keeping and consent rules tied to adult content production and user access in Alabama, and that anxiety boiled over after a series of legal panels at the AVN Adult Entertainment Expo in Las Vegas.

At the center of the panic is a sweeping state law that folds age verification, a statewide 10 percent “sin” tax on pornography, and a wider web of regulation into one heavy package. For producers, the most immediate shock is a new demand that written consent documentation be notarized by a notary public commissioned by the Alabama Secretary of State.

The legislation, House Bill (HB) 164, tucks this requirement into the state’s deceptive trade practices statutes, a placement that feels almost casual given how disruptive the impact could be.

The law reads, “Any commercial entity, before knowingly publishing or distributing a private image […] through an adult website, shall obtain written consent to publish or distribute the private image from every individual depicted in the private image.

“The written consent required by this section shall be signed by the individual depicted and sworn to by a notary public,” the law further decrees. “The commercial entity shall maintain records of the written consent for not less than five calendar years following the publication or distribution of the private image.”

What’s missing, conspicuously, is guidance. No major regulatory clarification has come down from the state, leaving the notary requirement sitting in a gray zone that some believe could be worked around. In theory, studios could hire an in-house notary licensed in Alabama or elsewhere, assuming that person can legally perform notarial acts. In practice, that’s easier said than done.

Even with workarounds, the pressure mounts. Creators, producers, and studios are already feeling the weight of added friction, cost, and uncertainty. Several platforms are now warning users about written consent requirements, and many have begun restricting—or outright blocking—users tied to Alabama IP addresses.

It’s a familiar move. Parent companies behind major platforms like Pornhub have already chosen to geo-block nearly half the country in response to age-gating laws. When compliance becomes a minefield, the simplest option is often to shut the gate entirely.

First Amendment attorney Corey Silverstein, who represents adult industry clients, didn’t mince words when discussing Alabama’s approach.

“One could argue that Alabama created one of the most aggressive age verification laws because it included mandatory health warnings,” he said. “Alabama’s law is so burdensome to website operators that it’s no surprise that they are simply closing their doors to Alabama content creators.

“This goes to the heart of my continued position that these laws have nothing to do with protecting minors,” he continued. “States like Alabama want to control what a person can view or publish and completely ignore the First Amendment. Sadly, Alabama enacted a law so burdensome that it has achieved its nefarious goal of eliminating content it deems unsuitable.”

Silverstein shared those concerns during a legal panel at the expo, alongside Lawrence Walters of Walters Law Group, as the room wrestled with a question no one seems ready to answer out loud: when regulation becomes this heavy, is the point compliance—or disappearance?

Alabama’s Latest Adult Content Law Pushes Creators Out, Not Toward Safety

Most adult creators didn’t need a push notification to feel it. The moment the news started circulating, it landed with a familiar weight: Clips4Sale has restricted access in Alabama after the passage of House Bill 164, a law that introduces notarized consent requirements for performers and platforms. The company frames the decision as compliance—necessary, even prudent. Creators read it differently. To many, it felt like the ground quietly disappearing beneath their feet.

Both interpretations can coexist. And maybe that’s the most unsettling part.

Legislation like Alabama’s is almost always sold as “protective.” The language is comforting, even noble—designed to reassure the public that something dangerous is being handled. But when you listen to the people living under these laws—performers, indie creators, small operators—the tone shifts. What comes through isn’t relief. It’s confusion. Anxiety. A creeping sense that they’re being legislated out of existence without anyone actually talking to them.

House Bill 164 didn’t arrive out of nowhere. It’s part of a broader pattern unfolding across the country, where states are targeting adult platforms through new consent rules, age checks, and documentation standards. On paper, they sound reasonable. In reality, they unravel fast.

What they create isn’t safety. It’s splintering.

A Law That Misses the Reality of the Industry

Adult performers aren’t operating without rules. They never have been. For decades, the industry has been bound by strict federal record-keeping requirements—ID verification, age documentation, signed releases. These systems already exist. They’re already enforced. They’re already audited. And they’re treated seriously because the penalties for failure are brutal.

Which is exactly why Alabama’s law sparked disbelief instead of reassurance.

Adult performer Leilani Lei cut through the noise on X by asking a simple question: do lawmakers actually understand what notarization does? A notary verifies identity and witnesses a signature. That’s the full job description. They don’t assess consent. They don’t evaluate content. They don’t make legal judgments. Requiring notarization doesn’t increase safety—it adds friction, expense, and logistical chaos.

Is a notary expected on every set? For every solo clip? For content created privately by independent performers in their own homes? These aren’t dramatic hypotheticals. They’re practical questions that expose how disconnected the law is from how adult work actually functions.

When laws ignore operational reality, compliance stops being ethical and starts being geographic. Platforms block states. Creators lose access. Income vanishes—not because of misconduct, but because following the rules becomes impossible.

When “Protection” Quietly Becomes Economic Damage

One consequence of laws like HB 164 rarely gets discussed out loud: money.

Adult creators aren’t faceless entities. They’re people paying rent, covering medical bills, supporting families. For many, digital platforms aren’t side hustles—they’re lifelines. When a state gets geoblocked, creators living there lose their audience instantly. When platforms restrict access, creators with fans in that state watch sales drop overnight.

Cupcake SinClair’s response on X captured the mood perfectly—not panic, but dread. Not fear of regulation itself, but fear of where this path leads. If these laws keep spreading—each state tweaking the rules just enough—what does the landscape look like in a year? Two years? Does access slowly shrink until it’s determined entirely by ZIP code?

That’s not protection. That’s erosion.

And while platforms like Clips4Sale may view geoblocking as the least damaging option on the table, the fallout doesn’t land on the platform. It lands on creators. The backlash reflects more than anger—it reflects a growing sense that major decisions are being made without creators in the room, reshaping livelihoods without alternatives or support.

From the creator’s side, these aren’t abstract compliance choices. They translate into fewer customers, lower visibility, and more instability in an already fragile industry.

The Patchwork Problem Everyone Pretends Isn’t a Problem

One of the most dangerous aspects of this legislative trend is how inconsistent it is.

Each state passes its own version of “protective” law, often without coordination, consultation, or technical understanding. The result is a patchwork of requirements no platform can realistically meet across the board. What’s compliant in one state may be illegal in the next.

For massive tech companies, patchwork laws are an inconvenience. For adult platforms—already operating under heavier scrutiny, higher fees, and greater risk—they can be fatal.

For independent creators, they’re destabilizing by design.

When lawmakers ignore the cumulative effect of these laws, compliance becomes less about doing the right thing and more about surviving. Platforms that can’t afford bespoke, state-by-state systems opt out entirely. Creators are left scrambling to adapt, relocate, or rebuild somewhere else.

Who Is Actually Being Protected Here?

Supporters of laws like HB 164 often speak in moral absolutes. They invoke exploitation, trafficking, consent—serious issues that deserve serious responses.

But when legislation refuses to distinguish between criminal behavior and lawful adult work, it ends up punishing the latter while barely touching the former.

Bad actors don’t notarize forms. They don’t operate transparently. They don’t comply with documentation requirements. Meanwhile, compliant creators and legitimate platforms absorb the cost of laws that don’t meaningfully address wrongdoing.

Protection that collapses under scrutiny isn’t protection. It’s performance.

A Future Built on Exclusion Isn’t a Fix

The adult industry isn’t asking for no rules. It’s asking for rules that reflect reality.

That means lawmakers engaging with performers, platforms, and legal experts who understand how consent, documentation, and digital distribution actually work. It means recognizing that piling on procedural hurdles doesn’t automatically make anyone safer—and that cutting off access often harms the very people these laws claim to defend.

If this trend continues unchecked, the future of adult content in the U.S. won’t look like reform. It will look like retreat. More geoblocking. More platform withdrawals. More creators pushed out of legitimate marketplaces and into less secure corners of the internet.

That outcome serves no one—not performers, not platforms, and not the public.

Until the conversation moves beyond slogans and starts grappling with consequences, laws like Alabama’s will keep feeling less like protection and more like disappearance.

FTC Comes Out in Favor of Age Verification at the Federal Level

It started the way a lot of policy shifts do in Washington—quietly, almost casually, with a few carefully chosen words that hinted at something much bigger. Federal Trade Commission commissioner Mark Meador, a figure aligned with the right-wing populist wing of the Republican Party, publicly backed age verification as a “better way” to shield minors from age-restricted online material that’s still protected by the First Amendment.

Meador has served on the Federal Trade Commission since his appointment and Senate confirmation in April 2025. His comments surfaced during an FTC-hosted workshop on January 28, a long, technical day that brought together experts and critics of age-verification laws and technology, including representatives connected to the adult-entertainment space.

But the room itself told another story. As previously reported, adult-industry companies and other key stakeholders—including the Free Speech Coalition—were conspicuously absent. According to an anonymous source, the workshop was planned with a built-in assumption that the industry had little credibility when it came to online safety for minors.

That exclusion didn’t go unnoticed. Several stakeholder groups said the adult industry was shut out not just from the panels, but from the planning process entirely. When you start deciding who gets a seat at the table—and who doesn’t—you’re already shaping the outcome.

“Age verification offers a better way—it offers a way to unleash American innovation without compromising the health and well-being of America’s most important resource: its children,” Meador said. “It is a tool that empowers rather than replaces America’s parents — really, I don’t know that we can afford to forego it.”

His enthusiasm didn’t come out of thin air. Meador’s position closely tracks with endorsements of age-verification laws and technology from conservative and far-right groups that have long opposed pornography and the companies that produce or distribute it. At the same time, even supporters admit the legal landscape is a mess—something that came up repeatedly in legal panels at AVN’s Adult Entertainment Expo in Las Vegas just last week.

Meador’s alliances are also telling. He has ties to prominent anti-pornography figures like Utah Senator Mike Lee and Texas Attorney General Ken Paxton. Lee has repeatedly introduced federal bills targeting adult platforms, often through age-verification mandates and penalties. Paxton, for his part, was sued by the Free Speech Coalition and other adult-industry companies over enforcement actions in Texas.

Right now, roughly half of U.S. states have age-verification laws on the books. Penalties can range from civil fines to criminal charges. During the FTC workshop, several speakers openly backed child-protection proposals pending in Congress, including the controversial Kids Online Safety Act and the SCREEN Act.

FTC chair Andrew Ferguson, also a Republican, echoed support for age verification as a means of complying with the Children’s Online Privacy Protection Act. Central to his view is the expectation that businesses deploy third-party age-verification tools—such as those offered by companies like Yoti and Incode—to prevent what he described as “innovative ways of breaking the law.”

All of this is unfolding inside an agency with an unusual power imbalance. Ferguson and Meador are currently the only two commissioners steering the FTC. Investigative reporting by Al Jazeera has noted that both men have expressed strong support for using regulatory authority to suppress certain forms of LGBTQ+ speech.

The story isn’t over. Federal age-verification policy is still taking shape across the FTC, the broader executive branch, and Congress. What’s clear already is that the debate isn’t just about technology or children—it’s about who gets heard, who gets sidelined, and how much privacy anyone is expected to give up along the way.

FSC Backs OpenAge’s AgeKey as a Privacy-First Age Verification Option

Something quietly important happened this week—one of those moments that doesn’t scream for attention but might change how a lot of people experience the internet going forward. The Free Speech Coalition threw its support behind the OpenAge Initiative and its flagship technology, AgeKey, calling it a rare attempt to meet age-assurance rules without turning privacy into collateral damage.

FSC is a nonprofit that advocates for the adult entertainment industry, a corner of the internet that sees hundreds of billions of visits every year and tends to feel regulatory pressure before almost anyone else does.

“We believe that device-based solutions are more effective than fragmented platform or site-specific approaches,” said Alison Boden, executive director of the Free Speech Coalition. “OpenAge and AgeKey offer a practical bridge between these models, allowing users to store a verified age result locally on the device and reuse it across multiple platforms without repeated verification or resubmission of sensitive data.”

She added, “This approach holds the promise of reduced friction and privacy risks that have undermined compliance with age-verification mandates, and provides a path that is affordable for large platforms, independent creators, and small businesses alike.”

At its core, AgeKey is a reusable, FIDO2 passkey-based age credential. It lets someone prove they meet an age requirement without ever handing over who they are—no names, no identity trail, no awkward oversharing just to get through a digital door.

Because it’s natively supported by major devices, operating systems, and browsers, AgeKey doesn’t ask users to download an app, register an account, or jump through extra hoops. Verifications can happen up to 95 percent faster than traditional age checks. And thanks to a double-blind architecture, neither the service provider nor the AgeKey issuer knows who the user is—or where they’re going online.
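To make the double-blind idea concrete, here is a deliberately simplified toy sketch. It is not the real AgeKey protocol: every name in it is a hypothetical stand-in, an HMAC substitutes for the issuer’s public-key signature, and the FIDO2 challenge-response step in which the device proves possession of the passkey’s private key is omitted entirely. What it does show is the core property the article describes: the issuer signs only an age result bound to a device key, and the relying site learns only that a valid over-18 result exists—never who the user is.

```python
# Toy model of a reusable, device-bound age credential (illustrative only;
# not the actual AgeKey design or API).
import hashlib
import hmac
import json
import os

ISSUER_KEY = os.urandom(32)  # secret held by the age-verification issuer


def issue_age_token(device_pubkey: bytes, over_18: bool) -> dict:
    """Issuer binds an age result to a device key.

    It records no name or identity, and never learns which sites
    the token will later be presented to (first "blind").
    """
    claim = json.dumps(
        {"over_18": over_18, "device": device_pubkey.hex()},
        sort_keys=True,
    ).encode()
    # HMAC stands in for a public-key signature here.
    sig = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    return {"claim": claim.decode(), "sig": sig}


def verify_token(token: dict) -> bool:
    """Check that the claim is untampered and was issued by the issuer."""
    expected = hmac.new(
        ISSUER_KEY, token["claim"].encode(), hashlib.sha256
    ).hexdigest()
    return hmac.compare_digest(token["sig"], expected)


def site_check_age(token: dict) -> bool:
    """The site sees only a signed age bit bound to an opaque device key,
    not the user's identity (second "blind")."""
    return verify_token(token) and json.loads(token["claim"])["over_18"]


device_pub = os.urandom(32)  # stand-in for a passkey public key
token = issue_age_token(device_pub, over_18=True)
assert site_check_age(token)

# Flipping the age bit without re-signing breaks verification.
tampered = dict(token, claim=token["claim"].replace("true", "false"))
assert not site_check_age(tampered)
```

In a production design the verification step would use the issuer’s public key rather than a shared secret, and the reuse across sites is exactly what removes the repeated identity checks the article mentions.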

The OpenAge Initiative itself is focused on building something bigger than a single tool: an interoperable, cross-industry, cross-platform framework for age assurance. Any platform or certified verification provider can adopt AgeKey and participate. Sites still decide which methods, providers, and recency rules they accept, while AgeKeys remain optional and free for users.

“OpenAge believes deeply in interoperability and reusability when it comes to age assurance and users’ own data,” said Julian Corbett, head of OpenAge. “Our mandate is to think first and foremost about what users’ needs and rights should be. This includes the right of children to receive age-appropriate experiences and protections from harmful content, and the right of adults to privacy and frictionless access online.”

For anyone watching the slow collision between regulation, privacy, and real-world usability, this is one of those developments worth sitting with for a moment. Not flashy. Just quietly ambitious.



App Meant to Help Users Quit Porn Leaked Their Masturbation Habits

Hand and porn site

There’s something quietly disturbing about discovering that a tool meant to help people wrestle with their most private habits accidentally left the blinds wide open. An app that claims to help users stop consuming pornography ended up exposing intensely sensitive personal data — the kind of stuff most people wouldn’t even admit to a close friend. Ages. Masturbation frequency. Emotional triggers. How porn makes them feel afterward. And tucked inside that data were a lot of minors, which makes your stomach drop a little when you really sit with it.

One user profile, for instance, listed their age as “14.” Their “frequency” showed porn use “several times a week,” sometimes up to three times a day. Their “triggers” were logged as “boredom” and “Sexual Urges.” The app had even assigned a “dependence score” and listed their “symptoms” as “Feeling unmotivated, lack of ambition to pursue goals, difficulty concentrating, poor memory or ‘brain fog.’” It reads less like analytics and more like a vulnerable diary entry — something that was supposed to stay locked away.

The app isn’t being named because the developer still hasn’t fixed the issue. The problem was uncovered by an independent security researcher who asked to remain anonymous. He first flagged it to the app’s creator back in September. The creator said he’d fix it quickly. That didn’t happen. The flaw comes from a misconfiguration in how the app uses Google Firebase, a popular mobile app development platform. By default, Firebase can make it surprisingly easy for anyone to become an “authenticated” user and access backend storage — the digital attic where all the private boxes tend to live if you’re not careful.
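The app's actual rules aren't published, but the class of misconfiguration described here is well documented. In Firestore's rules language, a "test mode" style ruleset often looks like the first block below, and because Firebase also supports anonymous sign-in, `request.auth != null` is barely a barrier. The second block shows the owner-only shape that would have kept each user's entries private. Both are illustrative sketches, not the app's real configuration.

```
// Overly broad: ANY signed-in user can read and write every document.
// With anonymous sign-in enabled, this is effectively public.
service cloud.firestore {
  match /databases/{database}/documents {
    match /{document=**} {
      allow read, write: if request.auth != null;
    }
  }
}

// Owner-only: each user can touch only records under their own UID.
service cloud.firestore {
  match /databases/{database}/documents {
    match /users/{uid}/{document=**} {
      allow read, write: if request.auth != null && request.auth.uid == uid;
    }
  }
}
```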

Overall, the researcher said he could access information belonging to more than 600,000 users of the porn-quitting app, with roughly 100,000 identifying as minors. That number lands heavy. It’s not abstract. It’s classrooms. It’s school buses. It’s kids who probably assumed they were talking into a void, not a wide-open window.

The app also invites users to write confessions about their habits. One of them read: “I just can’t do this man I honestly don’t know what to do know more, such a loser, I need serious help.” You can almost hear the frustration in that sentence — the messy spelling, the emotional spill. That’s not data. That’s a human having a rough night.

When reached by phone, the creator of the app said he had spoken with the researcher but claimed the app never exposed any user data due to a misconfigured Google Firebase. He suggested the researcher may have fabricated the data that was reviewed.

“There is no sensitive information exposed, that’s just not true,” the founder said. “These users are not in my database, so, like, I just don’t give this guy attention. I just think it’s a bit of a joke.”

When asked why he previously thanked the researcher for responsibly disclosing the misconfiguration and said he would rush to fix it, he wished me a good day and hung up. One of those conversations that ends abruptly, leaving a strange quiet buzzing in the room.

After the call, an account was created on the app. The researcher was then able to see that new account appear inside the misconfigured Google Firebase environment — confirmation that user information was still exposed and accessible. Sometimes reality has a way of answering arguments faster than any debate ever could.

This type of Google Firebase misconfiguration isn’t new. Security researchers have been talking about it for years, and it continues to surface today. It’s one of those problems that feels boring until it suddenly isn’t — until someone’s real life data is sitting out in the open.

Dan Guido, CEO of cybersecurity research and consulting firm Trail of Bits, said in an email that this Firebase issue is “a well known weakness” and easy to find. He recently noted on X that Trail of Bits was able to build a tool using Claude to scan for this vulnerability in just 30 minutes.
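As an illustration of why this weakness is "easy to find," here is a minimal sketch (not Trail of Bits' tool) of the classic probe for a Firebase Realtime Database: an open database answers an unauthenticated read of `/.json` with live data, while locked-down rules answer with a permission error. The project ID is a placeholder; probing databases you don't own or have permission to test is off-limits.

```python
import urllib.error
import urllib.request


def probe_url(project_id: str) -> str:
    """Build the unauthenticated read probe for a Realtime Database."""
    return f"https://{project_id}.firebaseio.com/.json"


def classify(status: int) -> str:
    """Interpret the probe: 200 means the rules allow public reads."""
    if status == 200:
        return "OPEN: unauthenticated reads allowed"
    if status in (401, 403):
        return "locked down"
    return f"inconclusive (HTTP {status})"


def scan(project_id: str) -> str:
    """Fetch the probe URL and classify the response."""
    try:
        with urllib.request.urlopen(probe_url(project_id), timeout=10) as resp:
            return classify(resp.status)
    except urllib.error.HTTPError as exc:
        return classify(exc.code)
```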

“If anyone is best positioned to implement guardrails at scale, it is Google/Firebase themselves. They can detect ‘open rules’ in a user’s account and warn loudly, block production configs, or require explicit acknowledgement,” he said. “Amazon has done this successfully for S3.” S3 is a cloud storage product from AWS that previously struggled with similar data exposure issues due to misconfigurations.

The researcher who uncovered the app’s vulnerability added that this insecure setup is often the default in Google Firebase. He also pointed a finger at Apple, arguing that apps should be reviewed for backend security issues before being allowed into the App Store.

“Apple will literally decline an app from the App Store if a button is two pixels too wide against their design guidelines, but they don’t, and they don’t check anything to do with the back end database security you can find online,” he said. It’s one of those comments that lands with an uncomfortable kind of truth — polished surfaces, shaky foundations.

Apple and Google did not respond to requests for comment.

And that’s the part that lingers. People trusted this app with their most awkward truths, their late-night regrets, their quiet attempts at self-control. Some of them were kids. They weren’t posting for an audience. They were whispering into what they thought was a locked room. Turns out the door was never really closed.


Lawmakers Advance Proposal That Could Ban VPN Use for UK Minors

VPN

A new ban on VPNs could hit web users in the UK following a vote on a law change. And yeah, that sentence alone makes you pause for a second. It has that quiet, slightly unsettling “wait… what?” energy — the kind that sneaks up on you between inbox refreshes and half-finished cups of coffee. You don’t expect something as invisible and ordinary as a VPN to suddenly become headline material, yet here we are.

The House of Lords has now backed an amendment which would, if accepted by the government, change the law to ban under-18s from using a Virtual Private Network (VPN). On the surface it sounds tidy and well-intentioned. Underneath it, though, sits a tangle of questions about privacy, enforcement, and how much digital freedom anyone — especially young people — should actually have.

VPN use in the UK has surged since the Online Safety Act came into force. A VPN connects somebody's device – normally a computer or smartphone – to a server in a different location, so the websites that person visits see the server's IP address rather than their own. Employers often use VPNs to let staff share a network's resources, but they can also be used to spoof or hide a browsing location, thereby sidestepping geographical restrictions. Think of it like slipping on a digital disguise — not to vanish completely, but to move through the internet without leaving your name tag on every door you open.

Many people use one for privacy, or to get around restrictions that websites put on who can visit a page. It can also be useful for letting people work from home while still reaching their workplace's resources. I've leaned on a VPN more times than I can count while traveling or working remotely — one of those quiet tools you only notice when it breaks, or when someone suddenly tells you it might disappear.

Last week, a Conservative-led amendment in the House of Lords called for a change to the Children’s Wellbeing and Schools Bill following calls from campaigners including Hollywood star Hugh Grant. It’s one of those oddly modern moments where celebrity advocacy bumps into legislative machinery, and suddenly a policy debate has a familiar face attached to it.

Peers backed, by 207 votes to 159, a ban on providing VPN services to children, over concerns the services can be used to bypass age verification restrictions on accessing adult content. The logic is straightforward enough: if age gates exist, lawmakers don't want easy digital side doors around them.

Changes made by peers to the Bill will be considered by MPs during the process known as ping-pong, when legislation is batted between the Commons and Lords until agreement is reached. The name sounds playful, almost harmless. The outcomes, of course, rarely are.

Separately, in a heavy Government defeat, peers supported a ban on social media for under 16s too. That’s not a small add-on — that’s a tectonic shift in how young people would experience daily life online.

Supporters of the Australian-style ban have argued parents are in “an impossible position” with regard to the online harms their children are being exposed to. And honestly, that rings true. Anyone who’s watched a teenager scroll endlessly into the night knows the uneasy mix of concern, resignation, and quiet panic that can creep in.

Technology Secretary Liz Kendall announced a three-month consultation this week, which will consider the advantages and disadvantages of a ban, as well as possible overnight curfews and actions to prevent “doom-scrolling”, before reporting back in the summer. It sounds like a collective deep breath — gather the facts, test the ideas, try not to rush into something that reshapes daily habits for millions of families.

However, Tory former schools minister Lord Nash, who has spearheaded calls for a ban, argued the late concession simply represented more delay. You can almost hear the frustration between the lines, the sense that patience has already been exhausted.

He said: “The Government’s consultation is, in my view, unnecessary, misconceived and clearly a last-minute attempt to kick this can down the road.”

Proposing an amendment to the Children’s Wellbeing and Schools Bill, the Conservative peer told the upper chamber: “Many teenagers are spending long hours – five, six, seven or more a day – on social media.

“The evidence is now overwhelming as to the damage that this is causing.

“We have long passed the point of correlation or causation. There is now so much evidence from across the world that it is clear that by every measure, health, cognitive ability, educational attainment, crime, economic productivity, children are being harmed.”

He added: “This is going to happen. The only question is when. We have the opportunity to do it now in this Bill, and every day which passes, more damage is being done to children. We must act now.”

A Government spokesperson said: “We will take action to give children a healthier relationship with mobile phones and social media.

“It is important we get this right, which is why we have launched a consultation and will work with experts, parents and young people to ensure we take the best approach, based on evidence.”

And that’s the uneasy tension humming beneath all of this. Everyone wants kids safer online — that part isn’t controversial. But once you start tightening the screws on tools like VPNs and access itself, you’re not just nudging behavior. You’re redefining privacy, autonomy, and who ultimately controls the shape of the internet inside everyday life. The line between protection and overreach gets thin fast.

Maybe this really is the moment lawmakers draw a harder boundary. Maybe it’s another long rally of legislative ping-pong before anything truly changes. Either way, it’s hard to shake the feeling that this debate isn’t really about VPNs at all. It’s about who gets to decide how much freedom we’re willing to quietly trade away — and whether safety, once promised, ever really knows when to stop.


Pornhub to Restrict UK Access to Verified Account Holders Starting Feb. 2

Pornhub logo

Something strange is about to happen when a curious UK user types a familiar orange-and-black URL into their browser after Feb. 2. Instead of the usual scroll-and-click routine, the door quietly closes — unless they already built a verified account before the cutoff. Aylo, the parent company behind Pornhub and several other free platforms, says new UK visitors won’t be getting in.

Aylo explained in a statement, “New users in the UK will no longer be able to access Aylo’s content sharing platforms, including Pornhub, YouPorn, and Redtube. UK users who have verified their age will retain access through their existing accounts.”

During a press conference, Aylo VP of Brand and Community Alexzandra Kekesi clarified that anyone who already completed the age verification process — which requires creating an account — will still have access to Pornhub and Aylo’s other free sites. What won’t exist anymore is the on-ramp. No new accounts will be allowed after Feb. 2.

“You will have to use credentials to log in and access your account,” Kekesi said. “Anyone who has not gone through that process prior to February will be redirected elsewhere. Their journey on our platform will start and end there.”

Back in June 2025, Aylo rolled out age assurance tools designed to meet government requirements under the UK’s Online Safety Act. At the time, Kekesi even praised Ofcom’s framework, calling it “the most robust in terms of actual and meaningful protection we’ve seen to date.” There was cautious optimism then — the kind you get when a system feels imperfect but workable.

That optimism has faded. At Tuesday’s press conference, Kekesi said Aylo now views the OSA as fundamentally broken. Sites remain “very accessible” to minors, she said, while traffic simply flows to noncompliant platforms that dodge enforcement altogether. At that scale, enforcement becomes a mirage. She also pointed out that most adult sites still don’t comply with the law, and warned that the system raises “considerable privacy issues” and exposes users to data breaches. It’s one of those uncomfortable moments where a rule meant to protect ends up creating new vulnerabilities.

“We can no longer participate in the flawed system that has been created in the UK as a result of the OSA,” Kekesi said.

Solomon Friedman, partner and VP for compliance at Ethical Capital Partners — the firm that acquired MindGeek in 2023 and rebranded it as Aylo — took a more hands-on approach during the briefing. From a UK IP address, he searched “free porn” to show how quickly unverified sites appear, even as Pornhub requires age verification. It was a simple demo, but the kind that lands like a thud.

“As new sites continue to pop up that are noncompliant, they simply repopulate and move higher in the Google ranking,” Friedman said.

He added that sites ignoring age assurance rules often ignore other safeguards as well — including measures meant to prevent CSAM and intimate image abuse. The problem doesn’t stay neatly contained in one policy lane.

“This law by its very nature is pushing adults and children alike to the cesspools of the internet,” Friedman warned.

In regions where Pornhub has already been forced to implement age verification, the platform has seen traffic drop by as much as 80%, as users chase free content elsewhere. Anyone who’s ever watched internet habits shift overnight knows how fast a crowd migrates when friction shows up.

Friedman stressed that Ofcom itself isn’t the villain in this story, saying the regulator has been acting in good faith, consulting with industry stakeholders and taking enforcement seriously.

“You have a dedicated regulator working in good faith,” he said. “But unfortunately, the law they are operating under cannot possibly succeed.”

He returned to a position Aylo has been repeating for some time: the only realistic way to keep minors away from adult content is device-level age assurance, not site-by-site gates.

“Microsoft, Apple and Google all have very robust built-in parental controls,” he pointed out. “Those are device-based controls that operate regardless of whether or not the site that is being accessed is compliant. The only thing needed is a mandate that these controls be activated by default.”

Right now, he noted, those protections are still “opt-in, not opt-out.” In other words, they exist — but only if someone actively turns them on. Human nature being what it is, that’s a fragile bet.

Friedman demonstrated how device-level tools like Google’s SafeSearch can block access to adult content even when a VPN is in use, and urged major tech companies to “do the right thing proactively” or risk being “forced to do the right thing by government.”
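Google's own documentation describes one concrete mechanism behind this kind of demonstration: a managed device or network maps `www.google.com` to `forcesafesearch.google.com` in DNS, so every search request is answered with SafeSearch forced on, regardless of browser settings or VPN use for the web traffic itself. A minimal sketch of the shape of that mapping follows; the literal address is illustrative, and in practice you would resolve `forcesafesearch.google.com` for a current value or use a CNAME record instead.

```
# /etc/hosts entry on a managed device (or a CNAME on a managed DNS
# server): route every Google search through forced SafeSearch.
# Resolve forcesafesearch.google.com for a current address rather than
# hard-coding one; this line only shows the shape of the mapping.
216.239.38.120  www.google.com
```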

When asked whether shifting responsibility to big tech simply pushes the problem onto someone else, Friedman framed it as a question of what actually works in the real world.

“This is not a matter of shifting responsibility to anyone,” Friedman said. “When access is controlled at the device level, it’s efficient, it’s effective, it’s privacy-preserving, it gets the job done. It just works.

“Human behavior is why these laws are failing,” Friedman added. “Legislate not contrary to human behavior, but consistent with human behavior online — and that is at the device level.”

A company representative also confirmed that outside the UK, Aylo still plans to participate in the European Commission’s pilot program for its “white label” age verification app — a reminder that this debate isn’t settling anytime soon. If anything, it’s just changing shape, like water finding the next crack in the pavement.
