Political Attacks

Alabama’s Latest Adult Content Law Pushes Creators Out, Not Toward Safety

Picture of the Alabama Flag

Most adult creators didn’t need a push notification to feel it. The moment the news started circulating, it landed with a familiar weight: Clips4Sale has restricted access in Alabama after the passage of House Bill 164, a law that introduces notarized consent requirements for performers and platforms. The company frames the decision as compliance—necessary, even prudent. Creators read it differently. To many, it felt like the ground quietly disappearing beneath their feet.

Both interpretations can coexist. And maybe that’s the most unsettling part.

Legislation like Alabama’s is almost always sold as “protective.” The language is comforting, even noble—designed to reassure the public that something dangerous is being handled. But when you listen to the people living under these laws—performers, indie creators, small operators—the tone shifts. What comes through isn’t relief. It’s confusion. Anxiety. A creeping sense that they’re being legislated out of existence without anyone actually talking to them.

House Bill 164 didn’t arrive out of nowhere. It’s part of a broader pattern unfolding across the country, where states are targeting adult platforms through new consent rules, age checks, and documentation standards. On paper, they sound reasonable. In reality, they unravel fast.

What they create isn’t safety. It’s splintering.

A Law That Misses the Reality of the Industry

Adult performers aren’t operating without rules. They never have been. For decades, the industry has been bound by strict federal record-keeping requirements under 18 U.S.C. § 2257—ID verification, age documentation, signed releases. These systems already exist. They’re already enforced. They’re already audited. And they’re treated seriously because the penalties for failure are brutal.

Which is exactly why Alabama’s law sparked disbelief instead of reassurance.

Adult performer Leilani Lei cut through the noise on X by asking a simple question: do lawmakers actually understand what notarization does? A notary verifies identity and witnesses a signature. That’s the full job description. They don’t assess consent. They don’t evaluate content. They don’t make legal judgments. Requiring notarization doesn’t increase safety—it adds friction, expense, and logistical chaos.

Is a notary expected on every set? For every solo clip? For content created privately by independent performers in their own homes? These aren’t dramatic hypotheticals. They’re practical questions that expose how disconnected the law is from how adult work actually functions.

When laws ignore operational reality, compliance stops being ethical and starts being geographic. Platforms block states. Creators lose access. Income vanishes—not because of misconduct, but because following the rules becomes impossible.

When “Protection” Quietly Becomes Economic Damage

One consequence of laws like HB 164 rarely gets discussed out loud: money.

Adult creators aren’t faceless entities. They’re people paying rent, covering medical bills, supporting families. For many, digital platforms aren’t side hustles—they’re lifelines. When a state gets geoblocked, creators living there lose their audience instantly. When platforms restrict access, creators with fans in that state watch sales drop overnight.

Cupcake SinClair’s response on X captured the mood perfectly—not panic, but dread. Not fear of regulation itself, but fear of where this path leads. If these laws keep spreading—each state tweaking the rules just enough—what does the landscape look like in a year? Two years? Does access slowly shrink until it’s determined entirely by ZIP code?

That’s not protection. That’s erosion.

And while platforms like Clips4Sale may view geoblocking as the least damaging option on the table, the fallout doesn’t land on the platform. It lands on creators. The backlash reflects more than anger—it reflects a growing sense that major decisions are being made without creators in the room, reshaping livelihoods without alternatives or support.

From the creator’s side, these aren’t abstract compliance choices. They translate into fewer customers, lower visibility, and more instability in an already fragile industry.

The Patchwork Problem Everyone Pretends Isn’t a Problem

One of the most dangerous aspects of this legislative trend is how inconsistent it is.

Each state passes its own version of “protective” law, often without coordination, consultation, or technical understanding. The result is a patchwork of requirements no platform can realistically meet across the board. What’s compliant in one state may be illegal in the next.

For massive tech companies, patchwork laws are an inconvenience. For adult platforms—already operating under heavier scrutiny, higher fees, and greater risk—they can be fatal.

For independent creators, they’re destabilizing by design.

When lawmakers ignore the cumulative effect of these laws, compliance becomes less about doing the right thing and more about surviving. Platforms that can’t afford bespoke, state-by-state systems opt out entirely. Creators are left scrambling to adapt, relocate, or rebuild somewhere else.

Who Is Actually Being Protected Here?

Supporters of laws like HB 164 often speak in moral absolutes. They invoke exploitation, trafficking, consent—serious issues that deserve serious responses.

But when legislation refuses to distinguish between criminal behavior and lawful adult work, it ends up punishing the latter while barely touching the former.

Bad actors don’t notarize forms. They don’t operate transparently. They don’t comply with documentation requirements. Meanwhile, compliant creators and legitimate platforms absorb the cost of laws that don’t meaningfully address wrongdoing.

Protection that collapses under scrutiny isn’t protection. It’s performance.

A Future Built on Exclusion Isn’t a Fix

The adult industry isn’t asking for no rules. It’s asking for rules that reflect reality.

That means lawmakers engaging with performers, platforms, and legal experts who understand how consent, documentation, and digital distribution actually work. It means recognizing that piling on procedural hurdles doesn’t automatically make anyone safer—and that cutting off access often harms the very people these laws claim to defend.

If this trend continues unchecked, the future of adult content in the U.S. won’t look like reform. It will look like retreat. More geoblocking. More platform withdrawals. More creators pushed out of legitimate marketplaces and into less secure corners of the internet.

That outcome serves no one—not performers, not platforms, and not the public.

Until the conversation moves beyond slogans and starts grappling with consequences, laws like Alabama’s will keep feeling less like protection and more like disappearance.

Read More »

FTC Comes Out in Favor of Age Verification at the Federal Level

FTC Building

It started the way a lot of policy shifts do in Washington—quietly, almost casually, with a few carefully chosen words that hinted at something much bigger. Federal Trade Commission commissioner Mark Meador, a figure aligned with the right-wing populist wing of the Republican Party, publicly backed age verification as a “better way” to shield minors from age-restricted online material that’s still protected by the First Amendment.

Meador has served on the Federal Trade Commission since his appointment and Senate confirmation in April 2025. His comments surfaced during an FTC-hosted workshop on January 28, a long, technical day that brought together experts and critics of age-verification laws and technology, including representatives connected to the adult-entertainment space.

But the room itself told another story. As previously reported, adult-industry companies and other key stakeholders—including the Free Speech Coalition—were conspicuously absent. According to an anonymous source, the workshop was planned with a built-in assumption that the industry had little credibility when it came to online safety for minors.

That exclusion didn’t go unnoticed. Several stakeholder groups said the adult industry was shut out not just from the panels, but from the planning process entirely. When you start deciding who gets a seat at the table—and who doesn’t—you’re already shaping the outcome.

“Age verification offers a better way—it offers a way to unleash American innovation without compromising the health and well-being of America’s most important resource: its children,” Meador said. “It is a tool that empowers rather than replaces America’s parents — really, I don’t know that we can afford to forego it.”

His enthusiasm didn’t come out of thin air. Meador’s position closely tracks with endorsements of age-verification laws and technology from conservative and far-right groups that have long opposed pornography and the companies that produce or distribute it. At the same time, even supporters admit the legal landscape is a mess—something that came up repeatedly in legal panels at AVN’s Adult Entertainment Expo in Las Vegas just last week.

Meador’s alliances are also telling. He has ties to prominent anti-pornography figures like Utah Senator Mike Lee and Texas Attorney General Ken Paxton. Lee has repeatedly introduced federal bills targeting adult platforms, often through age-verification mandates and penalties. Paxton, for his part, was sued by the Free Speech Coalition and other adult-industry companies over enforcement actions in Texas.

Right now, roughly half of U.S. states have age-verification laws on the books. Penalties can range from civil fines to criminal charges. During the FTC workshop, several speakers openly backed child-protection proposals pending in Congress, including the controversial Kids Online Safety Act and the SCREEN Act.

FTC chair Andrew Ferguson, also a Republican, echoed support for age verification as a means of complying with the Children’s Online Privacy Protection Act. Central to his view is the expectation that businesses deploy third-party age-verification tools—such as those offered by companies like Yoti and Incode—to prevent what he described as “innovative ways of breaking the law.”

All of this is unfolding inside an agency with an unusual power imbalance. Ferguson and Meador are currently the only two commissioners steering the FTC. Investigative reporting by Al Jazeera has noted that both men have expressed strong support for using regulatory authority to suppress certain forms of LGBTQ+ speech.

The story isn’t over. Federal age-verification policy is still taking shape across the FTC, the broader executive branch, and Congress. What’s clear already is that the debate isn’t just about technology or children—it’s about who gets heard, who gets sidelined, and how much privacy anyone is expected to give up along the way.

Read More »

FSC Backs OpenAge’s AgeKey as a Privacy-First Age Verification Option

Open age logo

Something quietly important happened this week—one of those moments that doesn’t scream for attention but might change how a lot of people experience the internet going forward. The Free Speech Coalition threw its support behind the OpenAge Initiative and its flagship technology, AgeKey, calling it a rare attempt to meet age-assurance rules without turning privacy into collateral damage.

FSC is a nonprofit that advocates for the adult entertainment industry, a corner of the internet that sees hundreds of billions of visits every year and tends to feel regulatory pressure before almost anyone else does.

“We believe that device-based solutions are more effective than fragmented platform or site-specific approaches,” said Alison Boden, executive director of the Free Speech Coalition. “OpenAge and AgeKey offer a practical bridge between these models, allowing users to store a verified age result locally on the device and reuse it across multiple platforms without repeated verification or resubmission of sensitive data.”

She added, “This approach holds the promise of reduced friction and privacy risks that have undermined compliance with age-verification mandates, and provides a path that is affordable for large platforms, independent creators, and small businesses alike.”

At its core, AgeKey is a reusable, FIDO2 passkey-based age credential. It lets someone prove they meet an age requirement without ever handing over who they are—no names, no identity trail, no awkward oversharing just to get through a digital door.

Because it’s natively supported by major devices, operating systems, and browsers, AgeKey doesn’t ask users to download an app, register an account, or jump through extra hoops. Verifications can happen up to 95 percent faster than traditional age checks. And thanks to a double-blind architecture, neither the service provider nor the AgeKey issuer knows who the user is—or where they’re going online.
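To make the double-blind idea concrete, here is a minimal sketch of how a reusable, device-held age attestation can work in principle: an issuer signs a bare age claim once, and any site can then verify it without learning who the holder is. This is illustrative only, not OpenAge’s actual AgeKey protocol (real AgeKeys are built on FIDO2 passkeys); every name here is hypothetical, and the example leans on the Python cryptography package.

```python
# Conceptual sketch only; not OpenAge's actual AgeKey protocol.
# An issuer signs a bare age claim once, the attestation lives on the
# user's device, and any site can verify it against the issuer's public
# key without ever learning who the user is.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Issuer side: after a one-time age check, sign a claim carrying no identity.
issuer_key = Ed25519PrivateKey.generate()
issuer_pub = issuer_key.public_key()

claim = b"age_over:18"                # no name, no ID, just the threshold met
attestation = issuer_key.sign(claim)  # stored locally on the user's device

# Site side: verify the signature and learn only that *some* verified
# device holds an 18+ credential, not whose device it is.
def site_accepts(claim: bytes, attestation: bytes) -> bool:
    if claim != b"age_over:18":
        return False
    try:
        issuer_pub.verify(attestation, claim)
        return True
    except InvalidSignature:
        return False

print(site_accepts(claim, attestation))  # True: door opens, identity stays home
```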

The OpenAge Initiative itself is focused on building something bigger than a single tool: an interoperable, cross-industry, cross-platform framework for age assurance. Any platform or certified verification provider can adopt AgeKey and participate. Sites still decide which methods, providers, and recency rules they accept, while AgeKeys remain optional and free for users.

“OpenAge believes deeply in interoperability and reusability when it comes to age assurance and users’ own data,” said Julian Corbett, head of OpenAge. “Our mandate is to think first and foremost about what users’ needs and rights should be. This includes the right of children to receive age-appropriate experiences and protections from harmful content, and the right of adults to privacy and frictionless access online.”

For anyone watching the slow collision between regulation, privacy, and real-world usability, this is one of those developments worth sitting with for a moment. Not flashy. Just quietly ambitious.

Read More »

Lawmakers Advance Proposal That Could Ban VPN Use for UK Minors

VPN

A ban on supplying VPN services to under-18s could hit web users in the UK following a vote on a law change. And yeah, that sentence alone makes you pause for a second. It has that quiet, slightly unsettling “wait… what?” energy — the kind that sneaks up on you between inbox refreshes and half-finished cups of coffee. You don’t expect something as invisible and ordinary as a VPN to suddenly become headline material, yet here we are.

The House of Lords has now voted for an amendment which would, if approved by the government, ban under-18s from using a Virtual Private Network (VPN). On the surface it sounds tidy and well-intentioned. Underneath it, though, sits a tangle of questions about privacy, enforcement, and how much digital freedom anyone — especially young people — should actually have.

VPN use in the UK has surged since the Online Safety Act was put in place. A VPN is a system which connects somebody’s device – normally a computer or smartphone – to a server in a different location, so the websites that person visits see the server’s IP address rather than their own. Often used by employers to create a network for sharing resources, VPNs can also be used to spoof or hide a browsing location, thereby sidestepping geographical restrictions. Think of it like slipping on a digital disguise — not to vanish completely, but to move through the internet without leaving your name tag on every door you open.

Many people use one for privacy, or for getting around restrictions that websites put on who can visit a page. It can also be useful for allowing people to work from home and still access their workplace’s resources. I’ve leaned on a VPN more times than I can count while traveling or working remotely — one of those quiet tools you only notice when it breaks, or when someone suddenly tells you it might disappear.
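For the technically curious, here is a rough Python illustration of the IP-masking idea, with a SOCKS proxy standing in for a VPN tunnel. It is a sketch under stated assumptions: a real VPN encrypts and routes all device traffic at the network layer rather than per request, the proxy address below is a placeholder, and routing through it requires the requests[socks] extra.

```python
# Rough illustration of IP masking; a proxy stands in for a VPN tunnel here.
# A real VPN encrypts and routes *all* device traffic at the network layer;
# this per-request proxy is just the simplest way to show the effect.
import requests

# Without a tunnel: the website sees your real public IP address.
direct_ip = requests.get("https://api.ipify.org").text

# Through a remote server (placeholder address): the website now sees
# the server's IP address instead of yours.
proxies = {"https": "socks5://vpn-server.example.com:1080"}  # hypothetical endpoint
masked_ip = requests.get("https://api.ipify.org", proxies=proxies).text

print(direct_ip, "->", masked_ip)  # two different addresses
```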

Last week, a Conservative-led amendment to the Children’s Wellbeing and Schools Bill was put forward in the House of Lords, following calls from campaigners including Hollywood star Hugh Grant. It’s one of those oddly modern moments where celebrity advocacy bumps into legislative machinery, and suddenly a policy debate has a familiar face attached to it.

Peers backed, by 207 votes to 159, a ban on providing VPN services to children, over concerns they can be used to bypass age verification restrictions on accessing adult content. The logic is straightforward enough: if age gates exist, lawmakers don’t want easy digital side doors around them.

Changes made by peers to the Bill will be considered by MPs during the process known as ping-pong, when legislation is batted between the Commons and Lords until agreement is reached. The name sounds playful, almost harmless. The outcomes, of course, rarely are.

Separately, in a heavy Government defeat, peers also supported a ban on social media for under-16s. That’s not a small add-on — that’s a tectonic shift in how young people would experience daily life online.

Supporters of the Australian-style ban have argued parents are in “an impossible position” with regard to the online harms their children are being exposed to. And honestly, that rings true. Anyone who’s watched a teenager scroll endlessly into the night knows the uneasy mix of concern, resignation, and quiet panic that can creep in.

Technology Secretary Liz Kendall announced a three-month consultation this week, which will consider the advantages and disadvantages of a ban, as well as possible overnight curfews and actions to prevent “doom-scrolling”, before reporting back in the summer. It sounds like a collective deep breath — gather the facts, test the ideas, try not to rush into something that reshapes daily habits for millions of families.

However, Tory former schools minister Lord Nash, who has spearheaded calls for a ban, argued the late concession simply represented more delay. You can almost hear the frustration between the lines, the sense that patience has already been exhausted.

He said: “The Government’s consultation is, in my view, unnecessary, misconceived and clearly a last-minute attempt to kick this can down the road.”

Proposing an amendment to the Children’s Wellbeing and Schools Bill, the Conservative peer told the upper chamber: “Many teenagers are spending long hours – five, six, seven or more a day – on social media.

“The evidence is now overwhelming as to the damage that this is causing.

“We have long passed the point of correlation or causation. There is now so much evidence from across the world that it is clear that by every measure, health, cognitive ability, educational attainment, crime, economic productivity, children are being harmed.”

He added: “This is going to happen. The only question is when. We have the opportunity to do it now in this Bill, and every day which passes, more damage is being done to children. We must act now.”

A Government spokesperson said: “We will take action to give children a healthier relationship with mobile phones and social media.

“It is important we get this right, which is why we have launched a consultation and will work with experts, parents and young people to ensure we take the best approach, based on evidence.”

And that’s the uneasy tension humming beneath all of this. Everyone wants kids safer online — that part isn’t controversial. But once you start tightening the screws on tools like VPNs and access itself, you’re not just nudging behavior. You’re redefining privacy, autonomy, and who ultimately controls the shape of the internet inside everyday life. The line between protection and overreach gets thin fast.

Maybe this really is the moment lawmakers draw a harder boundary. Maybe it’s another long rally of legislative ping-pong before anything truly changes. Either way, it’s hard to shake the feeling that this debate isn’t really about VPNs at all. It’s about who gets to decide how much freedom we’re willing to quietly trade away — and whether safety, once promised, ever really knows when to stop.

Read More »

Pornhub to Restrict UK Access to Verified Account Holders Starting Feb. 2

Pornhub logo

Something strange is about to happen when a curious UK user types a familiar orange-and-black URL into their browser after Feb. 2. Instead of the usual scroll-and-click routine, the door quietly closes — unless they already built a verified account before the cutoff. Aylo, the parent company behind Pornhub and several other free platforms, says new UK visitors won’t be getting in.

Aylo explained in a statement, “New users in the UK will no longer be able to access Aylo’s content sharing platforms, including Pornhub, YouPorn, and Redtube. UK users who have verified their age will retain access through their existing accounts.”

During a press conference, Aylo VP of Brand and Community Alexzandra Kekesi clarified that anyone who already completed the age verification process — which requires creating an account — will still have access to Pornhub and Aylo’s other free sites. What won’t exist anymore is the on-ramp. No new accounts will be allowed after Feb. 2.

“You will have to use credentials to log in and access your account,” Kekesi said. “Anyone who has not gone through that process prior to February will be redirected elsewhere. Their journey on our platform will start and end there.”
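To picture the mechanics, here is a hedged sketch of the access rule being described; it is not Aylo’s actual implementation, and the account store, cookie name, GeoIP shortcut, and redirect target are all hypothetical stand-ins.

```python
# Hedged sketch of the access rule described above; not Aylo's actual code.
# UK visitors get through only with an account verified before the cutoff;
# every other UK visitor is redirected elsewhere.
from flask import Flask, redirect, request

app = Flask(__name__)
VERIFIED_UK_ACCOUNTS = {"user123"}           # accounts created before Feb. 2
INFO_PAGE = "https://example.com/uk-notice"  # hypothetical redirect target

def country_for(ip: str) -> str:
    # Placeholder lookup; production systems use a GeoIP database.
    return "GB" if ip.startswith("81.") else "US"

@app.route("/")
def gate():
    if country_for(request.remote_addr or "") == "GB":
        user = request.cookies.get("session_user")
        if user not in VERIFIED_UK_ACCOUNTS:
            # The journey "starts and ends" here for unverified UK visitors.
            return redirect(INFO_PAGE)
    return "content"
```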

Back in June 2025, Aylo rolled out age assurance tools designed to meet government requirements under the UK’s Online Safety Act. At the time, Kekesi even praised Ofcom’s framework, calling it “the most robust in terms of actual and meaningful protection we’ve seen to date.” There was cautious optimism then — the kind you get when a system feels imperfect but workable.

That optimism has faded. At Tuesday’s press conference, Kekesi said Aylo now views the OSA as fundamentally broken. Sites remain “very accessible” to minors, she said, while traffic simply flows to noncompliant platforms that dodge enforcement altogether, making compliance at scale a mirage. She also pointed out that most adult sites still don’t comply with the law, and warned that the system raises “considerable privacy issues” and exposes users to data breaches. It’s one of those uncomfortable moments where a rule meant to protect ends up creating new vulnerabilities.

“We can no longer participate in the flawed system that has been created in the UK as a result of the OSA,” Kekesi said.

Solomon Friedman, partner and VP for compliance at Ethical Capital Partners — the firm that acquired MindGeek in 2023 and rebranded it as Aylo — took a more hands-on approach during the briefing. From a UK IP address, he searched “free porn” to show how quickly unverified sites appear, even as Pornhub requires age verification. It was a simple demo, but the kind that lands like a thud.

“As new sites continue to pop up that are noncompliant, they simply repopulate and move higher in the Google ranking,” Friedman said.

He added that sites ignoring age assurance rules often ignore other safeguards as well — including measures meant to prevent CSAM and intimate image abuse. The problem doesn’t stay neatly contained in one policy lane.

“This law by its very nature is pushing adults and children alike to the cesspools of the internet,” Friedman warned.

In regions where Pornhub has already been forced to implement age verification, the platform has seen traffic drop by as much as 80%, as users chase free content elsewhere. Anyone who’s ever watched internet habits shift overnight knows how fast a crowd migrates when friction shows up.

Friedman stressed that Ofcom itself isn’t the villain in this story, saying the regulator has been acting in good faith, consulting with industry stakeholders and taking enforcement seriously.

“You have a dedicated regulator working in good faith,” he said. “But unfortunately, the law they are operating under cannot possibly succeed.”

He returned to a position Aylo has been repeating for some time: the only realistic way to keep minors away from adult content is device-level age assurance, not site-by-site gates.

“Microsoft, Apple and Google all have very robust built-in parental controls,” he pointed out. “Those are device-based controls that operate regardless of whether or not the site that is being accessed is compliant. The only thing needed is a mandate that these controls be activated by default.”

Right now, he noted, those protections are still “opt-in, not opt-out.” In other words, they exist — but only if someone actively turns them on. Human nature being what it is, that’s a fragile bet.

Friedman demonstrated how device-level tools like Google’s SafeSearch can block access to adult content even when a VPN is in use, and urged major tech companies to “do the right thing proactively” or risk being “forced to do the right thing by government.”

When asked whether shifting responsibility to big tech simply pushes the problem onto someone else, Friedman framed it as a question of what actually works in the real world.

“This is not a matter of shifting responsibility to anyone,” Friedman said. “When access is controlled at the device level, it’s efficient, it’s effective, it’s privacy-preserving, it gets the job done. It just works.

“Human behavior is why these laws are failing,” Friedman added. “Legislate not contrary to human behavior, but consistent with human behavior online — and that is at the device level.”

A company representative also confirmed that outside the UK, Aylo still plans to participate in the European Commission’s pilot program for its “white label” age verification app — a reminder that this debate isn’t settling anytime soon. If anything, it’s just changing shape, like water finding the next crack in the pavement.

Read More »

Arizona is Just Full of Great Ideas (Again) by Stan Q. Brick

Arizona flag

Ah, Arizona – my home state. Famous for the geological splendor of the Grand Canyon, the ancient beauty of the Petrified Forest, the marvel of engineering that is the Hoover Dam – and some of the dumbest legislative proposals ever to see the light of day.

Right about seven years ago, an Arizona State Senator proposed to tax porn to fund the construction of a wall on our southern border. Thankfully, that bill died on the vine, deemed a non-starter even by a legislature that once seriously toyed with the notion of making it illegal to “offend or annoy” people using an “electronic or digital device.”

Here’s hoping that HB 2900, one of the latest brain farts to emanate from the state’s socially conservative corner, meets the same fate as some of its even dumber predecessors, never making it to the floor for a full vote. Because, while I’d like to say I’m confident the bill would wither under court scrutiny if it made it all the way to being signed by the state’s current governor (a doubtful proposition, itself), who the hell knows anymore?

The ‘good news’ about HB 2900, if there is any, is that the bill at least wouldn’t make it a crime punishable by prison time to make or sell porn in the state. It would merely make it prohibitively expensive to do so and potentially ruinous for any business charged with having the temerity to sell porn to those who wish to watch it, through its onerous fines of up to $10,000 per day (and $10,000 per instance).

Oh, and it’s not just the attorney general who could bring action against those who run afoul of the law; the state’s private citizens could get in on the act, as well. I can just imagine how enthusiastic the state’s judiciary must be about the prospect of hearing dozens of lawsuits with titles like Johnson v. Hustler or Peters v. Pornhub because clearly these judges have nothing better or more important to do than sit in a courtroom and hear testimony about how pornographers are ruining the state almost as surely as the undocumented immigrants who handle everyone’s landscaping and roofing needs.

Look, I get it: A lot of people in Arizona (and elsewhere) don’t approve of porn and several of those people are highly religious and socially conservative folks who also happen to hold seats in the state’s legislature. I happen to be an atheist, but you don’t see me going around advocating for suing people for proselytizing, or banning sale of the Bible, or (shudder) running for office so I can attempt to impose my libertine values on the rest of my fellow citizens under the color of state law.

As I mentioned earlier, one small silver lining in the heaping pile of shitty cloud that is HB 2900 is this bit of limiting construction: “This section does not…Impose civil liability on an individual solely for the private possession, private viewing or private receipt of pornography.” But this stipulation raises questions for me, too.

If porn is so awful, morally ruinous and threatening that the legislature must take measures to dissuade businesses from making and selling porn in Arizona, why would the legislature allow people to freely possess and view it? Or, to position the same sort of question in reverse, if it’s OK to possess and watch porn, why are this bill’s sponsors so eager to punish the people who make it possible for the state’s citizens to possess and view it?

Also, if this bill is limited to “commercial entities,” as it appears to be, does that mean I can film and distribute porn in Arizona, so long as I don’t charge anything for my product? Can I now finally fulfill my lifelong dream of being a mass porn donor?

Ideally, I will never have occasion to learn the answers to any of the above questions, because HB 2900 will quietly slip into the dustbin of history, alongside its wall-funding and anti-annoyance predecessors. I suppose we’ll know soon enough, as the Arizona legislature will close its current session sometime this summer.

Read More »

FTC Schedules Jan. 28 Age Verification Panel Without Adult Industry Representation

FTC logo

WASHINGTON—There’s something oddly quiet about a room that’s supposed to be filled with experts. You can almost hear the missing voices echo. That’s the feeling hanging over the Federal Trade Commission’s upcoming expert panel on age verification and its federal implications — a conversation that, on paper, looks serious and important, but somehow feels strangely incomplete.

Taking place at the Constitution Center, 400 7th St SW, Washington, D.C. 20024, on January 28, the panel will feature experts in age verification and online child safety, including representatives from organizations that have been openly critical of existing age verification proposals currently before Congress.

Glaringly, no companies or stakeholders from the adult entertainment industry or adjacent sectors were invited to sit on the panel.

Juliana Gruenwald Henderson, deputy director of the FTC’s Office of Public Affairs, declined to elaborate on why the panelists don’t include adult industry stakeholders, such as the Free Speech Coalition. She simply said in an email, “No comment.” It’s the kind of answer that lands with a thud — short, closed, and oddly loud in its silence — and it inevitably raises the question of whether bias is at play.

An anonymous source at the FTC confirmed that the decision not to involve adult industry firms was due to bias.

Speculation only grows when key stakeholder groups share that the adult industry was iced out of planning and programming for the panels tied to the FTC event. That kind of exclusion has a way of shaping outcomes before the first microphone is even turned on.

A spokesperson for the Free Speech Coalition confirmed that the FTC never reached out to senior leadership at the coalition for input or to appear on Jan. 28. Not even a courtesy call. Nothing.

Expanding on the concerns of bias at the FTC, First Amendment attorney Corey Silverstein said in a text that the agency’s behavior in this situation doesn’t exactly come as a shock.

“This is a long-expected and goes hand in hand with the eventual federal age verification law,” Silverstein warned. “The FTC will be tasked with being the enforcement arm for federal age verification, and they want to be ready to hit the ground running.”

“I have no doubt that the FTC wants to determine for themselves which age verification technologies and providers they will deem acceptable,” he added.

Panelists named include several experts connected to organizations that have actively lobbied for age verification laws under the banner of “protecting minors.” That phrasing always sounds comforting — who could possibly argue with protecting kids? — but the motivations and downstream effects tend to get a lot messier once you zoom out.

For example, Clare Morell sits on a panel session called “Navigating the Regulatory Maze of Age Verification.” Morell is a fellow at the Ethics & Public Policy Center.

The center played a central role in Project 2025, an effort organized by the conservative Heritage Foundation. Project 2025 — tied to the administration of President Donald Trump — has openly pushed proposals aimed at banning pornography and revoking First Amendment protections for the category altogether. That context matters, even if it sometimes gets tucked quietly into the footnotes of policy discussions.

Also featured on the panel is Iain Corby, executive director of the Age Verification Providers Association (AVPA). Corby has long been a divisive figure in the adult industry space. He’s publicly supported free expression for adult companies — yet frequently aligns himself with anti-pornography groups. It’s the kind of tightrope act that leaves both sides uneasy.

Other stakeholders on the FTC panel include state-level regulators and lawmakers. For instance, Katherine Haas, director of the Utah Department of Commerce Consumer Protection Division, will speak on the same panel. Haas played a key role in the FTC complaint that was later settled against Aylo, the parent company of Pornhub, which maintains a U.S. headquarters in Austin, Texas, and operational hubs in California, primarily Los Angeles and San Diego.

Last September, Aylo reached a $5 million settlement with the FTC and the State of Utah tied to CSAM allegations stemming from compliance issues that predated the company’s acquisition by Ottawa-based private equity firm Ethical Capital Partners. Aylo did not respond to a request for comment on whether the company had been invited to speak on the FTC panel. It’s also worth noting that the only other stakeholders involved appear to be representatives from major technology firms and trade organizations, including officials from Google, Meta Platforms (Facebook and Instagram), Apple, and age verification companies such as industry heavyweight Yoti.

One expert, however, stands out from the rest of the lineup: Jennifer Huddleston, a technology fellow at the libertarian-leaning, free-market Cato Institute in Washington, D.C. True to the organization’s long-standing commitment to the First Amendment, Huddleston has openly raised concerns about age verification and the quiet social contract we’re being asked to sign — trusting massive corporations with deeply sensitive personal data just to exist online.

“There are broader debates about how to encourage the potentially beneficial uses of technology while protecting kids and teens from potential harms, but an approach that would require all users to verify their age or identity when logging on not only fails to resolve the concerns about kids’ and teens’ technology use but also creates a range of pitfalls related to privacy and speech for users of all ages,” Huddleston wrote in a 2025 op-ed for the Dallas Morning News.

And that’s the tension humming underneath this entire event: protection versus privacy, safety versus speech, certainty versus the uncomfortable gray areas we’d rather not sit with. When certain voices never make it into the room, the conversation doesn’t just become narrower — it risks becoming rehearsed. Sometimes the loudest warning isn’t shouted at all. It’s the silence you can’t quite ignore.

Read More »

Arizona State Lawmaker Pushes Porn Ban Proposal

Arizona flag

PHOENIX — There’s something jarring about waking up to the idea that an entire category of human expression could suddenly become illegal. Not regulated. Not filtered. Not nudged behind another age-gate wall. Just… gone. That’s the direction Arizona may be staring down after a member of the state House introduced a bill Wednesday that would make it illegal to produce or distribute adult content anywhere in the state.

Republican Rep. Khyl Powell’s HB 2900 would impose civil penalties for producing, publishing, selling, offering for sale, or commercially distributing pornography in Arizona, including via websites or digital services. It’s written in that broad, sweeping legislative language that always makes me pause — the kind that doesn’t just touch the margins, but tries to redraw the whole map.

The bill allows for civil penalties of up to $10,000 per violation, or per day in violation of the law. While the bill assigns enforcement to the state attorney general, it also allows private individuals to pursue civil action “in the name of” Arizona, in cases where the attorney general does not do so first. That little clause is the one that tends to keep lawyers up at night — the quiet invitation for citizens to become enforcers, neighbors to become watchdogs, and courtrooms to become battlegrounds.

Arizona already has an age verification law on the books, which took effect Sept. 26. But a total ban? That’s a different animal entirely. It bumps straight into long-standing legal precedent recognizing adult content as protected speech under the First Amendment. You can almost hear the constitutional gears grinding as soon as the idea hits the page.

Still, the push to ban adult content hasn’t come out of nowhere. Over the past year, it’s been popping up like political whack-a-mole across the country. In January 2025, an Oklahoma state senator introduced a bill that would criminalize all adult content and authorize the state to imprison those who create or even view it. In May, Republican Senator Mike Lee of Utah rolled out federal legislation aimed at redefining nearly all visual depictions of sex as obscene and therefore illegal — a goal also laid out in the Heritage Foundation’s Project 2025 policy blueprint, which has heavily guided the Trump administration’s agenda. Then in September, Michigan lawmakers floated a bill that would ban distributing pornography online in that state and require internet service providers to install filtering technology to block access for residents. It starts to feel less like coincidence and more like a coordinated drumbeat.

So far, though, reality has had a way of slowing the march. All three of those proposals appear to have stalled in their respective legislatures. HB 2900 has now been referred to the Arizona House Commerce and Rules committees, where it will either gather momentum — or quietly fade into the familiar legislative limbo. Either way, the bigger question lingers in the air: how far can lawmakers push before the Constitution pushes back? Sometimes the laws we propose say just as much about our fears as they do about our values.

Read More »

Yet Another Version of the “PROTECT Act” Introduced, by Morley Safeword

Section 230

Add Congressman Jimmy Patronis (R-Fla.) to the list of elected officials hellbent on repealing Section 230 of the Communications Decency Act.

In a press release issued January 14th, Patronis celebrated his introduction of H.R. 7045, AKA the “Promoting Responsible Online Technology and Ensuring Consumer Trust” (PROTECT) Act.

The argument Patronis made in support of his proposal is a well-worn one, rooted in the notion that Section 230 enables evil tech platforms to ruin America’s children by shielding those platforms from liability for things published by third parties.

“As a father of two young boys, I refuse to stand by while Big Tech poisons our kids without consequence,” Patronis said. “This is the only industry in America that can knowingly harm children, some with deadly consequences, and walk away without responsibility. Big Tech is digital fentanyl that is slowly killing our kids, pushing parents to the sidelines, acting as therapists, and replacing relationships with our family and friends. This must stop.”

There’s a reasonable argument to be had about whether the courts have extended Section 230’s coverage too far in some cases, but to hear people like Patronis tell it, the statute’s safe harbor provision allows “Big Tech” to do anything it pleases with total impunity.

“These companies design their platforms to hook children, exploit their vulnerability, and keep them scrolling no matter the cost,” Patronis added. “When children are told by an algorithm, or a chatbot, that the world would be better without them, and no one is being held responsible, something is deeply broken. I bet they would actually self-police their sticky apps and technologies if they knew they would have to pay big without the Big Tech Liability Protection of Section 230.”

In his press release, Patronis claims that “Section 230 shields social media companies and other online platforms from liability for content published on their sites.” This claim is a half-truth, at best. Section 230 shields social media companies from liability for content published by others on their sites. That’s an important distinction, not a distinction without a difference.

Let’s try a thought experiment: Suppose you’re a congressman whose website permits users to post comments in response to things you post on the site. Suppose further that one of your site’s users decides to post something defamatory about one of your colleagues. Would you want to be held directly liable for that comment? How about if, instead of something defamatory, the user posted something patently illegal, like an image of a child being sexually abused; is Patronis saying my hypothetical congressman ought to go to prison in that scenario?

There are many reasons why groups like the Computer and Communications Industry Association (CCIA) are against the repeal of Section 230 – and yes, one of those reasons is that the CCIA is funded by everyone’s current favorite boogeyman, Big Tech. Another, more important reason is that the people behind the CCIA can see where this is all heading if Section 230 is outright repealed and no safe harbor at all is provided for those who offer forums in which users can publish their content and comments.

“In the absence of Section 230, digital services hosting user-created content, including everything from online reviews to posts on social media, would risk constant litigation,” the CCIA asserted in an analysis published January 12th. “Continuing to provide services optimized for user experience would require massively increased legal expenses.”

How massively would those legal expenses increase? The CCIA said, given the sheer volume of user-generated posts published in a year, if “just one post or comment in a million led to a lawsuit, digital services could face over 1.1 million lawsuits per year following a Section 230 repeal.”

“A single lawsuit reaching discovery typically costs over $100K in fees, and sometimes much more,” CCIA correctly noted. “If companies face 1.1 million lawsuits, that’s $110 billion in legal costs annually.”
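For what it’s worth, the arithmetic holds up, and it implies a base figure the analysis leaves unstated: 1.1 million lawsuits at a one-in-a-million rate presumes roughly 1.1 trillion posts and comments a year. A quick back-of-the-envelope check in Python:

```python
# Back-of-the-envelope check of the CCIA's figures.
lawsuits_per_year = 1.1e6    # CCIA estimate after a Section 230 repeal
cost_per_lawsuit = 100_000   # typical cost of a suit reaching discovery

implied_posts = lawsuits_per_year * 1_000_000  # at one suit per million posts
annual_cost = lawsuits_per_year * cost_per_lawsuit

print(f"{implied_posts:.1e} posts/year")  # 1.1e+12, i.e. ~1.1 trillion
print(f"${annual_cost:,.0f}")             # $110,000,000,000
```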

I suppose those who say Big Tech is the devil (while using the platforms enabled by Big Tech to say so) might think this is a good thing, but I’m not sure they’ve thought it all the way through. If social media platforms can’t operate due to overwhelming legal costs, we lose all the good things about social media, too – not to mention a whole lot of jobs when those platforms inevitably go out of business.

From the perspective of the adult industry and those who enjoy adult entertainment, repealing Section 230 would likely spell the end of platforms allowing adult content creators to post self-produced content, as well. What platform would want to risk being held strictly liable for anything and everything depicted in the videos and photos adult creators produce? It would be absolute madness for platforms like OnlyFans and its competitors to maintain their current business model in the absence of Section 230 safe harbor.

Again, for those who think porn should be abolished, that development might be seen as a feature and not a bug where the idea of repealing Section 230 is concerned. But extend that same outcome to some platform they DO like – YouTube, TikTok, Facebook, Instagram, X or what have you – and they might not like the collapse quite as much.

From where I sit, the idea of repealing Section 230 should be accompanied by that old standby of a warning: “Be careful what you wish for, because you might just get it.”

Read More »

Florida Lawmaker Introduces New Bill to Repeal Section 230

Section 230

WASHINGTON — It starts with that familiar little jolt in the gut — the kind you get when a political idea lands a little too close to home. Rep. Jimmy Patronis of Florida has become the latest member of Congress to float legislation that would repeal Section 230 of the Communications Decency Act, the rule that shields interactive computer services — including adult platforms — from being held responsible for user-generated content. One of those moments where you pause mid-scroll and think, Oh… this could get interesting. Or messy. Or both.

Patronis introduced HR 7045 in the House of Representatives earlier this week, slipping it into the legislative bloodstream where big ideas tend to either explode or quietly mutate over time. Sometimes you can almost hear the gears grinding behind the scenes.

A statement posted on his website declared, “For too long, the law has prevented parties harmed by online content from obtaining relief. Instead of protecting our younger generations from sensitive content, these sites prioritize profit over safety while continuing to push out harmful, explicit, and dangerous materials without any accountability.” Strong words, the kind that land heavy and don’t really leave much room for nuance.

Would-be reformers on both sides of the aisle have been taking swings at “Big Tech” for years now, accusing platforms of profiting off illegal and harmful content while hiding behind legal shields. The idea is to force companies to moderate more aggressively by making them legally responsible for what users post. Meanwhile, right-wing critics argue the same rule lets platforms censor conservative voices, and they want limits placed on how much moderation power these companies can wield. It’s like watching two very different fires being fueled by the same match.

Back in December, two other repeal bills were already making their way through Congress: HR 6746, the Sunset to Reform Section 230 Act, which would amend the law by simply adding, “This section shall have no force or effect after December 31, 2026,” and S 3546, which calls for a full repeal of Section 230 two years after enactment. The clock imagery alone makes you feel like something is quietly counting down in the background.

Industry attorneys and advocates, though, have been sounding alarms. They worry that once lawmakers start tinkering with Section 230, it opens the door to a patchwork of carve-outs — the kind that slowly chip away at protections, much like what happened with FOSTA/SESTA and its exemptions targeting sites that “unlawfully promote and facilitate” prostitution or sex trafficking. It’s rarely just one small change, is it? It’s the domino effect.

A carve-out aimed at — or even loosely touching — the adult industry would effectively gut Section 230 for those platforms. That would suddenly make sites hosting user-generated content legally responsible for what users upload, inviting a flood of civil lawsuits and uncertainty. And once that door cracks open, it’s hard not to wonder how wide it eventually swings.

Read More »