Commentary

There’s a “Porn Lesson” to Take from Lindsey Vonn’s Olympic Experience (No, Really) by Stan Q. Brick


When champion skier Lindsey Vonn suffered a terrible crash on what turned out to be her final run in the women’s downhill at the Winter Olympics in Milan earlier this month, there may have been a few people out there thinking she shouldn’t have been permitted to run the race at all, given that she was already competing with a torn ACL in her left knee. But if a significant number of people felt that way, they mostly kept it to themselves.

Instead, the dominant reaction to Vonn’s knowing acceptance of added risk has rightfully been to praise her bravery, determination and champion spirit. As Madison Chapman wrote for Newsweek, “Winner or not, Vonn is the ideal Olympic champion. Her grit and resilience helped me shed my own fear of risk and learn to see myself as a champion over adversity after my cancer treatment and subsequent knee injury. She may not have clinched gold, but Lindsey Vonn reminded us all how to live.”

I’ve always been fascinated by the way people view the act of taking a physical risk, be it in the context of competitive skiing, climbing a mountain or something as fundamental as managing one’s personal health. I’ve long believed that the question of whether something is safe to do is a different question from whether one ought to be allowed to do it. As I see it, it’s not complicated; adults should be allowed to take informed risks – including a litany of risks I would never take, myself.

Doubtless, one reason Vonn found so much support for her decision is the competitive context. She was attempting to win a gold medal, an achievement for which there’s a very limited window of opportunity, one that only comes around every four years – and only for so many cycles in an athlete’s career.

Make no mistake, though; the reason Vonn’s decision, her injuries and the Olympic Games themselves are global news is that sports are popular entertainment – and big business.

In other words, while we support Vonn’s chosen form of risk taking because competition is deemed a worthy enterprise by a significant portion of the human population, we also support it because we accept, at least in the context of sport, that people have a right to risk bodily harm in the process of entertaining other people.

We’re not consistent about this acceptance of risk for entertainment’s sake, of course. The response to people taking risks in the context of porn is less enthusiastic. Sometimes it inspires proposals specifically designed to deter people from plying their trade in adult entertainment.

I’m not saying I think social media should light up with words of encouragement every time a porn star gets nominated for an award, or when an adult content creator releases a new clip (although that would be nice). But maybe, if society can applaud people for risking grievous bodily harm while competing on the Olympic stage, society can at least also manage to avoid shaming people and subjecting them to paternalistic government regulation when the risks they take involve other, less celebrated forms of entertainment.


Debanking Explained: How Politics, Policy, and Perception Shape Account Closures


The Cato Institute has published a report on debanking, which you can find here: https://www.cato.org/policy-analysis/understanding-debanking-evaluating-governmental-operational-political-religious

Here’s what the report is all about:

Debanking can be a frustrating and deeply unsettling experience. One day everything seems fine, and the next, a notice arrives giving just 30 days to withdraw funds and find a new financial institution. Confusion quickly turns into anxiety. Bills were paid, nothing appeared out of the ordinary — so what changed? A call to the bank rarely brings clarity, and the response is often the same: no further details can be provided. Customers are left with more questions than answers.

On the other side of the conversation, bankers are frequently constrained by strict confidentiality requirements. Even frontline staff may not have access to the underlying reasons for an account closure. Financial institutions operate within a framework of anti–money laundering, know your customer, and countering the financing of terrorism regulations — commonly referred to as AML, KYC, and CFT. While these rules are standard practice within the industry, they remain largely invisible to the public, creating a disconnect that fuels frustration on both sides.

For those looking to address the growing concern around debanking, some argue that meaningful change will require greater transparency. That could mean reconsidering the confidentiality that surrounds account closures, removing reputational risk as a regulatory factor, and reevaluating the Bank Secrecy Act framework that effectively places financial institutions in the role of investigative gatekeepers.


Good News: Sometimes Adult Businesses DO Get Treated Like Everyone Else by Stan Q. Brick

Judge's gavel

In a country where it seems like lawsuits get filed at the drop of a hat – particularly if the hat is quite hard, quite heavy and falls on someone’s toes, causing both physical injury and extreme emotional distress – the fact that our courts do make plaintiffs jump through at least a minimal set of hoops can be something of a comfort.

For example, if I get into a fender bender with someone in California, but that person lives in New York, they probably can’t haul me into court in New York to make me face a lawsuit there, simply because New York happens to be the plaintiff’s state of domicile. They’d likely have to sue me in California, due to the way the courts handle the question of personal jurisdiction.

As you may have heard, a district court in Kansas applied this logic in dismissing a couple of lawsuits filed against companies that operate adult websites, in which a plaintiff alleged the sites were not complying with the state’s age verification requirements.

Among other things, the judge in the case, U.S. District Judge Holly Teeter, wrote in her decision dismissing a lawsuit against Titan Websites that “merely intending that users accessing its content be able to do so from a wide geographic area is not the same as purposefully directing one’s activities at a forum.”

“Technical steps taken to make a universally accessible website easier for all users to access no matter where they are located is no more purposeful direction than the act of setting up the website in the first place,” Teeter added. “And just like the act of setting up a website, were the indiscriminate use of a CDN or other technologies to indiscriminately facilitate content delivery enough, ‘then the defense of personal jurisdiction, in the sense that a State has geographically limited judicial power, would no longer exist.’”

Teeter also wrote that her reasoning “does not mean that a website owner’s use of a CDN is never relevant” and “does not mean that a website owner’s use of a CDN could never show purposeful direction.”

“It does mean that more is needed to determine how the CDN is used and whether the CDN is being used to target a forum or an immediate region of which the forum is a part,” Teeter wrote. “The Court need not dissect the contours to resolve this case. Here, Plaintiff simply alleges that a CDN is being used and that the CDN has servers near the forum because logically it must. Defendant responds with evidence that it uses a third-party web-hosting service and that it does not know or care where the CDNs are located. This record is not enough to carry Plaintiff’s admittedly light burden.”

The dismissal of this case, as well as Teeter’s decision dismissing an identical case against a company called ICF Technology, is certainly good news for other adult businesses that might find themselves haled into court over alleged violations of a state’s age verification law. They are not, of course, the end of the story.

The plaintiff is likely to appeal these decisions, whereupon the matter will go to the Tenth Circuit Court of Appeals. I’m no lawyer and I don’t have much to offer in terms of a prediction as to how the Tenth Circuit might ultimately rule. I just know that I don’t have much confidence in how the next court up the chain, the U.S. Supreme Court, might rule, should they take up the question.

Given that SCOTUS found the age verification law passed in Texas to be constitutional, it wouldn’t surprise me one bit if the justices decided that merely being accessible in a state creates a sufficient “minimum contact” for a court there to assert personal jurisdiction.

Still, at least for the time being, Teeter’s decisions represent something of a victory for the porn side of the War on Porn. Whether that victory is lasting or ephemeral remains to be seen. Fingers crossed.


Adult Creators Keep Getting Debanked — And the Fallout Goes Far Beyond Them


Your bank may never send you a memo about it, but it’s quietly shaping your life.

Every time you click “buy now,” a small army of institutions decides whether that purchase gets to exist. And for adult creators, that army has been steadily tightening its grip. For years, people in the industry have been warning about financial discrimination and debanking — the sudden closure of accounts, the polite but devastating “we can no longer do business with you.” It’s happening more often now. And it’s happening quietly.

“I don’t know what could happen next or when it might happen,” says adult VTuber, journalist, and activist Ana Valens. In just two weeks last November, nearly every platform she relied on either removed her content or suspended her outright. “While my Patreon and Ko-fi were reinstated, I’ve spent the past two months waiting for the other shoe to drop — another Patreon ban, my PayPal deactivated, and so on.” She reached out for explanations. Most platforms couldn’t clearly articulate how she’d violated their terms. Ko-fi didn’t respond until repeated messages finally led to reinstatement.

That kind of uncertainty lingers. It’s like walking on ice that might crack at any moment.

“Deplatforming and debanking are an occupational hazard for any adult content creator,” says Gina, a co-founder of PeepMe, a startup that set out to build a worker-owned creator marketplace. PeepMe was imagined as an alternative to OnlyFans and Patreon — a space where creators could hold equity, elect a democratic board, and receive quarterly profit-sharing dividends.

Gina requested that a pseudonym be used, given her continued work adjacent to the adult industry and the very real fear of financial fallout. “Even still, I’ve never seen someone banned on so many sites before [as Ana has been],” she says.

And it’s not just adult creators feeling the pressure. Companies in oil and gas, cryptocurrency, tobacco, and firearms have also raised concerns about politically motivated debanking. The pushback has grown loud enough that U.S. regulators are now stepping in, attempting to rein in financial discrimination.

Who’s Blocking My Buying?

When you make an online purchase, your money doesn’t travel in a straight line. It passes through layers of gatekeepers. The pipeline often looks like this:

  1. Platform (merchant) websites: where creators earn income — YouTube, Patreon, Etsy, DoorDash, Steam.

  2. Payment processors: companies that route the transaction between card networks and banks — PayPal, Stripe.

  3. Card networks: Visa, American Express, Mastercard — the rule-makers that standardize how buyers and sellers interact.

  4. Your bank and the seller’s bank: Wells Fargo, Bank of America, and so on.

Each step has discretion. Beyond preventing illegal activity, these institutions can decide what kinds of money they’re willing to touch.
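That chain of discretion can be sketched as a series of independent veto points. The sketch below is purely illustrative (the stage names, rules, and transaction fields are invented for this example, and no real payment API works this way), but it captures the structural point: any single layer can block a transaction that every other layer would have allowed.

```python
# Purely illustrative model of the payment pipeline described above.
# The stage names, rules, and transaction fields are invented for this
# sketch; no real payment system exposes an interface like this.

GATEKEEPERS = [
    # Each stage is (name, predicate); the predicate returns True if the
    # stage is willing to let the transaction through.
    ("platform",     lambda tx: tx["content"] != "banned_by_platform"),
    ("processor",    lambda tx: tx["content"] != "banned_by_processor"),
    ("card_network", lambda tx: tx["content"] != "brand_risk"),
    ("bank",         lambda tx: tx["amount"] <= tx["account_limit"]),
]

def settle(tx):
    """Return (approved, refusing_stage): any single stage can veto."""
    for name, allows in GATEKEEPERS:
        if not allows(tx):
            return False, name
    return True, None

# A transaction objectionable only to the card network still fails
# end to end, even though every other layer would have allowed it.
approved, stage = settle(
    {"content": "brand_risk", "amount": 20, "account_limit": 500}
)
```

The order matters too: a creator who clears every platform rule can still be cut off by a processor or network decision made two layers away, which is why refusals so often arrive with no explanation the frontline party can give.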

“The rules set by card networks are sometimes vague,” says Dr. Val Webber, a postdoctoral researcher at Dalhousie University’s Sexual Health and Gender Research Lab. Mastercard’s June 2025 rules restrict “any Transaction that […] in the sole discretion of [Mastercard], may damage the goodwill of [Mastercard] or reflect negatively on the [brand].”

“In the sole discretion” is doing a lot of work there.

Last summer, Steam and itch.io removed or deindexed adult games after pressure from payment processors and card networks. Steam cited pressure from Mastercard, conveyed through processors like Stripe. Stripe told itch.io, “Stripe is currently unable to support sexually explicit content due to restrictions placed on them by their banking partners, despite card networks generally supporting adult content.” Stripe’s prohibited business list includes “pornography and other mature audience content (including literature, imagery, and other media) designed for the purpose of sexual gratification.”

Mastercard later denied involvement. In August 2025, the company stated, “Mastercard has not evaluated any game or required restrictions of any activity on game creator sites and platforms, contrary to media reports and allegations.”

Meanwhile, Valens saw her articles disappear from Vice. “My suspicion is that it was easy for a financial company to flag me as high risk as a punitive measure for my content, or my activism work,” she says. Attempts to obtain comment from Vice were unsuccessful.

Who Can Get Debanked?

“We have lots of data to show that people in the adult industry face financial discrimination in the form of their accounts being closed, being denied mortgages, business loans, and other banking services — despite banks often not being able to substantiate legal reasons related to these individual accounts,” says Maggie MacDonald, a PhD researcher at the University of Toronto.

The tension escalated in December 2020 when Visa and Mastercard cut ties with Pornhub, citing child sexual abuse material (CSAM). “Our adult content standards allow for legal adult activity created by consenting individuals or studios,” Mastercard said at the time. “Merchants must have controls to monitor, block and remove unlawful content from being posted.” Pornhub denied hosting illegal content and emphasized the harm to “the hundreds of thousands of models who rely on [their] platform for their livelihoods.”

But here’s the inconsistency that nags at people: X continues to process payments despite widespread reports of CSAM and non-consensual deepfake content. No sweeping financial freeze there.

Watching major platforms lose payment relationships makes smaller startups tread lightly. “We just can’t afford to lose our ability to do business with these financial companies,” Gina says. “Stripe takes only 2.9 percent from businesses they’re willing to work with, while high-risk processors willing to take on adult content can charge up to 15 percent.”

That difference can sink a company before it starts.
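Using the rates Gina quotes (2.9 percent versus up to 15 percent) and an assumed figure of $10,000 in monthly sales, chosen purely for illustration, a quick back-of-the-envelope sketch shows how much of the gap is pure processing cost:

```python
# Back-of-the-envelope comparison of processing fees, using the rates
# quoted in the article. The $10,000 monthly sales figure is an
# assumption chosen purely for illustration.

def net_revenue(gross: float, fee_rate: float) -> float:
    """Revenue left after the processor takes its percentage fee."""
    return gross * (1 - fee_rate)

gross = 10_000.00                        # assumed monthly sales
mainstream = net_revenue(gross, 0.029)   # mainstream processor, 2.9%
high_risk = net_revenue(gross, 0.15)     # high-risk processor, 15%
extra_cost = mainstream - high_risk      # extra fees paid per month
```

On those assumed numbers, the high-risk processor costs roughly $1,210 more per month in fees alone, before any of the other expenses of running the business.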

“Losing a relationship with card networks is a risk payment processors can’t afford, and losing relationships with payment processors is a risk that platform websites can’t afford,” explains Webber. “In the end, the responsibility of ensuring their content stays within the lines of these oftentimes unclear rules trickles down to each individual creator. Because ultimately, content creators are more expendable to platforms than payment processors and card networks.”

One justification often cited is chargebacks — when customers reverse credit card transactions. Gina isn’t convinced.

“Locking out entire industries makes less and less sense as fraud detection technology advances,” she says. “Payment processors and card networks already have processes to step in when an individual business has a high rate of chargebacks, there’s no reason to block out a whole industry.” Mastercard recently announced expanded generative AI fraud-detection tools, building on already sophisticated monitoring systems.

“We also haven’t seen the claim of high-chargebacks in adult content substantiated anywhere in terms of measured data,” adds MacDonald. “As a researcher, that makes me suspicious of the criteria these companies are using behind the scenes.”

The Evolving Landscape of Banking Regulations

In February 2025, the Free Speech Coalition filed a statement with the U.S. House Committee on Financial Services, calling for due process protections, objective risk assessments, and explicit recognition that lawful adult businesses do not inherently present financial crime risk. Blocking entire industries without individualized evaluation, the statement argued, is regulatory overreach with serious implications for free speech.

Multiple efforts are underway in the United States to limit financial institutions from denying service for reasons beyond legal violations. In August 2025, President Donald Trump issued an executive order directing regulators to investigate and reverse politically motivated debanking. Bank regulators have begun removing “reputational risk” from compliance criteria, and proposed Senate legislation would impose civil fines on banks and card networks that avoid entire categories of customers.

“Card networks and payment processors began by blocking pornography, but they’ve moved into other online industries as well,” says Webber. “The line in the sand continues to shift, and it has recently expanded to video game creators and streamers as well. We don’t know how these rules might evolve, and what type of online content might be next.”

Valens has spent months urging customers to call Mastercard, Visa, PayPal, and Stripe to question purchase restrictions and account freezes. Visa points to its policies for combating illegal activity; PayPal requires pre-approval for adult materials, similar to tobacco; Stripe states it does not support adult content.

“Private companies have been deputized to decide how we can earn and spend our money,” says MacDonald. “Anyone who is ideologically misaligned with any of these companies faces the risk of losing their livelihood.”

That’s the part that lingers.

It’s not just about porn, or games, or activism. It’s about the invisible committee that votes on your transactions — and whether one day, without warning, they decide you don’t get a vote at all.


Conservative Lawmakers Push Porn Taxes — Critics Call It Unconstitutional Speech Policing


The war on porn doesn’t look like a war anymore. It looks like a line item on a tax form.

As age-verification laws keep tightening their grip on the adult industry — and, quietly, on the broader idea of free speech online — a Utah lawmaker has proposed something new. Or maybe not new. Just sharper. A bill introduced last month would slap a tax on porn sites operating in the state.

Introduced by state senator Calvin Musselman, a Republican, the bill would impose a 7 percent tax on total receipts “from sales, distributions, memberships, subscriptions, performances, and content amounting to material harmful to minors that is produced, sold, filmed, generated, or otherwise based” in Utah. If passed, the bill would go into effect in May and would also require adult sites to pay a $500 annual fee to the State Tax Commission. Per the legislation, revenue from the tax would be used by Utah’s Department of Health and Human Services to provide more mental health support for teens.

Musselman did not respond to a request for comment.

There’s a certain rhythm to this moment in American politics. Conservative lawmakers across the country are circling adult content with renewed intensity. In September, Alabama became the first state to impose a porn tax on adult entertainment companies (10 percent) after passing age-verification mandates requiring users to upload ID before viewing explicit material. Pennsylvania lawmakers are weighing a bill that would add a 10 percent tax on “subscriptions to and one-time purchases from online adult content platforms,” even though digital products are already subject to a 6 percent sales tax, two state senators wrote in a memo in October. Arizona floated a similar idea back in 2019, when state senator Gail Griffin proposed taxing adult content distributors to help fund the border wall during Donald Trump’s first term. Meanwhile, 25 states have passed some form of age-verification requirement.

It’s not just about taxes. For years, efforts to criminalize or restrict sex work have ebbed and flowed, usually intensifying during moments of heightened online surveillance and censorship. But targeted taxes have struggled to gain widespread traction. Why? Because their legality is murky at best.

“This kind of porn tax is blatantly unconstitutional,” says Evelyn Douek, an associate professor of law at Stanford Law School. “It singles out a particular type of protected speech for disfavored treatment, purely because the legislature doesn’t like it—that’s exactly what the First Amendment is designed to protect against. Utah may not like porn, but as the Supreme Court affirmed only last year, adults have a fully protected right to access it.”

Utah, Alabama, and Pennsylvania are among 16 states that have adopted resolutions declaring porn a public health crisis. “We realize this is a bold assertion not everyone will agree on, but it’s the full-fledged truth,” Utah governor Gary Herbert tweeted in 2016 after signing the resolution. Utah’s long-running campaign against explicit material stretches back decades. In 2001, it became the first state to appoint an obscenity and pornography complaints ombudsman — a position colloquially known as the “porn czar.” That role was eliminated in 2017.

The industry, for its part, has been trying to keep up with the shifting rules. “Age restriction is a very complex subject that brings with it data privacy concerns and the potential for uneven and inconsistent application for different digital platforms,” Alex Kekesi, vice president of brand and community at Pornhub, said in a previous interview. In November, the company urged Google, Microsoft, and Apple to implement device-based age verification across their operating systems and app stores. “We have seen several states and countries try to impose platform-level age verification requirements, and they have all failed to adequately protect children.” To comply with new age gate mandates, Pornhub has blocked access in 23 states.

Critics argue that these policies were never truly about protecting children in the first place. In 2024, a video leaked by the Centre for Climate Reporting showed Russell Vought, a Trump ally and Project 2025 coauthor, describing age verification laws as a “back door” to a federal porn ban.

There’s a strange irony here. Platforms like OnlyFans and Pornhub helped mainstream digital sex work, bringing it out of the shadows and into subscription dashboards and creator analytics. But that visibility has made it easier to regulate, track, and now tax. As more states experiment with taxes tied specifically to sexual content, creators — not lawmakers — are likely to feel the immediate impact.

The skewed ideology of cultural conservatism that is taking shape under Trump 2.0 wants to punish sexual expression, says Mike Stabile, director of public policy at the Free Speech Coalition, a trade association for the adult industry in the US. “When we talk about free speech, we generally mean the freedom to speak, the ability to speak freely without government interference. But in this case, free also means not having to pay for the right to do so. A government tax on speech limits that right to those who can afford it.”

OnlyFans states that it complies with tax requirements in the jurisdictions where it operates, and creators are responsible for managing their own tax affairs. Pornhub, which is currently blocked in Utah and Alabama, did not respond to a request for comment.

Douek notes that while the Supreme Court recently upheld age-verification laws in Texas — allowing states to regulate minors’ access to explicit material — “a porn tax does nothing to limit minors’ access to this speech—it simply makes it more expensive to provide this content to adults.” A 2022 report from Common Sense Media found that 73 percent of teens aged 13 to 17 have watched adult content online. Young people often encounter explicit material on social media platforms such as X and Snap, sometimes intentionally, often accidentally. A survey last year from the UK’s Office of the Children’s Commissioner reported that 59 percent of minors are exposed to porn unintentionally, primarily via social media, up from 38 percent the year before.

In Alabama, as in Utah’s proposal, tax revenue is earmarked for behavioral health services, including prevention, treatment, and recovery programs for young people.

Alabama state representative Ben Robbins, the bill’s Republican sponsor, said in an interview last year that adult content was “a driver in causing mental health issues” in the state. It’s an argument that surfaces again and again in debates about a nationwide porn ban. Some research suggests adolescent exposure may correlate with depression, lower self-esteem, or normalization of violence, but health professionals have not reached consensus.

With lawmakers reframing the conversation around underage harm, Stabile argues that the principle at stake is bigger than porn itself. Content-specific taxes on speech, he notes, have repeatedly been struck down as unconstitutional censorship.

“What if a state decided that Covid misinformation was straining state health resources and taxed newsletters who promoted it? What if the federal government decided to require a costly license to start a podcast? What if a state decided to tax a certain newspaper it didn’t like?” he says. “Porn isn’t some magical category of speech separate from movies, streaming services, or other forms of entertainment. Adult businesses already pay taxes on the income they earn, just as every other business does. Taxing them because of imagined harms is not only dangerous to our industry, it sets a dangerous precedent for government power.”

And that’s the quiet part that lingers.

When governments start deciding which kinds of speech deserve a surcharge, the debate stops being about porn. It becomes about who gets to speak freely — and who has to pay extra for the privilege.


The Human Cost of Overregulation by Morley Safeword


Over the decades I’ve worked in the adult entertainment business, it has struck me many times how concerned the industry’s critics appear to be about the welfare of those of us who work in the industry – and how quickly that concern turns to consternation and scorn, should we insist that we’re doing what we do gladly and of our own free will.

“Nonsense,” the critics say, “these poor souls only think they are engaging in this depravity willingly; the truth is they have been brainwashed, coerced, cajoled and manipulated into believing they want to participate in this filth.”

Granted, not a lot of people have spilled ink along these lines to fret over the wellbeing of freelance writers like me. I think we’re counted as being among the exploiters, rather than the exploited, or perhaps as enablers of exploitation. Still, there’s no denying I derive my living, meager though it may be, from adult entertainment, even if all I do is write about it, rather than perform in or film it.

While many of the regulations aimed at the adult industry are couched as attempts to protect minors from the alleged harm of viewing pornography, when these measures are discussed by their proponents, “success” is often defined as making the adult industry retreat from their jurisdiction altogether. If a site like Pornhub blocks visitors from an entire state, including all the adults in that state who are still legally entitled to access the site even under newly established age verification mandates, those who cooked up the laws often describe this development as a sign the law is “working.” As I’ve written before, the chilling effect is a feature of these measures, not a bug.

By the same token, if a new law or regulation makes it harder for adult content creators to make their own movies, distribute their own photos or perform live on webcams, that too is something to be celebrated by the legislators and activists who champion those regulations.

Gone is all thought or discussion of the wellbeing of adult content creators and performers, once the potential cause of harm is the law itself. This holds true of purported “anti-trafficking” statutes. While sex workers themselves largely oppose measures like FOSTA/SESTA and say the law has made them less safe, not more, the proponents and sponsors of such legislation don’t want to hear it. Yes, these paternalistic politicos and crusading critics will protect these wayward adults from themselves, even if it kills them.

I can only imagine that if a state legislator from any of the dozens of states that have passed age verification requirements were to learn that adult content creators (and the platforms that host their work) are having a harder time earning a living under these new regulatory schemes, their response would be brief and callous: “Good,” they’d probably say, “now they can go out and look for more respectable work!”

And what happens when former porn performers do find work in other fields? The stigma of porn follows them. They get fired. They are told their mere presence in a classroom is disruptive. They are hounded on social media. They are treated like pariahs by the very people who supposedly care about their welfare.

A law or regulation can be well-intended and still do harm. I don’t doubt some of the politicians involved in crafting age-verification laws and other purportedly protective regulations believe they are doing things in the best interests of both minors and the adults who work in porn, or in the broader world of sex work. But it’s hard to believe they truly care about the latter two when there’s so little thought given to the potential negative impact on them during the crafting of these laws.

As more states toy with the idea of establishing a “porn tax,” will any of them pause to consider the impact on the human beings targeted by such taxes? I’d advise against holding your breath while you wait for that manner of concern to be expressed.


Click Here to Keep Clicking Here, So You Can Click There (Eventually) by Stan Q. Brick


I was on the lookout for something to write about. “I know,” I thought, “I’ll see what the latest news is to come out of Ofcom, the UK’s regulatory authority for broadcast, internet, telecommunications and postal services!”

In the old days, days I remember with great fondness, I could have just typed Ofcom.org.uk into the nav bar on my browser and I’d be there, reading the latest from Ofcom. Not anymore – because now, even to read a drab, dull regulatory agency’s website, I must first satisfy a machine’s demand that I prove I’m human.

No big deal. Just a simple captcha test (one probably easily defeated by a sophisticated enough bot, tbh) and I’m on my way… sort of. Which is to say I would be on my way, except now I must read a disclosure about cookies, perhaps adjust some settings and then “accept” or “save” or “surrender to” those preferences, or whatever the verbiage might be.

This is using the internet now, apparently. Instead of “surfing” and the freedom of movement that term suggests, it’s more like navigating a joyless obstacle course, in which I’m required to verify my age and/or my very humanity as I hop from step to step.

I’m sure this seems to many people like an overstated complaint. “So what?” they might say. Why is it a big deal to verify minor details like your age, or to have your path around the internet blocked in one way or another, based largely on where you live and where the site you’re accessing is located?

People used to call the internet the “information superhighway.” While this was an admittedly irritating buzz phrase, the term did at least capture the sense that the internet was something largely unfettered, where data, entertainment, information, misinformation and all manner of expressive content was available to all those able to access it.

Now, despite the fact I’ve been an adult for nearly 40 years, every time I turn around while online, I’m being asked to verify the fact of my adulthood anew. (Yes, I do visit a lot of porn sites; it sort of comes with the territory of – you know – working in and writing about the online porn industry.)

I understand a lot of people are hot to make the internet “safer,” but to me, this desire betrays an ignorance of what the internet is – or if not an ignorance of its nature, a stubborn desire to transform the internet into something else. But the internet, whatever else it might be, is a massive computer network about which the best thing has always been the worst thing, as well: Virtually anyone can publish virtually anything on it.

Slap as many age gates and regulations as you’d like on a massive, global computer network; you’re still just engaging in an endless game of whack-a-mole. Ofcom itself reported that after adult sites were required to employ “Highly Effective Age Assurance” (HEAA) methods, VPN usage in the UK more than doubled, “rising from about 650k daily users before 25 July 2025 and peaking at over 1.4m in mid-August 2025.”
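Those figures are easy to sanity-check. A back-of-the-envelope sketch in Python – the 650k and 1.4m numbers are Ofcom’s; everything else here is just arithmetic:

```python
# Ofcom's reported UK daily VPN usage around the 25 July 2025 HEAA deadline
before = 650_000   # approximate daily users before 25 July 2025
peak = 1_400_000   # reported peak of daily users, mid-August 2025

growth = peak / before
print(f"Usage multiplied by {growth:.2f}x")  # roughly 2.15x: "more than doubled" checks out
```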

Ofcom is undeterred by numbers like these, of course. Its inevitable answer will be to impose restrictions on VPN use. Because like any government regulatory agency, the one thing Ofcom will not be able to tolerate is the sense that it can’t control what falls within its remit to tame.

Speaking of Ofcom, when I did finally satisfy its system that I’m a human who doesn’t want to spend a lot of time deciding which cookies he does and doesn’t want attaching to his browser, what I found was an explanation of – and almost an apology for – the upper limit of the agency’s regulatory reach with respect to AI chatbots.

After stating with apparent pride that Ofcom was “one of the first regulators in the world to act on concerning reports of the Grok AI chatbot account on X being used to create and share demeaning sexual deepfakes of real people,” Ofcom goes on to explain that “not all AI chatbots are regulated” by the agency.

“Broadly, the Online Safety Act regulates user-to-user services, search services and services that publish pornographic content,” Ofcom explained. (They don’t say so, but just for your edification, this limited scope is due to sexually explicit depictions being awful, youth-corrupting and inherently sinister, while depictions of people getting shot in the head or beaten bloody with lead pipes are fine.)

On the other hand, “AI chatbots are not subject to regulation if they… only allow people to interact with the chatbot itself and no other users (i.e. they are not user-to-user services); do not search multiple websites or databases when giving responses to users (i.e. are not search services); and cannot generate pornographic content.”

Ofcom ends its notice with a how-to guide on reporting anything you find online “that you think might be harmful or illegal.”

I’d try reporting Ofcom’s website itself for harmful content, because I sure feel like I’m getting dumber just by reading it… but I suspect to execute this vengeful little practical joke, I’d have to pass at least three captcha tests, verify my age seven times and produce some manner of HCPN (“Highly Compelling Proof of Netizenship”).

You know what? I think I’ll just read a book. So far as I’m aware, I’m not required to present ID to grab an old tome off the shelves in my study… yet.

Read More »

Conservative Push for Porn Taxes Sparks Constitutional Backlash

Tax

It feels like the walls are closing in a little more every week. As age-verification laws continue to reshape—and in some cases dismantle—the adult industry, a Utah lawmaker has now stepped forward with a bill that would slap a new tax on porn sites operating in the state. It’s the kind of proposal that makes you pause, reread the headline, and wonder how we got here so fast.

Introduced by Republican state senator Calvin Musselman, the bill would impose a 7 percent tax on total receipts “from sales, distributions, memberships, subscriptions, performances, and content amounting to material harmful to minors that is produced, sold, filmed, generated, or otherwise based” in Utah. If passed, it would take effect in May and require adult sites to pay an additional $500 annual fee to the State Tax Commission. According to the legislation, revenue from the tax would be directed to Utah’s Department of Health and Human Services to expand mental health support for teens.
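To make the bill’s math concrete, here is a rough sketch of what it would mean for a hypothetical Utah-based site. The 7 percent rate and the $500 fee come from the bill as described above; the receipts figure is invented purely for illustration:

```python
# Hypothetical annual gross receipts for a small Utah-based adult site
receipts = 120_000.00          # illustrative figure, not from the bill

TAX_RATE = 0.07                # 7 percent tax on total receipts
ANNUAL_FEE = 500.00            # flat annual fee to the State Tax Commission

owed = receipts * TAX_RATE + ANNUAL_FEE
print(f"Annual liability: ${owed:,.2f}")  # $8,900.00 on $120k in receipts
```

Note this would come on top of ordinary income and sales taxes, which is precisely the “disfavored treatment” objection raised below.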

A new strain of American conservatism is asserting itself more boldly, and lawmakers across the US are calling for tighter restrictions on adult content. In September, Alabama became the first state to introduce a porn tax—10 percent on adult entertainment companies—after passing age-verification mandates that require users to upload ID or other personal documentation before accessing explicit material. Pennsylvania lawmakers are also exploring a proposal that would tack an extra 10 percent tax onto subscriptions and one-time purchases from online adult platforms, despite already charging a 6 percent sales and use tax on digital products, two state senators wrote in an October memo. Other states have flirted with similar ideas before. In 2019, Arizona state senator Gail Griffin, a Republican, proposed taxing adult content distributors to help fund a border wall during Donald Trump’s first term. To date, 25 US states have enacted some form of age verification.

Efforts to criminalize sex workers and regulate the industry have been unfolding for years, accelerating alongside increased online surveillance and censorship. Yet targeted taxes have repeatedly stalled, in part because the legality of such measures remains deeply contested.

“This kind of porn tax is blatantly unconstitutional,” says Evelyn Douek, an associate professor of law at Stanford Law School. “It singles out a particular type of protected speech for disfavored treatment, purely because the legislature doesn’t like it—that’s exactly what the First Amendment is designed to protect against. Utah may not like porn, but as the Supreme Court affirmed only last year, adults have a fully protected right to access it.”

Utah, Alabama, and Pennsylvania are among 16 states that have adopted resolutions declaring pornography a public health crisis. “We realize this is a bold assertion not everyone will agree on, but it’s the full-fledged truth,” Utah governor Gary Herbert tweeted in 2016 after signing the resolution. Utah’s early response to the spread of adult content dates back to 2001, when it became the first state to establish an office focused on sexually explicit material by appointing an obscenity and pornography complaints ombudsman. The role—often referred to as the “porn czar”—was eliminated in 2017.

“Age restriction is a very complex subject that brings with it data privacy concerns and the potential for uneven and inconsistent application for different digital platforms,” Alex Kekesi, vice president of brand and community at Pornhub, said previously. In November, the company urged Google, Microsoft, and Apple to adopt device-based verification across app stores and operating systems. “We have seen several states and countries try to impose platform-level age verification requirements, and they have all failed to adequately protect children.” To comply with existing mandates, Pornhub has blocked access to users in 23 states.

Critics argue that age verification has never truly been about protecting children, but about quietly scrubbing porn from the internet. In 2024, a leaked video from the Centre for Climate Reporting showed Russell Vought, a Trump ally and Project 2025 coauthor, describing age-verification laws as a “back door” to a federal porn ban.

Platforms like OnlyFans and Pornhub have pushed sex work further into the mainstream, but they’ve also made it easier to monitor and police both performers and audiences. As states consider new taxes and penalties, it’s creators who are most likely to absorb the shock.

The cultural conservatism taking shape under Trump 2.0 is driven by a desire to punish sexual expression, says Mike Stabile, director of public policy at the Free Speech Coalition, the US adult industry’s trade association. “When we talk about free speech, we generally mean the freedom to speak, the ability to speak freely without government interference. But in this case, free also means not having to pay for the right to do so. A government tax on speech limits that right to those who can afford it.”

OnlyFans says it complies with all tax requirements in the jurisdictions where it operates, while creators remain responsible for their own tax obligations. Pornhub, which is currently blocked in Utah and Alabama, did not respond to a request for comment.

Douek points out that while states can regulate minors’ access to explicit material following the Supreme Court’s decision upholding Texas’ age-verification law, “a porn tax does nothing to limit minors’ access to this speech—it simply makes it more expensive to provide this content to adults.” A 2022 report from Common Sense Media found that 73 percent of teens aged 13 to 17 had viewed adult content online. Today, much of that exposure happens through social media platforms like X and Snap. A recent survey from the UK’s Office of the Children’s Commissioner found that 59 percent of minors encounter porn accidentally—up from 38 percent the year before—mostly via social feeds.

In Alabama, as would be the case in Utah, revenue from the porn tax is earmarked for behavioral health services, including prevention, treatment, and recovery programs for young people.

Last year, Alabama state representative Ben Robbins, the Republican sponsor of the bill, said adult content was “a driver in causing mental health issues” in the state. It’s a familiar claim among lawmakers advocating for a nationwide porn ban. While some studies suggest adolescent exposure to porn may correlate with depression, low self-esteem, or normalized violence, medical experts have never reached a clear consensus.

As lawmakers increasingly frame the issue around harm to minors, Stabile says it’s crucial to remember that adult content is not a special category outside the bounds of free expression. Courts have repeatedly struck down content-specific taxes as unconstitutional censorship.

“What if a state decided that Covid misinformation was straining state health resources and taxed newsletters who promoted it? What if the federal government decided to require a costly license to start a podcast? What if a state decided to tax a certain newspaper it didn’t like?” he says. “Porn isn’t some magical category of speech separate from movies, streaming services, or other forms of entertainment. Adult businesses already pay taxes on the income they earn, just as every other business does. Taxing them because of imagined harms isn’t just dangerous for our industry—it’s a dangerous expansion of government power.”

Read More »

‘An Embarrassment’: Critics Slam UK’s Proposed VPN Age Checks

VPN

It started the way these things always seem to start lately—with a vote that felt small on paper and enormous everywhere else. Politicians, technologists, and civil society groups reacted with visible dismay after the House of Lords backed a move that would ban children from using VPNs and force providers to roll out age verification.

The backlash was swift. Wikipedia co-founder Jimmy Wales blasted the decision on X, calling the UK’s position an embarrassment. Windscribe CEO Yegor Sak had already summed up the idea as the “dumbest possible fix,” warning that forcing age checks on VPNs would set a deeply troubling precedent for digital privacy.

By Tuesday morning, the argument had spilled fully into the open. Online debate surged, with X logging more than 20,000 posts on the issue in just 24 hours—one of those moments where you can almost hear the internet arguing with itself.

Labour, Lords & VPN laws

Last week, the House of Lords voted in favor of an amendment to the Children’s Wellbeing and Schools Bill that would, in effect, bar anyone under 18 from using VPNs.

The proposal would require commercial VPN providers to deploy mandatory age assurance technology, specifically to stop minors from using VPNs to bypass online safety measures. It sounds tidy in theory. In reality, it opens a can of worms no one seems eager to fully acknowledge.

Notably, the government itself opposed the amendment. Instead, it has opened a three-month consultation on children’s social media use, which includes a broader look at VPNs and how—or whether—they should be addressed.

Political pushback

Even though the House of Lords has shown its hand, the proposal now heads to the House of Commons, where it’s expected to hit serious resistance from the Labour government.

If the Commons throws it out, as many expect, the Lords will have to decide whether to dig in and trigger a round of parliamentary “ping-pong” or quietly step aside.

Labour’s Lord Knight of Weymouth, who voted against the amendment, suggested there’s little appetite for a drawn-out fight. He told TechRadar that it’s unlikely politicians will “die in a ditch” over banning VPNs.

In his view, many lawmakers are chasing “something iconic” on child safety—something headline-friendly—rather than wading into the technical swamp that regulating VPNs would require.

That said, Knight didn’t dismiss the broader concern. He argued that regulator Ofcom “needs to do better” at enforcing existing safety laws and agreed that more should be done to protect children online, provided it’s handled “carefully.” That word—carefully—does a lot of work here.

Civil society’s response

Regardless of whether this particular amendment survives, one thing is clear: VPNs are under a brighter spotlight than ever, and not just in the UK.

In the United States, lawmakers in Wisconsin are pushing a bill that would require adult websites to block access from users connected via a VPN. In Michigan, legislators have floated ideas around ISP-level blocking of circumvention tools. Different routes, same destination.

Evan Greer, director of the US-based group Fight for the Future, warned that policies designed to discourage or ban VPN use “will put human rights activists, journalists, abuse survivors and other vulnerable people in immediate danger.”

Fight for the Future is running a campaign that lets users contact lawmakers directly, arguing in an open letter that the ability to use the internet safely and privately is a fundamental human right.

Back in the UK, a public petition is urging the government to reject any plan that would effectively ban VPNs for children.

The Open Rights Group has also been vocal, pointing out that detecting or banning VPN use isn’t realistically possible without resorting to what it calls an “extreme level of digital authoritarianism.”

Read More »

Alabama’s Latest Adult Content Law Pushes Creators Out, Not Toward Safety

Picture of the Alabama Flag

Most adult creators didn’t need a push notification to feel it. The moment the news started circulating, it landed with a familiar weight: Clips4Sale has restricted access in Alabama after the passage of House Bill 164, a law that introduces notarized consent requirements for performers and platforms. The company frames the decision as compliance—necessary, even prudent. Creators read it differently. To many, it felt like the ground quietly disappearing beneath their feet.

Both interpretations can coexist. And maybe that’s the most unsettling part.

Legislation like Alabama’s is almost always sold as “protective.” The language is comforting, even noble—designed to reassure the public that something dangerous is being handled. But when you listen to the people living under these laws—performers, indie creators, small operators—the tone shifts. What comes through isn’t relief. It’s confusion. Anxiety. A creeping sense that they’re being legislated out of existence without anyone actually talking to them.

House Bill 164 didn’t arrive out of nowhere. It’s part of a broader pattern unfolding across the country, where states are targeting adult platforms through new consent rules, age checks, and documentation standards. On paper, they sound reasonable. In reality, they unravel fast.

What they create isn’t safety. It’s splintering.

A Law That Misses the Reality of the Industry

Adult performers aren’t operating without rules. They never have been. For decades, the industry has been bound by strict federal record-keeping requirements—ID verification, age documentation, signed releases. These systems already exist. They’re already enforced. They’re already audited. And they’re treated seriously because the penalties for failure are brutal.

Which is exactly why Alabama’s law sparked disbelief instead of reassurance.

Adult performer Leilani Lei cut through the noise on X by asking a simple question: do lawmakers actually understand what notarization does? A notary verifies identity and witnesses a signature. That’s the full job description. They don’t assess consent. They don’t evaluate content. They don’t make legal judgments. Requiring notarization doesn’t increase safety—it adds friction, expense, and logistical chaos.

Is a notary expected on every set? For every solo clip? For content created privately by independent performers in their own homes? These aren’t dramatic hypotheticals. They’re practical questions that expose how disconnected the law is from how adult work actually functions.

When laws ignore operational reality, compliance stops being ethical and starts being geographic. Platforms block states. Creators lose access. Income vanishes—not because of misconduct, but because following the rules becomes impossible.
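In practice, that “geographic compliance” is usually nothing more exotic than an IP-geolocation check at the platform’s edge. A minimal sketch of the idea – the blocklist, the lookup function, and its hard-coded result are all hypothetical stand-ins, not any platform’s actual implementation:

```python
# States where a (hypothetical) platform blocks access outright
# rather than attempt state-specific compliance
BLOCKED_STATES = {"AL", "UT"}

def lookup_state(ip_address: str) -> str:
    """Stand-in for a real IP-geolocation lookup (e.g. a GeoIP database)."""
    # Hypothetical: every visitor resolves to Alabama in this sketch
    return "AL"

def allow_request(ip_address: str) -> bool:
    """Gate the request on the visitor's resolved state."""
    return lookup_state(ip_address) not in BLOCKED_STATES

print(allow_request("203.0.113.7"))  # False: visitor resolved to a blocked state
```

The crudeness is the point: the platform never evaluates the content or the creator, only the visitor’s apparent location, which is why income can vanish for a compliant creator overnight.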

When “Protection” Quietly Becomes Economic Damage

One consequence of laws like HB 164 rarely gets discussed out loud: money.

Adult creators aren’t faceless entities. They’re people paying rent, covering medical bills, supporting families. For many, digital platforms aren’t side hustles—they’re lifelines. When a state gets geoblocked, creators living there lose their audience instantly. When platforms restrict access, creators with fans in that state watch sales drop overnight.

Cupcake SinClair’s response on X captured the mood perfectly—not panic, but dread. Not fear of regulation itself, but fear of where this path leads. If these laws keep spreading—each state tweaking the rules just enough—what does the landscape look like in a year? Two years? Does access slowly shrink until it’s determined entirely by ZIP code?

That’s not protection. That’s erosion.

And while platforms like Clips4Sale may view geoblocking as the least damaging option on the table, the fallout doesn’t land on the platform. It lands on creators. The backlash reflects more than anger—it reflects a growing sense that major decisions are being made without creators in the room, reshaping livelihoods without alternatives or support.

From the creator’s side, these aren’t abstract compliance choices. They translate into fewer customers, lower visibility, and more instability in an already fragile industry.

The Patchwork Problem Everyone Pretends Isn’t a Problem

One of the most dangerous aspects of this legislative trend is how inconsistent it is.

Each state passes its own version of “protective” law, often without coordination, consultation, or technical understanding. The result is a patchwork of requirements no platform can realistically meet across the board. What’s compliant in one state may be illegal in the next.

For massive tech companies, patchwork laws are an inconvenience. For adult platforms—already operating under heavier scrutiny, higher fees, and greater risk—they can be fatal.

For independent creators, they’re destabilizing by design.

When lawmakers ignore the cumulative effect of these laws, compliance becomes less about doing the right thing and more about surviving. Platforms that can’t afford bespoke, state-by-state systems opt out entirely. Creators are left scrambling to adapt, relocate, or rebuild somewhere else.

Who Is Actually Being Protected Here?

Supporters of laws like HB 164 often speak in moral absolutes. They invoke exploitation, trafficking, consent—serious issues that deserve serious responses.

But when legislation refuses to distinguish between criminal behavior and lawful adult work, it ends up punishing the latter while barely touching the former.

Bad actors don’t notarize forms. They don’t operate transparently. They don’t comply with documentation requirements. Meanwhile, compliant creators and legitimate platforms absorb the cost of laws that don’t meaningfully address wrongdoing.

Protection that collapses under scrutiny isn’t protection. It’s performance.

A Future Built on Exclusion Isn’t a Fix

The adult industry isn’t asking for no rules. It’s asking for rules that reflect reality.

That means lawmakers engaging with performers, platforms, and legal experts who understand how consent, documentation, and digital distribution actually work. It means recognizing that piling on procedural hurdles doesn’t automatically make anyone safer—and that cutting off access often harms the very people these laws claim to defend.

If this trend continues unchecked, the future of adult content in the U.S. won’t look like reform. It will look like retreat. More geoblocking. More platform withdrawals. More creators pushed out of legitimate marketplaces and into less secure corners of the internet.

That outcome serves no one—not performers, not platforms, and not the public.

Until the conversation moves beyond slogans and starts grappling with consequences, laws like Alabama’s will keep feeling less like protection and more like disappearance.

Read More »