Commentary

Beware Opportunists in Superhero Capes by Stan Q. Brick


Some folks who favor suppression of sexually explicit materials are more forthright about what gives life to their censorious zeal than others. Say what you will about the old “Morality in Media” brand: back when the organization went by that moniker, everybody knew where they were coming from just by reading the sign on their door.

Perhaps because the folks at Morality in Media perceived they were limiting their demographic reach with the judgy-sounding, clunky old name, they opted for a rebrand back in 2015, becoming the National Center on Sexual Exploitation. Suddenly, with the flip of a logo, they sounded less like angry Bible thumpers out to cancel your favorite sitcom and more like a serious nongovernmental agency out to prevent real harm.

You know what didn’t change when MIM became NCOSE? The president of the organization. Patrick A. Trueman ran the joint on both sides of the rebrand, from 2010 to 2023. Before that, Trueman was a prosecutor at the U.S. Department of Justice during the administration of George H.W. Bush, which also happens to be the last time federal prosecutors aggressively enforced the nation’s obscenity laws. Trueman remains the President Emeritus of NCOSE to this day.

Just as I doubt Trueman lost his zest for cleaning up American media when his organization rebranded, I don’t buy that a lot of the organizations most strenuously supporting various age verification mandates at the state and federal level are really in it to protect minors from harmful materials online – unless one happens to define “harmful” the same way they do, of course.

Referencing remarks recently made by Rep. Leigh Finke, a transgender member of the Minnesota Legislature who has criticized elements of her state’s proposed age verification law, Rindala Alajaji, Associate Director of State Affairs at the Electronic Frontier Foundation (EFF), and Molly Buckley, one of the organization’s legislative analysts, call attention not only to the impact of the Supreme Court’s ruling in Free Speech Coalition v. Paxton, but also to the nature of the organizations supporting Texas in the case.

“The Paxton case, and the coalition behind it, illustrates exactly how these laws can be weaponized,” Alajaji and Buckley write. “They weren’t there just to stand up for young people’s privacy online—they were there to argue that the state has a compelling interest in shielding minors from material that, in practice, often includes LGBTQ content. Ultimately, these groups would like to age-gate not just porn sites, but also any content that might discuss sex, sexuality, gender, reproductive health, abortion, and more.”

Alajaji and Buckley add that the “coalition of organizations that filed amicus briefs in support of Texas’s age verification law tells us everything we need to know about the true intentions behind legislating access to information online: censorship, surveillance, and control.”

“After all, if the race to age-gate the internet was purely about child safety, we would expect its strongest supporters to be child-development experts or privacy advocates,” the authors note. “Instead, the loudest advocates are organizations dedicated to policing sexuality, attacking LGBTQ+ folks and reproductive rights, and censoring anything that doesn’t fit within their worldview.”

The thing about appealing to people’s desire to protect children is that it works – and for a good reason. It’s a good thing to want to protect your kids. God knows they need protection, including from themselves. Parents should do all the reasonable, rational, normal things they can do to protect their kids.

But if you’re denying a gay or trans kid access to information from people who have been through the same things that kid is going through and can offer guidance, support and maybe a little solace for the kid, you’re not protecting that kid; you’re stifling, aggravating and alienating that kid. Shit, you might be killing that kid – even if you earnestly believe you’re helping.

I can also understand why the idea of age-gating the internet might sound good to people, especially frightened people who are raising kids who are online much more than their parents ever were. But fear is a state of mind that can make people suggestible – and that’s when opportunists don their superhero capes and make a dramatic entrance, promising to make the world (wide web) a safer, better place for you and your kids—without really mentioning the part about how they’re actually in this to keep The Gays from enacting their Sinister Agenda, or whatever it is that animates some of these zealots.

I guess what I’m saying is this: You can’t save your kid from drowning by throwing someone else’s kid into the deep end of the pool with lead boots on. And some of the people promising to provide your kid a life jacket are heavily invested in lead.


Australia’s Porn Age-Verification Law Sparks Debate Over Safety and Shift to “Darker Corners”


Something changed overnight — not just on adult sites, but in how people moved through the internet itself.

When major porn platforms began blocking Australian users, it didn’t stop there. X also started requiring age checks before users could view adult content. And for some, that meant something far more intrusive: being asked to submit a video selfie just to look at a single post.

“Almost every post on my alt account has a content warning and asks me [for a] selfie for age verification,” one Australian porn consumer, Joe*, said. “It’s maddening.”

Others described pulling back entirely, choosing to walk away rather than comply.

“I’m honestly no longer engaging with any of the sites and platforms I used to use because not only is the verification process really invasive, but some of them even give you the option to sign in with Google … and that’s the last platform I’d trust with any sensitive data,” Jethro said.

“The choices are: link your perversions to your government ID, or submit your face into the AI slop machine,” Chris* said.

It’s still early days. Aside from several Aylo-owned sites like RedTube blocking Australians outright, and Pornhub limiting access to safe-for-work content for users who aren’t logged in, most of the top free adult platforms have yet to fully implement age verification.

Data from the SEO firm Semrush suggests only one site in the country’s top 20 — Thisvid — had complied so far. But with potential fines reaching $49.5 million for violations, more platforms are expected to follow. Users have already begun to react.

Search interest in porn-related terms has climbed to its highest level since pandemic lockdowns ended in 2022. At the same time, searches for virtual private networks — tools that allow users to appear as though they’re browsing from outside Australia — have surged to levels not seen since 2015, when website blocking laws targeting piracy were introduced.

Sex workers say none of this is surprising. For years, they warned that regulations developed between the eSafety commissioner and industry stakeholders could drive users away from regulated spaces and into less controlled environments.

“We’ve already warned that these laws will funnel traffic away from platforms that do have moderation safeguards in place and towards sites that profit from non-consensual and stolen porn, including the unpaid work of sex workers,” said Mish Pony, chief executive of Scarlet Alliance.

“So driving people off mainstream services, such as Pornhub, does not stop porn consumption, it just pushes it into darker corners of the internet. It makes it harder to address real harms.”

Andy Conboi, an OnlyFans creator based in Sydney, said he has already seen the effects firsthand. Engagement on his posts has dropped.

“People don’t really want to send a photo of themselves or their licence or whatever to these platforms, particularly Twitter [X],” he said.

“In the group chats I do have with creators, people are just frustrated and annoyed, their engagement is down [and] it’s much more difficult to put stuff out there and be seen a lot of the time.”

Some creators, he added, are pivoting. They’re shifting toward safe-for-work content on platforms like Instagram and TikTok just to maintain visibility — a move he described as ironic, given the presence of underage users on those services.

For longtime opponents of pornography, however, the changes mark a milestone.

After earlier attempts at internet filtering fell short under previous governments, and opt-out filtering proposals were abandoned before the 2013 election, regulators have gradually expanded their authority over online content. The eSafety commissioner’s role has grown significantly over the past decade.

Advocacy groups that have campaigned for tighter controls welcomed the developments.

“This day was hard fought for,” said Melinda Tankard Reist, movement director for Collective Shout. “Collective Shout and our partners and allies worked hard to bring it to fruition.”

“It is a relief to know proof-of-age protections are now in place as one obstacle in the way of young people being exposed to rape porn, torture porn, incest porn and extreme violence and degradation of women.”

The Australian Christian Lobby also supported the outcome.

“The fact that P*rnhub have ceased operating in Australia is already proof of its effectiveness,” said chief executive Michelle Pearse.

Questions remain about whether those outcomes will hold — or simply shift behavior elsewhere.

Researchers studying similar laws in parts of the United States found that when major sites restricted access, users didn’t necessarily stop searching. They redirected.

“We saw very large substitution effects for search traffic for XVideos, which is the second largest porn website in the states,” said David Lang, a Stanford University researcher and lead author of the report.

“It’s a sufficiently large change that the No 2 site is now the No 1 site in states that passed those laws.”

Tracking VPN use proved more difficult, researchers noted, since users often disappear from local data once they connect through external servers.

For digital rights advocates, the concern isn’t just where people go — it’s what they leave behind.

Tom Sulston, head of policy at Digital Rights Watch, warned that age-verification systems could create centralized pools of highly sensitive personal data.

“It would be absolutely trivial for a criminal to set up porn sites as honeytraps to capture Australians’ identities and sexual interests; and then use that material for blackmail, similar to existing sextortion schemes,” Sulston said.

“Foreign intelligence services looking to trap Australian targets could easily do the same. The age-verification regime puts Australians at greater risk of harm, not less.”

And that’s the uneasy part of it all. The behavior doesn’t disappear — it just moves.


Utah’s Proposed Porn Tax Raises Major Civil Liberties Concerns


SALT LAKE CITY — Utah lawmakers are again stepping into the middle of the long-running debate over how far governments should go when regulating online adult content. This time, the focus is a proposed tax on pornography purchased through digital platforms.

Senate Bill 73, introduced earlier this year by Republican lawmakers in the Utah Legislature, would impose what the bill calls a “material harmful to minors” tax on revenue generated from the sale of online pornography. The rate is currently set at 2 percent, after originally being proposed at 7 percent.

After several amendments, the measure passed the state Senate with broad support and now awaits further consideration in the House of Representatives. If approved there, it would head to the desk of Gov. Spencer Cox, who has publicly supported policies aimed at restricting access to online pornography.

The legislation was introduced by Republican state Sen. Calvin R. Musselman and state Rep. Steve Eliason, both of whom have supported previous efforts in Utah to regulate online adult content.

Under the proposal, revenue generated by the tax would be directed toward several state programs. The bill specifies that funds could be used to support enforcement efforts tied to Utah’s existing age verification laws for social media and adult websites, among other regulatory initiatives.

During the legislative process, lawmakers added language addressing virtual private networks (VPNs) and similar technologies used to bypass location-based restrictions. The revised bill would make it illegal to intentionally circumvent content blocks implemented by platforms as part of age verification compliance, with violations subject to civil penalties.

The measure also includes provisions aimed at limiting how websites communicate with users in Utah about these tools. Specifically, the bill states that platforms covered by age verification requirements may not provide instructions or guidance that would allow users to bypass those restrictions.

The current version of Senate Bill 73 states:

“A commercial entity that operates a website that contains a substantial portion of material harmful to minors may not facilitate or encourage the use of a virtual private network, proxy server, or other means to circumvent age verification requirements, including by providing: (a) instructions on how to use a virtual private network or proxy server to access the website; or (b) means for individuals in this state to circumvent geofencing or blocking.”

Measures regulating adult content have appeared in several states in recent years. Alabama, for example, enacted legislation that imposes a 10 percent tax on pornography-related revenue generated within the state, alongside additional legal requirements for adult performers involving notarized consent documentation.

Utah’s proposal does not include those record-keeping provisions, but it does expand the scope of enforcement mechanisms connected to age verification and online access controls.

The tax itself would function similarly to what policymakers often describe as a “sin tax,” a type of levy commonly applied to products such as alcohol, tobacco and gambling. In this case, the tax would apply to companies that generate revenue from online adult content through methods including clip sales, subscriptions and fan-based platforms.

Under the proposal, entities meeting the bill’s definition of “covered entities” would calculate the portion of revenue generated from Utah-based users and pay the 2 percent tax to the state on an annual basis.

If the measure becomes law, larger online platforms could likely absorb the additional compliance costs with relative ease. For smaller companies operating in the adult content market, however, the administrative and regulatory requirements could prove more challenging.

The bill’s future now depends on the outcome of deliberations in the Utah House. Should it pass there and receive the governor’s signature, the measure would add Utah to a growing list of states experimenting with new approaches to regulating digital adult content.

Whether the proposal ultimately reshapes how online platforms operate — or instead becomes the subject of courtroom challenges — may become clear only after the legislative process runs its course.


The Web Used to be the “Information Superhighway”; it’s Becoming a Low-Speed School Zone by Stan Q. Brick


Back in the late 90s and early aughts, it was commonplace to hear the internet referred to as “The Information Superhighway,” a term that for many of us connoted not just speed of transfer, but the relatively unfettered regulatory environment surrounding what was then an emerging network for communications and commerce.

Fast forward to 2026 and those heady days of rapid growth and regulatory permissiveness are gone. Some might say “good riddance,” but I can’t help but wonder what we’re losing as we grope for ways to make the web ‘safer’ for a population who arguably shouldn’t be using it at all.

During an adult industry trade event over 20 years ago, an attorney friend of mine posed a good question: If the web is the “information superhighway,” who in their right mind would want to build a playground for children in the median of such a thoroughfare?

The answer, then and now, is: “Far too many people.” Crucially, a significant subset of those people are legislators at the national, state and local levels. And these days, every time you turn around, one of them is sponsoring, writing or endorsing a measure like the Kids Internet and Digital Safety (KIDS) Act, or the Innocence Act, or some manner of tax directed specifically at adult websites.

I can’t speak for the populations of other countries, but here in the U.S., what I’ve noticed over the decades is that many people look to the government to handle jobs they probably ought to be doing themselves – or indeed, jobs that only they can do for themselves.

Look, I get it; it’s hard raising kids. But the difficulty of being a parent is not a new thing – and it certainly isn’t limited to the internet era. When I was a kid, way back in the early 1970s, once I left the immediate vicinity of my parents’ home, they had almost no way of knowing what I was up to – a worrying fact for a lot of parents, especially during times when panics over child abductions and general “stranger danger” were in full swing.

Was it easier for my parents to watch me walk off to catch the school bus back when I couldn’t text to confirm my arrival at school than it is for parents these days to do the same, when their kids have dozens of options for checking in or marking themselves “safe”? I think that’s a tough argument to make.

Yes, largely because of the internet and related technologies, kids today have easier access to things like porn than I did when I was a kid. Guess what? Even in the days when we had to go digging through our fathers’ sock drawers to find porn, we still managed to find it. (Where there’s a hormone-fueled will, there is always a way.)

Of course, the impulse to restrict and regulate access to content deemed to be beyond the years of kids is a lot older than the internet, too. They seem almost quaint now, but broadcast decency standards have been around for decades. Does anyone believe these standards have prevented kids from hearing “profane language” or being exposed to content that is “patently offensive” but does not rise to the point of being “obscene” under federal law? If so, I have a healthy store of bridges on hand to sell to these poor, credulous souls.

Yes, the internet is filled with problematic content. But if your concern about what kids stumble across online is limited to “obscene” or “indecent” content, then you’re either ignorant of what lurks online, or the nature of your concern says more about you than it does the internet.

One thing about the internet has not changed since the days when it was common to call it the Information Superhighway: It remains an enormous network of independently operated computers, on which virtually anyone can publish virtually anything. Mixed into that ‘everything’ is a long list of things that are potentially “harmful to minors.”

Are sites that promote racial hatred less damaging to minors than pornography? How about sites that disseminate misinformation and disinformation? Are false medical claims something we want kids to be perusing with no guidance or guardrails? How about deepfake videos of a war in progress?

Don’t get me wrong: Not for one minute am I suggesting all those things listed above should be subject to governmental blocking, censorship or over-regulation to prevent their spread. What I’m suggesting – and what I’ve been telling my less-wired friends for literal decades – is simply this: The internet isn’t for children, and it simply can’t be made “safe” for them, try as we might.

The difficult fact is, even if every proposed measure to limit kids’ access to “harmful” content currently under consideration is passed and vigorously enforced, the internet will remain as I described it above – “an enormous network of independently operated computers, on which virtually anyone can publish virtually anything.” To make it ‘safe’ will require fundamentally altering the nature of that network and siloing it to a degree where it will no longer be recognizable as the internet.

And guess what? Even if we do that, you’ll still have to parent your kids. You’ll still have to shepherd them through their early years – and you’ll still have to let go of being a shepherd when they become adults. The internet age didn’t change any of that, either.

If you believe the answer will come from the government, if you believe legislation like the KIDS Act or the Innocence Act will make the world (or even just the internet) a substantially safer place, knock yourself out. Write to your representatives and demand that they pass those laws – and then see what happens.

I’ll tell you what isn’t going to happen: Your job as a parent isn’t going to get easier. The sooner you accept that and get on with the difficult business of raising a child, the better.


There’s a “Porn Lesson” to Take from Lindsey Vonn’s Olympic Experience (No, Really) by Stan Q. Brick


When champion skier Lindsey Vonn experienced a terrible crash on what turned out to be her final run in the women’s downhill skiing event at the Winter Olympics in Milan earlier this month, maybe there were a few people out there thinking she shouldn’t have been permitted to take the risk of running the race, given that she was already skiing on a torn ACL in her left knee. But if a significant number of people felt that way, they seem to have kept it to themselves, for the most part.

Instead, the dominant reaction to Vonn’s knowing acceptance of added risk rightfully has been to praise her bravery, determination and champion spirit. As Madison Chapman wrote for Newsweek, “Winner or not, Vonn is the ideal Olympic champion. Her grit and resilience helped me shed my own fear of risk and learn to see myself as a champion over adversity after my cancer treatment and subsequent knee injury. She may not have clinched gold, but Lindsay Vonn reminded us all how to live.”

I’ve always been fascinated by the way people view the act of taking a physical risk, be it in the context of competitive skiing, climbing a mountain or something as fundamental as managing one’s personal health. I’ve long believed that the question of whether something is safe to do is different from the question of whether one ought to be allowed to do it. As I see it, it’s not complicated; adults should be allowed to take informed risks – including a litany of risks I would never take, myself.

Doubtlessly, one reason Vonn found so much support for her decision is the competitive context. She was attempting to win a gold medal, an achievement for which there’s a very limited window of opportunity, one that only comes around every four years – and only for so many cycles in an athlete’s career.

Make no mistake, though; the reason Vonn’s decision, the Olympic Games themselves and Vonn’s injuries are global news is because sports are popular entertainment – and big business.

In other words, while we support Vonn’s chosen form of risk taking because competition is deemed a worthy enterprise by a significant portion of the human population, we also support it because we accept, at least in the context of sport, that people have a right to risk bodily harm in the process of entertaining other people.

We’re not consistent about this acceptance of risk for entertainment’s sake, of course. The response to people taking risks in the context of porn is less enthusiastic. Sometimes it inspires proposals specifically designed to deter people from plying their trade in adult entertainment.

I’m not saying I think social media should light up with words of encouragement every time a porn star gets nominated for an award, or when an adult content creator releases a new clip (although that would be nice). But maybe, if society can applaud people for risking grievous bodily harm while competing on the Olympic stage, society can at least also manage to avoid shaming people and subjecting them to paternalistic government regulation when the risks they take involve other, less celebrated forms of entertainment.


Debanking Explained: How Politics, Policy, and Perception Shape Account Closures


The Cato Institute has published a report on debanking. You can find it here: https://www.cato.org/policy-analysis/understanding-debanking-evaluating-governmental-operational-political-religious

Here’s what the report is all about:

Debanking can be a frustrating and deeply unsettling experience. One day everything seems fine, and the next, a notice arrives giving just 30 days to withdraw funds and find a new financial institution. Confusion quickly turns into anxiety. Bills were paid, nothing appeared out of the ordinary — so what changed? A call to the bank rarely brings clarity, and the response is often the same: no further details can be provided. Customers are left with more questions than answers.

On the other side of the conversation, bankers are frequently constrained by strict confidentiality requirements. Even frontline staff may not have access to the underlying reasons for an account closure. Financial institutions operate within a framework of anti–money laundering, know your customer, and countering the financing of terrorism regulations — commonly referred to as AML, KYC, and CFT. While these rules are standard practice within the industry, they remain largely invisible to the public, creating a disconnect that fuels frustration on both sides.

For those looking to address the growing concern around debanking, some argue that meaningful change will require greater transparency. That could mean reconsidering the confidentiality that surrounds account closures, removing reputational risk as a regulatory factor, and reevaluating the Bank Secrecy Act framework that effectively places financial institutions in the role of investigative gatekeepers.


Good News: Sometimes Adult Businesses DO Get Treated Like Everyone Else by Stan Q. Brick


In a country where it seems like lawsuits get filed at the drop of a hat – particularly if the hat is quite hard, quite heavy and falls on someone’s toes, causing both physical injury and extreme emotional distress – the fact that our courts do make plaintiffs jump through at least a minimal set of hoops can be something of a comfort.

For example, if I get into a fender bender with someone in California, but that person lives in New York, they probably can’t haul me into court in New York to make me face a lawsuit there, simply because New York happens to be the plaintiff’s state of domicile. They’d likely have to sue me in California, due to the way the courts handle the question of personal jurisdiction.

As you may have heard, a district court in Kansas applied this logic in dismissing a couple of lawsuits filed against companies that operate adult websites, in which a plaintiff alleged those sites were not complying with the state’s age verification requirements for adult sites.

Among other things, the judge in the case, U.S. District Judge Holly Teeter, wrote in her decision dismissing a lawsuit against Titan Websites that “merely intending that users accessing its content be able to do so from a wide geographic area is not the same as purposefully directing one’s activities at a forum.”

“Technical steps taken to make a universally accessible website easier for all users to access no matter where they are located is no more purposeful direction than the act of setting up the website in the first place,” Teeter added. “And just like the act of setting up a website, were the indiscriminate use of a CDN or other technologies to indiscriminately facilitate content delivery enough, ‘then the defense of personal jurisdiction, in the sense that a State has geographically limited judicial power, would no longer exist.’”

Teeter also wrote that her reasoning “does not mean that a website owner’s use of a CDN is never relevant” and “does not mean that a website owner’s use of a CDN could never show purposeful direction.”

“It does mean that more is needed to determine how the CDN is used and whether the CDN is being used to target a forum or an immediate region of which the forum is a part,” Teeter wrote. “The Court need not dissect the contours to resolve this case. Here, Plaintiff simply alleges that a CDN is being used and that the CDN has servers near the forum because logically it must. Defendant responds with evidence that it uses a third-party web-hosting service and that it does not know or care where the CDNs are located. This record is not enough to carry Plaintiff’s admittedly light burden.”

The dismissal of this case, as well as Teeter’s decision dismissing an identical case against a company called ICF Technology, is certainly good news for other adult businesses that might find themselves haled into court over alleged violations of a state’s age verification law. These rulings are not, of course, the end of the story.

The plaintiff is likely to appeal these decisions, whereupon the matter will go to the Tenth Circuit Court of Appeals. I’m no lawyer and I don’t have much to offer in terms of a prediction as to how the Tenth Circuit might ultimately rule. I just know that I don’t have much confidence in how the next court up the chain, the U.S. Supreme Court, might rule, should they take up the question.

Having found the age verification law passed in Texas to be constitutional, it wouldn’t surprise me one bit if SCOTUS decided that merely being accessible in a state creates a sufficient “minimum contact” with any given state for a court there to assert personal jurisdiction.

Still, at least for the time being, Teeter’s decisions represent something of a victory for the porn side of the War on Porn. Whether that victory is lasting or ephemeral remains to be seen. Fingers crossed.


Adult Creators Keep Getting Debanked — And the Fallout Goes Far Beyond Them


Your bank may never send you a memo about it, but it’s quietly shaping your life.

Every time you click “buy now,” a small army of institutions decides whether that purchase gets to exist. And for adult creators, that army has been steadily tightening its grip. For years, people in the industry have been warning about financial discrimination and debanking — the sudden closure of accounts, the polite but devastating “we can no longer do business with you.” It’s happening more often now. And it’s happening quietly.

“I don’t know what could happen next or when it might happen,” says adult VTuber, journalist, and activist Ana Valens. In just two weeks last November, nearly every platform she relied on either removed her content or suspended her outright. “While my Patreon and Ko-fi were reinstated, I’ve spent the past two months waiting for the other shoe to drop — another Patreon ban, my PayPal deactivated, and so on.” She reached out for explanations. Most platforms couldn’t clearly articulate how she’d violated their terms. Ko-fi didn’t respond until repeated messages finally led to reinstatement.

That kind of uncertainty lingers. It’s like walking on ice that might crack at any moment.

“Deplatforming and debanking are an occupational hazard for any adult content creator,” says Gina, a co-founder of PeepMe, a startup that set out to build a worker-owned creator marketplace. PeepMe was imagined as an alternative to OnlyFans and Patreon — a space where creators could hold equity, elect a democratic board, and receive quarterly profit-sharing dividends.

Gina requested that a pseudonym be used, given her continued work adjacent to the adult industry and the very real fear of financial fallout. “Even still, I’ve never seen someone banned on so many sites before [as Ana has been],” she says.

And it’s not just adult creators feeling the pressure. Companies in oil and gas, cryptocurrency, tobacco, and firearms have also raised concerns about politically motivated debanking. The pushback has grown loud enough that U.S. regulators are now stepping in, attempting to rein in financial discrimination.

Who’s Blocking My Buying?

When you make an online purchase, your money doesn’t travel in a straight line. It passes through layers of gatekeepers. The pipeline often looks like this:

  1. Platform (merchant) websites: where creators earn income — YouTube, Patreon, Etsy, DoorDash, Steam.

  2. Payment processors: companies that route the transaction between card networks and banks — PayPal, Stripe.

  3. Card networks: Visa, American Express, Mastercard — the rule-makers that standardize how buyers and sellers interact.

  4. Your bank and the seller’s bank: Wells Fargo, Bank of America, and so on.

Each step has discretion. Beyond preventing illegal activity, these institutions can decide what kinds of money they’re willing to touch.
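The four-layer pipeline above amounts to a chain of independent vetoes: a transaction survives only if every layer waves it through. Here is a minimal sketch of that structure, with entirely hypothetical layer names and rules — no real institution’s policy is modeled:

```python
# Illustrative sketch of the payment pipeline described above: each layer
# (platform, processor, card network, banks) can independently veto a
# transaction. All categories, thresholds, and rules here are hypothetical.

GATEKEEPERS = [
    ("platform", lambda tx: tx["category"] not in {"prohibited_content"}),
    ("processor", lambda tx: tx["category"] not in {"adult", "prohibited_content"}),
    ("card_network", lambda tx: tx["risk_score"] < 0.8),  # "sole discretion"
    ("banks", lambda tx: tx["amount"] <= 10_000),
]

def route(tx):
    """Return (approved, rejecting_layer). Any single layer can block."""
    for name, allows in GATEKEEPERS:
        if not allows(tx):
            return False, name
    return True, None

# A creator in the hypothetical "adult" category is blocked at the
# processor layer, even though every other layer would have allowed it.
print(route({"category": "adult", "risk_score": 0.1, "amount": 20}))
```

The point of the sketch is that approval is conjunctive: it only takes one “no” anywhere in the chain, which is why losing a single processor relationship can cut a creator off entirely.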

“The rules set by card networks are sometimes vague,” says Dr. Val Webber, a postdoctoral researcher at Dalhousie University’s Sexual Health and Gender Research Lab. Mastercard’s June 2025 rules restrict “any Transaction that […] in the sole discretion of [Mastercard], may damage the goodwill of [Mastercard] or reflect negatively on the [brand].”

“In the sole discretion” is doing a lot of work there.

Last summer, Steam and itch.io removed or deindexed adult games after pressure from payment processors and card networks. Steam cited pressure from Mastercard, conveyed through processors like Stripe. Stripe told itch.io, “Stripe is currently unable to support sexually explicit content due to restrictions placed on them by their banking partners, despite card networks generally supporting adult content.” Stripe’s prohibited business list includes “pornography and other mature audience content (including literature, imagery, and other media) designed for the purpose of sexual gratification.”

Mastercard later denied involvement. In August 2025, the company stated, “Mastercard has not evaluated any game or required restrictions of any activity on game creator sites and platforms, contrary to media reports and allegations.”

Meanwhile, Valens saw her articles disappear from Vice. “My suspicion is that it was easy for a financial company to flag me as high risk as a punitive measure for my content, or my activism work,” she says. Attempts to obtain comment from Vice were unsuccessful.

Who Can Get Debanked?

“We have lots of data to show that people in the adult industry face financial discrimination in the form of their accounts being closed, being denied mortgages, business loans, and other banking services — despite banks often not being able to substantiate legal reasons related to these individual accounts,” says Maggie MacDonald, a PhD researcher at the University of Toronto.

The tension escalated in December 2020 when Visa and Mastercard cut ties with Pornhub, citing child sexual abuse material (CSAM). “Our adult content standards allow for legal adult activity created by consenting individuals or studios,” Mastercard said at the time. “Merchants must have controls to monitor, block and remove unlawful content from being posted.” Pornhub denied hosting illegal content and emphasized the harm to “the hundreds of thousands of models who rely on [their] platform for their livelihoods.”

But here’s the inconsistency that nags at people: card networks continue to process payments for X despite widespread reports of CSAM and non-consensual deepfake content on the platform. No sweeping financial freeze there.

Watching major platforms lose payment relationships makes smaller startups tread lightly. “We just can’t afford to lose our ability to do business with these financial companies,” Gina says. “Stripe takes only 2.9 percent from businesses they’re willing to work with, while high-risk processors willing to take on adult content can charge up to 15 percent.”

That difference can sink a company before it starts.
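The gap Gina describes compounds quickly. A back-of-the-envelope comparison using the rates she cites — 2.9 percent mainstream versus up to 15 percent high-risk — with an invented revenue figure:

```python
# Back-of-the-envelope comparison of processing fees, using the rates
# quoted above (2.9% mainstream vs. up to 15% high-risk).
# The $50,000 gross revenue figure is hypothetical.

def net_after_fees(gross, fee_rate):
    """Creator's (or platform's) take-home after the processor's cut."""
    return gross * (1 - fee_rate)

gross = 50_000  # hypothetical annual revenue in dollars

mainstream = net_after_fees(gross, 0.029)
high_risk = net_after_fees(gross, 0.15)

print(f"mainstream: ${mainstream:,.2f}")   # $48,550.00
print(f"high-risk:  ${high_risk:,.2f}")    # $42,500.00
print(f"difference: ${mainstream - high_risk:,.2f}")  # $6,050.00 per year
```

On these assumed numbers, being pushed to a high-risk processor costs over $6,000 a year on $50,000 of revenue — before rent, production costs, or platform cuts.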

“Losing a relationship with card networks is a risk payment processors can’t afford, and losing relationships with payment processors is a risk that platform websites can’t afford,” explains Webber. “In the end, the responsibility of ensuring their content stays within the lines of these oftentimes unclear rules trickles down to each individual creator. Because ultimately, content creators are more expendable to platforms than payment processors and card networks.”

One justification often cited is chargebacks — when customers reverse credit card transactions. Gina isn’t convinced.

“Locking out entire industries makes less and less sense as fraud detection technology advances,” she says. “Payment processors and card networks already have processes to step in when an individual business has a high rate of chargebacks, there’s no reason to block out a whole industry.” Mastercard recently announced expanded generative AI fraud-detection tools, building on already sophisticated monitoring systems.

“We also haven’t seen the claim of high-chargebacks in adult content substantiated anywhere in terms of measured data,” adds MacDonald. “As a researcher, that makes me suspicious of the criteria these companies are using behind the scenes.”

The Evolving Landscape of Banking Regulations

In February 2025, the Free Speech Coalition filed a statement with the U.S. House Committee on Financial Services, calling for due process protections, objective risk assessments, and explicit recognition that lawful adult businesses do not inherently present financial crime risk. Blocking entire industries without individualized evaluation, the statement argued, is regulatory overreach with serious implications for free speech.

Multiple efforts are underway in the United States to limit financial institutions from denying service for reasons beyond legal violations. In August 2025, President Donald Trump issued an executive order directing regulators to investigate and reverse politically motivated debanking. Bank regulators have begun removing “reputational risk” from compliance criteria, and proposed Senate legislation would impose civil fines on banks and card networks that avoid entire categories of customers.

“Card networks and payment processors began by blocking pornography, but they’ve moved into other online industries as well,” says Webber. “The line in the sand continues to shift, and it has recently expanded to video game creators and streamers as well. We don’t know how these rules might evolve, and what type of online content might be next.”

Valens has spent months urging customers to call Mastercard, Visa, PayPal, and Stripe to question purchase restrictions and account freezes. Visa points to its policies for combating illegal activity; PayPal requires pre-approval for adult materials, similar to tobacco; Stripe states it does not support adult content.

“Private companies have been deputized to decide how we can earn and spend our money,” says MacDonald. “Anyone who is ideologically misaligned with any of these companies faces the risk of losing their livelihood.”

That’s the part that lingers.

It’s not just about porn, or games, or activism. It’s about the invisible committee that votes on your transactions — and whether one day, without warning, they decide you don’t get a vote at all.

Read More »

Conservative Lawmakers Push Porn Taxes — Critics Call It Unconstitutional Speech Policing

Taxes

The war on porn doesn’t look like a war anymore. It looks like a line item on a tax form.

As age-verification laws keep tightening their grip on the adult industry — and, quietly, on the broader idea of free speech online — a Utah lawmaker has proposed something new. Or maybe not new. Just sharper. A bill introduced last month would slap a tax on porn sites operating in the state.

Introduced by state senator Calvin Musselman, a Republican, the bill would impose a 7 percent tax on total receipts “from sales, distributions, memberships, subscriptions, performances, and content amounting to material harmful to minors that is produced, sold, filmed, generated, or otherwise based” in Utah. If passed, the bill would go into effect in May and would also require adult sites to pay a $500 annual fee to the State Tax Commission. Per the legislation, the money raised by the tax would be used by Utah’s Department of Health and Human Services to provide more mental health support for teens.
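As a rough illustration of what the proposed levy would mean in practice, here is the liability for a hypothetical Utah-based operation. The 7 percent rate and $500 fee come from the bill; the revenue figure is invented:

```python
# Hypothetical liability under the proposed Utah bill: 7% of total
# receipts plus a flat $500 annual fee. The receipts figure is invented.

TAX_RATE = 0.07
ANNUAL_FEE = 500

def utah_liability(total_receipts):
    """Annual tax owed on gross receipts under the proposed scheme."""
    return total_receipts * TAX_RATE + ANNUAL_FEE

# $200,000 in receipts -> $14,500 owed, before ordinary income taxes.
print(round(utah_liability(200_000), 2))
```

Note that the tax applies to gross receipts, not profit, so it stacks on top of the income taxes adult businesses already pay.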

Musselman did not respond to a request for comment.

There’s a certain rhythm to this moment in American politics. Conservative lawmakers across the country are circling adult content with renewed intensity. In September, Alabama became the first state to impose a porn tax on adult entertainment companies (10 percent) after passing age-verification mandates requiring users to upload ID before viewing explicit material. Pennsylvania lawmakers are weighing a bill that would add a 10 percent tax on “subscriptions to and one-time purchases from online adult content platforms,” even though digital products are already subject to a 6 percent sales tax, two state senators wrote in a memo in October. Arizona floated a similar idea back in 2019, when state senator Gail Griffin proposed taxing adult content distributors to help fund the border wall during Donald Trump’s first term. Meanwhile, 25 states have passed some form of age-verification law.

It’s not just about taxes. For years, efforts to criminalize or restrict sex work have ebbed and flowed, usually intensifying during moments of heightened online surveillance and censorship. But targeted taxes have struggled to gain widespread traction. Why? Because their legality is murky at best.

“This kind of porn tax is blatantly unconstitutional,” says Evelyn Douek, an associate professor of law at Stanford Law School. “It singles out a particular type of protected speech for disfavored treatment, purely because the legislature doesn’t like it—that’s exactly what the First Amendment is designed to protect against. Utah may not like porn, but as the Supreme Court affirmed only last year, adults have a fully protected right to access it.”

Utah, Alabama, and Pennsylvania are among 16 states that have adopted resolutions declaring porn a public health crisis. “We realize this is a bold assertion not everyone will agree on, but it’s the full-fledged truth,” Utah governor Gary Herbert tweeted in 2016 after signing the resolution. Utah’s long-running campaign against explicit material stretches back decades. In 2001, it became the first state to appoint an obscenity and pornography complaints ombudsman — a position colloquially known as the “porn czar.” That role was eliminated in 2017.

The industry, for its part, has been trying to keep up with the shifting rules. “Age restriction is a very complex subject that brings with it data privacy concerns and the potential for uneven and inconsistent application for different digital platforms,” Alex Kekesi, vice president of brand and community at Pornhub, said in a previous interview. In November, the company urged Google, Microsoft, and Apple to implement device-based age verification across their operating systems and app stores. “We have seen several states and countries try to impose platform-level age verification requirements, and they have all failed to adequately protect children.” To comply with new age gate mandates, Pornhub has blocked access in 23 states.

Critics argue that these policies were never truly about protecting children in the first place. In 2024, a video leaked by the Centre for Climate Reporting showed Russell Vought, a Trump ally and Project 2025 coauthor, describing age verification laws as a “back door” to a federal porn ban.

There’s a strange irony here. Platforms like OnlyFans and Pornhub helped mainstream digital sex work, bringing it out of the shadows and into subscription dashboards and creator analytics. But that visibility has made it easier to regulate, track, and now tax. As more states experiment with taxes tied specifically to sexual content, creators — not lawmakers — are likely to feel the immediate impact.

The skewed ideology of cultural conservatism that is taking shape under Trump 2.0 wants to punish sexual expression, says Mike Stabile, director of public policy at the Free Speech Coalition, a trade association for the adult industry in the US. “When we talk about free speech, we generally mean the freedom to speak, the ability to speak freely without government interference. But in this case, free also means not having to pay for the right to do so. A government tax on speech limits that right to those who can afford it.”

OnlyFans states that it complies with tax requirements in the jurisdictions where it operates, and creators are responsible for managing their own tax affairs. Pornhub, which is currently blocked in Utah and Alabama, did not respond to a request for comment.

Douek notes that while the Supreme Court recently upheld Texas’s age-verification law — allowing states to regulate minors’ access to explicit material — “a porn tax does nothing to limit minors’ access to this speech—it simply makes it more expensive to provide this content to adults.” A 2022 report from Common Sense Media found that 73 percent of teens aged 13 to 17 have watched adult content online. Young people often encounter explicit material on social media platforms such as X and Snap, sometimes intentionally, often accidentally. A survey last year from the UK’s Office of the Children’s Commissioner reported that 59 percent of minors are exposed to porn unintentionally, primarily via social media, up from 38 percent the year before.

In Alabama, as in Utah’s proposal, tax revenue is earmarked for behavioral health services, including prevention, treatment, and recovery programs for young people.

Alabama state representative Ben Robbins, the bill’s Republican sponsor, said in an interview last year that adult content was “a driver in causing mental health issues” in the state. It’s an argument that surfaces again and again in debates about a nationwide porn ban. Some research suggests adolescent exposure may correlate with depression, lower self-esteem, or normalization of violence, but health professionals have not reached consensus.

With lawmakers reframing the conversation around underage harm, Stabile argues that the principle at stake is bigger than porn itself. Content-specific taxes on speech, he notes, have repeatedly been struck down as unconstitutional censorship.

“What if a state decided that Covid misinformation was straining state health resources and taxed newsletters who promoted it? What if the federal government decided to require a costly license to start a podcast? What if a state decided to tax a certain newspaper it didn’t like?” he says. “Porn isn’t some magical category of speech separate from movies, streaming services, or other forms of entertainment. Adult businesses already pay taxes on the income they earn, just as every other business does. Taxing them because of imagined harms is not only dangerous to our industry, it sets a dangerous precedent for government power.”

And that’s the quiet part that lingers.

When governments start deciding which kinds of speech deserve a surcharge, the debate stops being about porn. It becomes about who gets to speak freely — and who has to pay extra for the privilege.

Read More »

The Human Cost of Overregulation by Morley Safeword

Age verification

Over the decades I’ve worked in the adult entertainment business, it has struck me many times how concerned the industry’s critics appear to be about the welfare of those of us who work in the industry – and how quickly that concern turns to consternation and scorn, should we insist that we’re doing what we do gladly and of our own free will.

“Nonsense,” the critics say, “these poor souls only think they are engaging in this depravity willingly; the truth is they have been brainwashed, coerced, cajoled and manipulated into believing they want to participate in this filth.”

Granted, not a lot of people have spilled ink along these lines to fret over the wellbeing of freelance writers like me. I think we’re counted as being among the exploiters, rather than the exploited, or perhaps as enablers of exploitation. Still, there’s no denying I derive my living, meager though it may be, from adult entertainment, even if all I do is write about it, rather than perform in or film it.

While many of the regulations aimed at the adult industry are couched as attempts to protect minors from the alleged harm of viewing pornography, when these measures are discussed by their proponents, “success” is often defined as making the adult industry retreat from their jurisdiction altogether. If a site like Pornhub blocks visitors from an entire state, including all the adults in that state who are still legally entitled to access the site even under newly established age verification mandates, those who cooked up the laws often describe this development as a sign the law is “working.” As I’ve written before, the chilling effect is a feature of these measures, not a bug.

By the same token, if a new law or regulation makes it harder for adult content creators to make their own movies, distribute their own photos or perform live on webcams, that too is something to be celebrated by the legislators and activists who champion those regulations.

Gone is all thought or discussion of the wellbeing of adult content creators and performers, once the potential cause of harm is the law itself. This holds true of purported “anti-trafficking” statutes. While sex workers themselves largely oppose measures like FOSTA/SESTA and say the law has made them less safe, not more, the proponents and sponsors of such legislation don’t want to hear it. Yes, these paternalistic politicos and crusading critics will protect these wayward adults from themselves, even if it kills them.

I can only imagine that if a state legislator from any of the dozens of states that have passed age verification requirements were to learn that adult content creators (and the platforms that host their work) are having a harder time earning a living under these new regulatory schemes, their response would be brief and callous: “Good,” they’d probably say, “now they can go out and look for more respectable work!”

And what happens when former porn performers do find work in other fields? The stigma of porn follows them. They get fired. They are told their mere presence in a classroom is disruptive. They are hounded on social media. They are treated like pariahs by the very people who supposedly care about their welfare.

A law or regulation can be well-intended and still do harm. I don’t doubt some of the politicians involved in crafting age-verification laws and other purportedly protective regulations believe they are doing things in the best interests of both minors and the adults who work in porn, or in the broader world of sex work. But it’s hard to believe they truly care about the latter two when there’s so little thought given to the potential negative impact on them during the crafting of these laws.

As more states toy with the idea of establishing a “porn tax,” will any of them pause to consider the impact on the human beings targeted by such taxes? I’d strongly advise against holding your breath while waiting for that kind of concern to be expressed.

Read More »