
Debanking Explained: How Politics, Policy, and Perception Shape Account Closures

The Cato Institute has published a report on debanking. You can find it here: https://www.cato.org/policy-analysis/understanding-debanking-evaluating-governmental-operational-political-religious

Here’s what the report is all about:

Debanking can be a frustrating and deeply unsettling experience. One day everything seems fine, and the next, a notice arrives giving just 30 days to withdraw funds and find a new financial institution. Confusion quickly turns into anxiety. Bills were paid, nothing appeared out of the ordinary — so what changed? A call to the bank rarely brings clarity, and the response is often the same: no further details can be provided. Customers are left with more questions than answers.

On the other side of the conversation, bankers are frequently constrained by strict confidentiality requirements. Even frontline staff may not have access to the underlying reasons for an account closure. Financial institutions operate within a framework of anti–money laundering, know your customer, and countering the financing of terrorism regulations — commonly referred to as AML, KYC, and CFT. While these rules are standard practice within the industry, they remain largely invisible to the public, creating a disconnect that fuels frustration on both sides.

For those looking to address the growing concern around debanking, some argue that meaningful change will require greater transparency. That could mean reconsidering the confidentiality that surrounds account closures, removing reputational risk as a regulatory factor, and reevaluating the Bank Secrecy Act framework that effectively places financial institutions in the role of investigative gatekeepers.


Colorado Democrats Propose Bill to Decriminalize Consensual Sex Work

Colorado lawmakers are once again stepping into one of the most complicated policy debates out there — how the law should treat sex work. This time, a group of Democratic legislators has introduced a bill that could decriminalize “commercial consensual sex among consenting adults,” a move that’s already stirring strong reactions on both sides of the aisle.

State Sens. Nick Hinrichsen of Pueblo County and Lisa Cutter of Jefferson County joined House Reps. Lorena Garcia and Rebekah Stewart in sponsoring SB26-097. The measure aims to decriminalize “commercial consensual sex among consenting adults,” a phrase that sounds clinical but carries decades of debate, stigma, and lived reality behind it.

If passed, the bill would repeal several existing criminal offenses, including prostitution, soliciting for prostitution, keeping a place of prostitution, patronizing a prostitute, and “a prostitute making a display.” It would also eliminate charges tied to pandering when the act involves knowingly arranging or offering to arrange circumstances that allow someone to engage in prostitution.

That doesn’t mean everything disappears from the criminal code. The legislation makes clear that penalties for pandering involving menacing or intimidation would remain, with fines ranging from $5,000 to $10,000 in addition to other court-imposed penalties. Pimping — defined as living off another person’s earnings from commercial sexual activity — would continue to be classified as a class three felony.

Protections for minors are also spelled out. Young people who are victims of human trafficking would retain immunity from criminal liability or juvenile delinquency proceedings. Law enforcement officers who suspect trafficking involving a minor would still be required to report it immediately, and trafficking victims seeking help through emergency services or medical providers would remain shielded from prosecution for prostitution.

Licensing, however, introduces another layer of scrutiny. Criminal history would play a significant role in determining whether someone qualifies for a license, with the bill stating, “For purposes of determining good moral character, the local licensing authority may consider the criminal record of all applicants, including, but not limited to, any conviction or guilty plea to a charge based on acts of dishonesty, fraud, deceit, OR sexual misconduct of any kind, whether or not the acts were committed in this state.”

The proposal also envisions formal contracts for commercial sex transactions. Under the bill, anyone purchasing services would be required to sign a written agreement outlining the names and addresses of both parties, the type of services, duration, payment amount, and any special conditions. Escort bureaus would keep copies of these contracts, submit them to local licensing authorities, and treat them as open public records — a detail that feels likely to spark its own set of privacy debates.

Local governments would have the option to adopt resolutions authorizing the licensure of massage facilities, though applicants with convictions related to human trafficking or money laundering would be denied licenses outright.

Supporters of decriminalization often point to research suggesting improved health and safety outcomes for sex workers when criminal penalties are lifted. Reviews of studies across high-income countries indicate that decriminalized environments are linked to better access to health resources and safer practices, while criminalized settings have shown higher rates of drug use and lower condom utilization.

Opposition has been swift and vocal. Colorado Republicans criticized the measure as “a green light for exploitation, commodifying bodies, and fueling human trafficking in a state already ranking high for it.” The argument taps into longstanding fears about whether legalization protects workers or inadvertently expands underground activity.

Some critics also pushed back against what they see as a statewide mandate overriding local autonomy. Concerns were raised that municipalities would lose the ability to enforce their own restrictions or opt out of the framework entirely.

Pitkin County Republicans questioned the timing and priorities behind the bill, suggesting it was an unusual approach to addressing fiscal challenges. In a Facebook post, the group asked, “Is this what they should be focused on? Do they think that the taxes from legal prostitution will be the answer to our $850 million budget deficit? Do we want our state to be funded by drugs and sex? If passed, we will join Nevada and be one of the only two states that have legalized prostitution. And by the way, the bill as written does not allow cities or counties to have their own criminal bans on such activity.”

Weld County Commissioner Scott James echoed concerns about local authority, accusing lawmakers of a “power grab” and warning that local governments would ultimately be left to manage the fallout.

Right now, Nevada remains the only state where prostitution is fully legalized, though even there it’s confined to certain counties and tightly regulated through licensed brothels. Maine took a partial step in 2023 by decriminalizing aspects of sex work, hinting at a gradual shift rather than a sweeping national change.

Data from the National Human Trafficking Hotline adds another layer to the conversation. In 2024, the highest numbers of identified cases were reported in California, Texas, Florida, New York, and Illinois, with Colorado ranking 20th nationwide — a statistic that both supporters and opponents interpret through very different lenses.

If SB26-097 clears the legislative process, it would take effect on July 1, 2026. And like most laws that touch on morality, economics, and personal autonomy all at once, its impact would likely ripple far beyond the text itself. Sometimes legislation isn’t just about rules — it’s about which stories a society chooses to believe, and which ones it’s finally ready to reconsider.


Wisconsin Age Verification Bill Advances Without Anti-VPN Measures

There’s always that moment in a legislative debate where the fine print becomes the real story. Not the headline promise, not the talking points — but the quiet clause that disappears or survives. In Wisconsin this week, that moment revolved around VPNs, privacy, and the ever-evolving chess match between lawmakers and the internet.

The Wisconsin state Senate moved forward with a bill requiring adult websites to verify users’ ages, but not without a notable shift. Lawmakers approved an amendment stripping out language that would have forced sites to block virtual private network traffic — a detail that felt small on paper but huge in practice.

Critics of state-level age verification laws often point to a glaring reality: the internet doesn’t respect borders. Because these laws apply only within a single state, users can simply flip on a VPN and appear somewhere else entirely. That workaround has become almost folklore at this point, and the growing media spotlight on VPN use has nudged lawmakers across the country to explore ways of closing that loophole.

Originally, Assembly Bill 105 tried to do exactly that. The measure included a requirement that operators “prevent persons from accessing the website from an internet protocol address or internet protocol address range that is linked to or known to be a virtual private network system or virtual private network provider.” But one of the bill’s co-sponsors later introduced an amendment removing the provision — a move that quietly reshaped the bill’s practical impact.
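Under the stricken provision, a site would have had to screen each visitor's IP address against addresses "linked to or known to be" VPNs. A minimal sketch of what that check might look like, assuming access to a feed of VPN address ranges (the ranges below are reserved documentation blocks standing in for a real vendor database, and the function name is invented for illustration):

```python
import ipaddress

# Hypothetical feed of CIDR ranges flagged as VPN exit points. A real
# implementation would pull these from a commercial VPN-detection vendor,
# not a hard-coded list; these are reserved documentation ranges.
KNOWN_VPN_RANGES = [
    ipaddress.ip_network("198.51.100.0/24"),
    ipaddress.ip_network("203.0.113.0/24"),
]

def is_known_vpn(ip_string: str) -> bool:
    """Return True if the address falls inside any listed VPN range."""
    addr = ipaddress.ip_address(ip_string)
    return any(addr in net for net in KNOWN_VPN_RANGES)

print(is_known_vpn("203.0.113.7"))  # inside a listed range -> True
print(is_known_vpn("192.0.2.1"))    # outside every listed range -> False
```

Even a sketch this small hints at the enforcement problem: the check is only as good as the range list, and VPN providers rotate addresses constantly.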

On Wednesday, the Senate agreed to that amendment. The Assembly followed suit almost immediately, approving the revised language and clearing the path for the bill — now without its VPN clause — to head to the governor’s desk.

Interestingly, the VPN issue barely surfaced during the session’s discussion. Instead, the debate that did emerge centered on privacy and data concerns. Two senators, one Democrat and one Republican, voiced opposition to the broader bill, warning about intrusiveness, data retention, and the uneasy feeling many people have when personal information becomes the price of access.

Still, the VPN question isn’t going away. Whether Wisconsin chose to sidestep it or simply delay the conversation, the broader tension remains unresolved. Other states are watching closely, and some are already taking a more aggressive stance.

West Virginia’s SB 498, for example, includes language that would explicitly prohibit platforms from allowing users to bypass age checks through VPNs, proxies, or other anonymizing tools. That proposal is waiting for its first committee hearing, but its existence alone signals where parts of the legislative map may be heading.

Meanwhile, Indiana has taken the fight into the courts. The state is suing Aylo, arguing that the company and its affiliates failed to prevent users from skirting age verification through VPNs and location spoofing. Indiana’s statute doesn’t explicitly mention VPNs, yet the lawsuit claims the company is still in violation “because Indiana residents, including minors, can still easily access the Defendants’ websites with a VPN IP or proxy address from another jurisdiction or through the use of location spoofing software.”

If West Virginia’s bill gains traction — or if Indiana prevails in court — it could ripple outward, encouraging other states to push harder on the VPN front. And that’s the strange rhythm of these debates: each state becomes a test case, each lawsuit a signal flare. The technology moves fast, the laws try to catch up, and somewhere in between sits the user, toggling settings and wondering how much privacy is left in the process.


Oklahoma Weighs New Bill to Require Licensing for Strip Club Performers

There’s something about the quiet march of legislation that rarely feels quiet. It slips in through committee hearings and procedural language, but underneath it all, you can almost hear the ripple effect coming. That’s the vibe surrounding a new proposal moving through Oklahoma — one aimed squarely at strip clubs, performers, and the rules that shape their livelihoods.

State lawmakers in the Oklahoma House of Representatives are weighing a measure that would require licensing for strip clubs and raise the legal minimum age to work as an “exotic entertainer.” On paper, it reads like a technical update. In reality, it touches careers, identities, and the already complicated world of adult entertainment work.

The proposal, known as the “Entertainer Safety and Verification Act,” cleared committee with bipartisan and unanimous support. Legislative records show a surprisingly unified front — the kind that usually signals momentum, whether people are ready for it or not.

House Bill 3832 was reviewed by the House Business Committee earlier this week. Republican state Rep. Stan May, serving as majority caucus chair, introduced the measure with a stated goal of countering “human trafficking” within adult entertainment venues. It’s a familiar justification, one that tends to carry emotional weight and political traction in equal measure.

Under the bill, an “exotic entertainer” is broadly defined as anyone performing in a sexually oriented business — from dancers to other erotic performers. The legislation would require performers to obtain a license from the Oklahoma Alcoholic Beverage Laws Enforcement (ABLE) Commission and, notably, be at least 21 years old.

That age requirement alone represents a significant shift. Currently, Oklahoma allows individuals as young as 18 to work as exotic entertainers. To secure a license under the proposed rules, a performer would need to be a U.S. citizen, meet the new age threshold, and have no convictions related to human trafficking, indecent exposure, or prostitution-related offenses. Performing without a license could lead to misdemeanor charges, carrying penalties of up to a $500 fine, a year in county jail, or both.

Repeat violations raise the stakes. Subsequent offenses would still be classified as misdemeanors but could result in fines up to $1,000 and the same potential jail time. Meanwhile, businesses employing unlicensed performers — or failing to keep the required records — would face administrative fines of $5,000 per incident.

Those penalties escalate quickly. Additional violations could trigger $10,000 fines per offense and the suspension of all ABLE-issued licenses for at least a year, a consequence that could effectively shut down operations and ripple through staff and performers alike.

The legal exposure doesn’t stop at fines. Owners and managers who knowingly employ unlicensed entertainers or fail to maintain proper documentation could face felony charges, with penalties of up to $1,000 and prison sentences ranging from one to three years. Further violations would increase both fines and prison time, and offenders would be barred from owning or operating adult clubs featuring licensed dancing.

Oklahoma isn’t alone in this approach. Florida enacted similar legislation, raising the minimum age for dancers from 18 to 21 and attaching criminal penalties for noncompliance. Laws like these tend to arrive in waves, often framed as protective measures but leaving lingering questions about autonomy, economic impact, and unintended consequences.

As it stands, HB 3832 appears poised to move forward through the state legislature, buoyed by bipartisan backing and the expectation of amendments along the way. And that’s the thing about bills like this — they rarely land exactly as introduced, yet the direction they point in is unmistakable. A tightening grip, a shifting standard, and a reminder that in the adult industry, change doesn’t knock politely. It just shows up and rearranges the room.


Shein Under EU Investigation Over CSAM and ‘Child-Like’ Doll Claims

BRUSSELS—The European Commission announced Monday that it will initiate a Digital Services Act investigation into popular e-commerce platform Shein amid claims that the company hosts “adult” and “age-restricted” content and engages in the marketing and sale of “child-like sex dolls” and child sexual abuse material (CSAM).

The European Commission, the executive arm of the European Union (EU), is responsible for enforcing the Digital Services Act (DSA), a regulatory framework for online safety that includes age verification and measures to counter illegal activities. It’s the rulebook for the modern internet marketplace—less Wild West, more guarded storefront. If something wouldn’t fly in a physical shop window, the thinking goes, it shouldn’t get a free pass online either.

“In the EU, illegal products are prohibited—whether they are on a store shelf or on an online marketplace,” said Henna Virkkunen, the executive vice president for tech sovereignty, security and democracy, in a press statement announcing the Shein investigation. “The Digital Services Act keeps shoppers safe, protects their well-being, and empowers them with information about the algorithms they are interacting with. We will assess whether Shein is respecting these rules and their responsibility.”

There’s something striking about that phrasing—store shelf or online marketplace. It’s a reminder that regulators are increasingly done pretending the internet is some separate universe. If anything, it’s more powerful. More intimate. And when minors are allegedly exposed to content they shouldn’t be seeing? That’s when patience runs thin.

Much of the investigation stems from French law enforcement efforts, including claims from a government watchdog that Shein and other popular fast-fashion apps, namely Temu and AliExpress, were exposing minors to pornography. French authorities have been circling these platforms for a while now, particularly as they balloon in popularity among younger users who treat them like digital shopping malls.

Shein was scrutinized by the French government as it was about to open its first brick-and-mortar location in central Paris at the BHV department store. There’s an irony there—stepping into the physical world just as digital scrutiny tightens. It’s almost cinematic. One foot in glossy retail space, the other in regulatory quicksand.

Shein also briefly faced a ban in French digital spaces but quickly avoided that outcome, as reported by the wire service Agence France-Presse via Deutsche Welle. The reprieve, at least for now, doesn’t mean the heat is off. It just means the next chapter is unfolding elsewhere.

“The Commission will now carry out an in-depth investigation as a matter of priority,” the EU says. “The opening of formal proceedings does not prejudge the outcome.”

That line—does not prejudge the outcome—feels carefully calibrated. It’s the regulatory equivalent of keeping a poker face. Serious, but measured. Accusations are one thing. Findings are another.

Other accusations made against Shein by the bloc’s leadership include addictive use, age-inappropriate design, and a lack of transparency, as mandated by the DSA. Those are broader critiques, and they hint at something bigger than a single listing or claim. They point to how platforms are built—how they nudge, how they design, how they quietly steer behavior. Anyone who’s ever lost an hour to infinite scroll knows exactly what that means.

“The DSA does not set any legal deadline for bringing formal proceedings to an end,” notes the EU. “The duration of an in-depth investigation depends on several factors, including the complexity of the case, the extent to which the company concerned cooperates with the Commission, and the exercise of the rights of defense.”

In other words: this could take a while.

And in that space—between allegation and outcome—something bigger hangs in the air. Not just the future of one platform, but the question of how far governments are willing to go to police the digital storefronts we wander through every day.

Because once regulators start treating online marketplaces exactly like physical ones, there’s no putting that genie back in the bottle.


Good News: Sometimes Adult Businesses DO Get Treated Like Everyone Else by Stan Q. Brick

In a country where it seems like lawsuits get filed at the drop of a hat – particularly if the hat is quite hard, quite heavy and falls on someone’s toes, causing both physical injury and extreme emotional distress – the fact that our courts do make plaintiffs jump through at least a minimal set of hoops can be something of a comfort.

For example, if I get into a fender bender with someone in California, but that person lives in New York, they probably can’t haul me into court in New York to make me face a lawsuit there, simply because New York happens to be the plaintiff’s state of domicile. They’d likely have to sue me in California, due to the way the courts handle the question of personal jurisdiction.

As you may have heard, a district court in Kansas applied this logic in dismissing a couple of lawsuits filed against companies that operate adult websites, in which a plaintiff alleged that those sites were not complying with the state’s age verification requirements for adult sites.

Among other things, the judge in the case, U.S. District Judge Holly Teeter, wrote in her decision dismissing a lawsuit against Titan Websites that “merely intending that users accessing its content be able to do so from a wide geographic area is not the same as purposefully directing one’s activities at a forum.”

“Technical steps taken to make a universally accessible website easier for all users to access no matter where they are located is no more purposeful direction than the act of setting up the website in the first place,” Teeter added. “And just like the act of setting up a website, were the indiscriminate use of a CDN or other technologies to indiscriminately facilitate content delivery enough, ‘then the defense of personal jurisdiction, in the sense that a State has geographically limited judicial power, would no longer exist.’”

Teeter also wrote that her reasoning “does not mean that a website owner’s use of a CDN is never relevant” and “does not mean that a website owner’s use of a CDN could never show purposeful direction.”

“It does mean that more is needed to determine how the CDN is used and whether the CDN is being used to target a forum or an immediate region of which the forum is a part,” Teeter wrote. “The Court need not dissect the contours to resolve this case. Here, Plaintiff simply alleges that a CDN is being used and that the CDN has servers near the forum because logically it must. Defendant responds with evidence that it uses a third-party web-hosting service and that it does not know or care where the CDNs are located. This record is not enough to carry Plaintiff’s admittedly light burden.”

The dismissal of this case, as well as Teeter’s decision dismissing an identical case against a company called ICF Technology, is certainly good news for other adult businesses that might find themselves haled into court over alleged violations of a state’s age verification law. These rulings are not, of course, the end of the story.

The plaintiff is likely to appeal these decisions, whereupon the matter will go to the Tenth Circuit Court of Appeals. I’m no lawyer and I don’t have much to offer in terms of a prediction as to how the Tenth Circuit might ultimately rule. I just know that I don’t have much confidence in how the next court up the chain, the U.S. Supreme Court, might rule, should they take up the question.

Given that SCOTUS found the age verification law passed in Texas to be constitutional, it wouldn’t surprise me one bit if the Court decided that merely being accessible in a state creates a sufficient “minimum contact” with any given state for a court there to assert personal jurisdiction.

Still, at least for the time being, Teeter’s decisions represent something of a victory for the porn side of the War on Porn. Whether that victory is lasting or ephemeral remains to be seen. Fingers crossed.


When Women’s Wellness Gets Labeled “Adult,” the Bank Account Disappears

It starts with a quiet email. No warning. No phone call. Just a notice that your account is under review—or worse, closed.

That’s what’s happening to women’s health and sexual wellness companies across the United Kingdom and Europe. Not because they’ve broken laws. Not because they’ve done anything shady. But because somewhere in a compliance department, someone ticked the wrong box and labeled them “adult content.”

New research from two U.K.-based advocacy groups, CensHERship and The Case For Her, digs into what’s really going on. And it’s uncomfortable reading. The report argues that women’s health innovators are being misclassified in financial systems, triggering debanking and account shutdowns that can stall or even sink young companies.

“What we find is that misclassification, over-compliance, cultural discomfort, and outdated policy language combine to create structural barriers for women’s health innovation, and that the identified structural barriers tend to fall into two forms,” the research explains.

Those two forms? “Misclassification” and “misunderstanding.”

Misclassification is described as “where women’s health and sexual wellbeing are misread as adult content. This is the most visible and well-documented form of bias.”

Misunderstanding, on the other hand, is “where women’s health is overlooked as too new, complex, or unfamiliar to fit existing risk templates. These cases are harder to surface because they are often resolved quietly or never formally recorded.”

That second one hits differently. It’s the kind of bias that doesn’t shout. It just shrugs. Too new. Too complex. Too awkward. Next.

Some of the companies affected are household names in the space, including SheSpot, a widely recognized brand. These aren’t fringe operators. They’re mainstream wellness businesses trying to build products around bodies that, frankly, have been underserved for generations.

The report goes further: “Because most financial institutions have never explicitly defined women’s health or FemTech within their risk frameworks, systems default to the nearest analogue—typically adult content, vice categories, or other ‘sensitive’ sectors.”

That line sticks. Systems default. That’s how it happens. No villain twirling a mustache. Just outdated templates and risk models built in eras when women’s sexual health was barely discussed in polite company. So instead of creating a new category, institutions drag these companies into old ones—adult content, vice, high-risk.

It’s lazy. It’s structural. And it’s expensive.

Both CensHERship and The Case For Her argue that outdated classification systems, cultural discomfort, and unconscious bias are creating real barriers to growth. Being labeled “adult content” or “adult services” doesn’t just sound insulting—it places these businesses in a “high-risk sector” alongside firearms manufacturing and tobacco cultivation and marketing.

Think about that for a second. A company developing pelvic health tools or hormone-tracking tech ends up sitting in the same risk bucket as cigarette production.

This isn’t just semantics. Risk labels determine whether you can open a bank account, process payments, attract investors, or scale internationally. When the system quietly decides your innovation is morally adjacent to vice, you feel it everywhere.

Honestly, it raises a bigger question: if financial institutions can’t tell the difference between sexual wellness and adult entertainment, what does that say about the frameworks we’re still operating under?

Maybe the real issue isn’t that women’s health is “too new.” Maybe it’s that the systems judging it are too old.


Adult Creators Keep Getting Debanked — And the Fallout Goes Far Beyond Them

Your bank may never send you a memo about it, but it’s quietly shaping your life.

Every time you click “buy now,” a small army of institutions decides whether that purchase gets to exist. And for adult creators, that army has been steadily tightening its grip. For years, people in the industry have been warning about financial discrimination and debanking — the sudden closure of accounts, the polite but devastating “we can no longer do business with you.” It’s happening more often now. And it’s happening quietly.

“I don’t know what could happen next or when it might happen,” says adult VTuber, journalist, and activist Ana Valens. In just two weeks last November, nearly every platform she relied on either removed her content or suspended her outright. “While my Patreon and Ko-fi were reinstated, I’ve spent the past two months waiting for the other shoe to drop — another Patreon ban, my PayPal deactivated, and so on.” She reached out for explanations. Most platforms couldn’t clearly articulate how she’d violated their terms. Ko-fi didn’t respond until repeated messages finally led to reinstatement.

That kind of uncertainty lingers. It’s like walking on ice that might crack at any moment.

“Deplatforming and debanking are an occupational hazard for any adult content creator,” says Gina, a co-founder of PeepMe, a startup that set out to build a worker-owned creator marketplace. PeepMe was imagined as an alternative to OnlyFans and Patreon — a space where creators could hold equity, elect a democratic board, and receive quarterly profit-sharing dividends.

Gina requested that a pseudonym be used, given her continued work adjacent to the adult industry and the very real fear of financial fallout. “Even still, I’ve never seen someone banned on so many sites before [as Ana has been],” she says.

And it’s not just adult creators feeling the pressure. Companies in oil and gas, cryptocurrency, tobacco, and firearms have also raised concerns about politically motivated debanking. The pushback has grown loud enough that U.S. regulators are now stepping in, attempting to rein in financial discrimination.

Who’s Blocking My Buying?

When you make an online purchase, your money doesn’t travel in a straight line. It passes through layers of gatekeepers. The pipeline often looks like this:

  1. Platform (merchant) websites: where creators earn income — YouTube, Patreon, Etsy, DoorDash, Steam.

  2. Payment processors: companies that route the transaction between card networks and banks — PayPal, Stripe.

  3. Card networks: Visa, American Express, Mastercard — the rule-makers that standardize how buyers and sellers interact.

  4. Your bank and the seller’s bank: Wells Fargo, Bank of America, and so on.

Each step has discretion. Beyond preventing illegal activity, these institutions can decide what kinds of money they’re willing to touch.
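The chain above can be sketched as a series of veto points, each applying its own policy before the money moves on. Everything here is illustrative: the category names, rules, and function are invented for the sketch, not any real institution's logic.

```python
# Each gatekeeper in the payment pipeline gets a chance to decline the
# transaction before it reaches the next layer. The policies below are
# deliberately toy rules, standing in for each layer's real (and often
# opaque) compliance criteria.
GATEKEEPERS = [
    ("platform",     lambda tx: tx["category"] != "banned-by-platform"),
    ("processor",    lambda tx: tx["category"] != "sexually-explicit"),
    ("card_network", lambda tx: tx["category"] != "brand-damaging"),
    ("bank",         lambda tx: tx["amount"] <= 10_000),
]

def route(tx: dict) -> tuple[bool, str]:
    """Pass a transaction through every gatekeeper in order; any veto ends it."""
    for name, allows in GATEKEEPERS:
        if not allows(tx):
            return False, f"declined by {name}"
    return True, "approved"

print(route({"category": "books", "amount": 20}))
print(route({"category": "sexually-explicit", "amount": 20}))
```

The second call never reaches the card network or the bank: the first layer to object wins, which is why creators often can't tell *which* institution actually shut them out.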

“The rules set by card networks are sometimes vague,” says Dr. Val Webber, a postdoctoral researcher at Dalhousie University’s Sexual Health and Gender Research Lab. Mastercard’s June 2025 rules restrict “any Transaction that […] in the sole discretion of [Mastercard], may damage the goodwill of [Mastercard] or reflect negatively on the [brand].”

“In the sole discretion” is doing a lot of work there.

Last summer, Steam and itch.io removed or deindexed adult games after pressure from payment processors and card networks. Steam cited pressure from Mastercard, conveyed through processors like Stripe. Stripe told itch.io, “Stripe is currently unable to support sexually explicit content due to restrictions placed on them by their banking partners, despite card networks generally supporting adult content.” Stripe’s prohibited business list includes “pornography and other mature audience content (including literature, imagery, and other media) designed for the purpose of sexual gratification.”

Mastercard later denied involvement. In August 2025, the company stated, “Mastercard has not evaluated any game or required restrictions of any activity on game creator sites and platforms, contrary to media reports and allegations.”

Meanwhile, Valens saw her articles disappear from Vice. “My suspicion is that it was easy for a financial company to flag me as high risk as a punitive measure for my content, or my activism work,” she says. Attempts to obtain comment from Vice were unsuccessful.

Who Can Get Debanked?

“We have lots of data to show that people in the adult industry face financial discrimination in the form of their accounts being closed, being denied mortgages, business loans, and other banking services — despite banks often not being able to substantiate legal reasons related to these individual accounts,” says Maggie MacDonald, a PhD researcher at the University of Toronto.

The tension escalated in December 2020 when Visa and Mastercard cut ties with Pornhub, citing child sexual abuse material (CSAM). “Our adult content standards allow for legal adult activity created by consenting individuals or studios,” Mastercard said at the time. “Merchants must have controls to monitor, block and remove unlawful content from being posted.” Pornhub denied hosting illegal content and emphasized the harm to “the hundreds of thousands of models who rely on [their] platform for their livelihoods.”

But here’s the inconsistency that nags at people: X continues to process payments despite widespread reports of CSAM and non-consensual deepfake content. No sweeping financial freeze there.

Watching major platforms lose payment relationships makes smaller startups tread lightly. “We just can’t afford to lose our ability to do business with these financial companies,” Gina says. “Stripe takes only 2.9 percent from businesses they’re willing to work with, while high-risk processors willing to take on adult content can charge up to 15 percent.”

That difference can sink a company before it starts.
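The gap between those two fee rates is easy to underestimate until you run the numbers. A back-of-the-envelope sketch, using the 2.9 percent and 15 percent figures quoted above (the monthly revenue amount is hypothetical, chosen only for illustration):

```python
def net_after_fees(gross: float, fee_rate: float) -> float:
    """Return revenue remaining after the processor takes its percentage fee."""
    return gross * (1 - fee_rate)

# Hypothetical monthly sales for a small platform.
gross_monthly = 10_000.00

mainstream = net_after_fees(gross_monthly, 0.029)  # standard processor, 2.9%
high_risk = net_after_fees(gross_monthly, 0.15)    # high-risk processor, 15%

# The high-risk surcharge costs this business an extra $1,210 every month —
# the fee bill is more than five times larger (15 / 2.9 ≈ 5.2x).
print(f"Mainstream net: ${mainstream:,.2f}")
print(f"High-risk net:  ${high_risk:,.2f}")
print(f"Extra cost:     ${mainstream - high_risk:,.2f}")
```

On the same sales, the high-risk merchant keeps $8,500 where the mainstream one keeps $9,710 — a margin gap that compounds every month, before any of the other costs of operating flagged as "high risk."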

“Losing a relationship with card networks is a risk payment processors can’t afford, and losing relationships with payment processors is a risk that platform websites can’t afford,” explains Webber. “In the end, the responsibility of ensuring their content stays within the lines of these oftentimes unclear rules trickles down to each individual creator. Because ultimately, content creators are more expendable to platforms than payment processors and card networks.”

One justification often cited is chargebacks — when customers reverse credit card transactions. Gina isn’t convinced.

“Locking out entire industries makes less and less sense as fraud detection technology advances,” she says. “Payment processors and card networks already have processes to step in when an individual business has a high rate of chargebacks, there’s no reason to block out a whole industry.” Mastercard recently announced expanded generative AI fraud-detection tools, building on already sophisticated monitoring systems.

“We also haven’t seen the claim of high-chargebacks in adult content substantiated anywhere in terms of measured data,” adds MacDonald. “As a researcher, that makes me suspicious of the criteria these companies are using behind the scenes.”

The Evolving Landscape of Banking Regulations

In February 2025, the Free Speech Coalition filed a statement with the U.S. House Committee on Financial Services, calling for due process protections, objective risk assessments, and explicit recognition that lawful adult businesses do not inherently present financial crime risk. Blocking entire industries without individualized evaluation, the statement argued, is regulatory overreach with serious implications for free speech.

Multiple efforts are underway in the United States to limit financial institutions from denying service for reasons beyond legal violations. In August 2025, President Donald Trump issued an executive order directing regulators to investigate and reverse politically motivated debanking. Bank regulators have begun removing “reputational risk” from compliance criteria, and proposed Senate legislation would impose civil fines on banks and card networks that avoid entire categories of customers.

“Card networks and payment processors began by blocking pornography, but they’ve moved into other online industries as well,” says Webber. “The line in the sand continues to shift, and it has recently expanded to video game creators and streamers as well. We don’t know how these rules might evolve, and what type of online content might be next.”

Valens has spent months urging customers to call Mastercard, Visa, PayPal, and Stripe to question purchase restrictions and account freezes. Visa points to its policies for combating illegal activity; PayPal requires pre-approval for adult materials, similar to tobacco; Stripe states it does not support adult content.

“Private companies have been deputized to decide how we can earn and spend our money,” says MacDonald. “Anyone who is ideologically misaligned with any of these companies faces the risk of losing their livelihood.”

That’s the part that lingers.

It’s not just about porn, or games, or activism. It’s about the invisible committee that votes on your transactions — and whether one day, without warning, they decide you don’t get a vote at all.

Read More »
Taxes

Conservative Lawmakers Push Porn Taxes — Critics Call It Unconstitutional Speech Policing

The war on porn doesn’t look like a war anymore. It looks like a line item on a tax form.

As age-verification laws keep tightening their grip on the adult industry — and, quietly, on the broader idea of free speech online — a Utah lawmaker has proposed something new. Or maybe not new. Just sharper. A bill introduced last month would slap a tax on porn sites operating in the state.

Introduced by state senator Calvin Musselman, a Republican, the bill would impose a 7 percent tax on total receipts “from sales, distributions, memberships, subscriptions, performances, and content amounting to material harmful to minors that is produced, sold, filmed, generated, or otherwise based” in Utah. If passed, the bill would go into effect in May and would also require adult sites to pay a $500 annual fee to the State Tax Commission. Per the legislation, the money made from the tax will be used by Utah’s Department of Health and Human Services to provide more mental health support for teens.

Musselman did not respond to a request for comment.

There’s a certain rhythm to this moment in American politics. Conservative lawmakers across the country are circling adult content with renewed intensity. In September, Alabama became the first state to impose a porn tax on adult entertainment companies (10 percent) after passing age-verification mandates requiring users to upload ID before viewing explicit material. Pennsylvania lawmakers are weighing a bill that would add a 10 percent tax on “subscriptions to and one-time purchases from online adult content platforms,” even though digital products are already subject to a 6 percent sales tax, two state senators wrote in a memo in October. Arizona floated a similar idea back in 2019, when state senator Gail Griffin proposed taxing adult content distributors to help fund the border wall during Donald Trump’s first term. Meanwhile, 25 states have passed some form of age verification.

It’s not just about taxes. For years, efforts to criminalize or restrict sex work have ebbed and flowed, usually intensifying during moments of heightened online surveillance and censorship. But targeted taxes have struggled to gain widespread traction. Why? Because their legality is murky at best.

“This kind of porn tax is blatantly unconstitutional,” says Evelyn Douek, an associate professor of law at Stanford Law School. “It singles out a particular type of protected speech for disfavored treatment, purely because the legislature doesn’t like it—that’s exactly what the First Amendment is designed to protect against. Utah may not like porn, but as the Supreme Court affirmed only last year, adults have a fully protected right to access it.”

Utah, Alabama, and Pennsylvania are among 16 states that have adopted resolutions declaring porn a public health crisis. “We realize this is a bold assertion not everyone will agree on, but it’s the full-fledged truth,” Utah governor Gary Herbert tweeted in 2016 after signing the resolution. Utah’s long-running campaign against explicit material stretches back decades. In 2001, it became the first state to appoint an obscenity and pornography complaints ombudsman — a position colloquially known as the “porn czar.” That role was eliminated in 2003.

The industry, for its part, has been trying to keep up with the shifting rules. “Age restriction is a very complex subject that brings with it data privacy concerns and the potential for uneven and inconsistent application for different digital platforms,” Alex Kekesi, vice president of brand and community at Pornhub, said in a previous interview. In November, the company urged Google, Microsoft, and Apple to implement device-based age verification across their operating systems and app stores. “We have seen several states and countries try to impose platform-level age verification requirements, and they have all failed to adequately protect children.” To comply with new age gate mandates, Pornhub has blocked access in 23 states.

Critics argue that these policies were never truly about protecting children in the first place. In 2024, a video leaked by the Centre for Climate Reporting showed Russell Vought, a Trump ally and Project 2025 coauthor, describing age verification laws as a “back door” to a federal porn ban.

There’s a strange irony here. Platforms like OnlyFans and Pornhub helped mainstream digital sex work, bringing it out of the shadows and into subscription dashboards and creator analytics. But that visibility has made it easier to regulate, track, and now tax. As more states experiment with taxes tied specifically to sexual content, creators — not lawmakers — are likely to feel the immediate impact.

The skewed ideology of cultural conservatism that is taking shape under Trump 2.0 wants to punish sexual expression, says Mike Stabile, director of public policy at the Free Speech Coalition, a trade association for the adult industry in the US. “When we talk about free speech, we generally mean the freedom to speak, the ability to speak freely without government interference. But in this case, free also means not having to pay for the right to do so. A government tax on speech limits that right to those who can afford it.”

OnlyFans states that it complies with tax requirements in the jurisdictions where it operates, and creators are responsible for managing their own tax affairs. Pornhub, which is currently blocked in Utah and Alabama, did not respond to a request for comment.

Douek notes that while the Supreme Court recently upheld age-verification laws in Texas — allowing states to regulate minors’ access to explicit material — “a porn tax does nothing to limit minors’ access to this speech—it simply makes it more expensive to provide this content to adults.” A 2022 report from Common Sense Media found that 73 percent of teens aged 13 to 17 have watched adult content online. Young people often encounter explicit material on social media platforms such as X and Snap, sometimes intentionally, often accidentally. A survey last year from the UK’s Office of the Children’s Commissioner reported that 59 percent of minors are exposed to porn unintentionally, primarily via social media, up from 38 percent the year before.

In Alabama, as in Utah’s proposal, tax revenue is earmarked for behavioral health services, including prevention, treatment, and recovery programs for young people.

Alabama state representative Ben Robbins, the bill’s Republican sponsor, said in an interview last year that adult content was “a driver in causing mental health issues” in the state. It’s an argument that surfaces again and again in debates about a nationwide porn ban. Some research suggests adolescent exposure may correlate with depression, lower self-esteem, or normalization of violence, but health professionals have not reached consensus.

With lawmakers reframing the conversation around underage harm, Stabile argues that the principle at stake is bigger than porn itself. Content-specific taxes on speech, he notes, have repeatedly been struck down as unconstitutional censorship.

“What if a state decided that Covid misinformation was straining state health resources and taxed newsletters who promoted it? What if the federal government decided to require a costly license to start a podcast? What if a state decided to tax a certain newspaper it didn’t like?” he says. “Porn isn’t some magical category of speech separate from movies, streaming services, or other forms of entertainment. Adult businesses already pay taxes on the income they earn, just as every other business does. Taxing them because of imagined harms is not only dangerous to our industry, it sets a dangerous precedent for government power.”

And that’s the quiet part that lingers.

When governments start deciding which kinds of speech deserve a surcharge, the debate stops being about porn. It becomes about who gets to speak freely — and who has to pay extra for the privilege.

Read More »
Ofcom logo

Motherless.com Parent Hit With Major Ofcom Fine for AV Noncompliance

It’s the kind of enforcement story that lands with a thud, not a gasp. No surprise raid. No dramatic shutdown. Just a regulator, a spreadsheet, and a very large number at the bottom of the page.

Kick Online Entertainment S.A., the parent company of the controversial adult tube site Motherless.com, has been fined by the U.K.’s digital regulator Ofcom for failing to comply with age-verification requirements under the country’s Online Safety Act. The penalty tops £800,000 — just shy of $1.1 million — and it’s not the only fine involved.

According to a statement issued Tuesday, Ofcom said Kick Online repeatedly failed to meet age-verification obligations despite multiple attempts by the regulator to engage with the company. Motherless.com had previously been flagged several times as noncompliant, and those warnings, it seems, went nowhere.

On top of that, Kick Online is accused of ignoring formal information requests from the regulator, triggering an additional £30,000 fine — roughly $40,876. It’s the regulatory equivalent of getting ticketed for speeding and then fined again for refusing to pull over.

“Having highly effective age checks on adult sites to protect children from pornographic content is non-negotiable. Any company that fails to meet this duty—or engage with us—can expect to face robust enforcement action, including significant fines,” said Suzanne Cater, Ofcom’s director of enforcement.

“We continue to investigate other sites under the U.K.’s age check rules and will take further action where necessary,” Cater added. Ofcom also noted that while Kick Online has made attempts to implement age checks, those measures fall short of the “highly effective” standard required under the Online Safety Act. Close doesn’t count anymore.

A significant portion of the penalties relates to noncompliance spanning July through December 2025 — months that now read less like a timeline and more like a paper trail. And as enforcement ramps up, one thing feels increasingly clear: regulators aren’t asking politely anymore.

Read More »