Business Attacks

U.S. OCC Releases Preliminary Report on Debanking

OCC Debanking Report

Some mornings, the news hits you like a jolt of cold water — shocking at first, then oddly clarifying. That’s how it felt when the U.S. Office of the Comptroller of the Currency (OCC) dropped a preliminary report on debanking, finally calling out what so many in the adult industry have been living with for years. It’s strange to feel victorious over a problem you never should’ve had in the first place, but here we are, holding something that looks a lot like validation.

The OCC names nine major banks — the kind everyone’s parents told them were “safe,” including Chase, Wells Fargo, and Bank of America — for potentially violating the President’s Executive Order against discriminating against people engaged in “lawful business activities.” Reading it, I had one of those moments where you want to underline every other sentence because someone, somewhere in government, actually said the quiet part out loud. The OCC states that the adult industry, among others, was subjected to “inappropriate distinctions” when trying to access basic financial services:

“The OCC’s preliminary findings show that, between 2020 and 2023, these nine banks made inappropriate distinctions among customers in the provision of financial services on the basis of their lawful business activities by maintaining policies restricting access to banking services… For example, the OCC identified instances where at least one bank imposed restrictions on certain industry sectors because they engaged in ‘activities that, while not illegal, are contrary to [the bank’s] values.’ Sectors subjected to restricted access included oil and gas exploration, coal mining, firearms, private prisons, tobacco and e-cigarette manufacturers, adult entertainment, and digital assets.”

Seeing adult entertainment listed there — not as a punchline, not as an afterthought, but as a recognized target of discrimination — is surreal. It’s proof that the federal government isn’t just aware of the problem; it’s saying, pretty plainly, that the problem matters. That we matter. And for once, the burden shifts off the people running these businesses and onto the banks that have quietly punished them under the guise of “values.”

This marks the first time in a long time that the adult industry isn’t shouting into the void. The OCC has confirmed that we’re covered under the Executive Order. Banks now know that the old playbook — the one where they shut down accounts for “reputational risk” and shrug — might actually put them on the wrong side of federal policy.

There’s still a road ahead, of course. In the coming weeks and months, the OCC will move into the rule-making phase, and that’s where the shape of all this becomes real. We’ll learn more as they flesh out the details, and so will everyone who’s been denied a basic checking account simply for doing legal work that made some executive squeamish. But for the first time in years, there’s a crack of daylight. Maybe — just maybe — we’re watching the beginning of the end of a discrimination problem that never should’ve existed in the first place.

And honestly? It’s about time the people holding the money had to explain themselves.

Read More »

Denver Says Strippers Are Workers — and Must Be Treated That Way

Stripper

Sometimes a ruling hits the news and you can almost feel the collective exhale from people who’ve been waiting far too long to be treated like… well, workers. That’s what happened in Denver, where a district judge decided that strip club entertainers — dancers, performers, the people who carry the whole night on their shoulders — are employees. Real employees. Which means they’re covered by the city’s labor protections and the standard minimum hourly wage that everyone else in the consolidated city and county is supposed to receive.

The decision backs up what a Denver Auditor’s Office hearing officer had already determined while digging into wage theft claims at several strip clubs around town — including big names like the local Rick’s Cabaret branch, PT’s Showclub, PT’s Showclub Centerfold and the Diamond Cabaret. You know the places: neon lights, velvet corners, the kind of spots that thrive on performance yet somehow never wanted to acknowledge the performers as actual staff.

Attorneys for the clubs kept insisting that entertainers were exempt from Denver’s employment-law umbrella, as if the entire job existed in some hazy loophole. But the ruling cut through that fog. It confirmed what the Auditor’s Office has been saying for years: strippers and erotic performers deserve the same wage protections as anyone else clocking in across the city.

“Our office enforces wage theft laws for all industries and protects anyone performing work in Denver. Adult entertainment workers are no different, and we are pleased the courts agree,” said Timothy M. O’Brien, the elected city auditor who oversees Denver Labor. And honestly, hearing that felt like a long-overdue moment of someone naming the obvious.

Denver Labor, by the way, does its job through law enforcement, education and certified audits — the kind of behind-the-scenes work most people don’t think about until it hits the headlines or their paycheck.

“Entertainers are workers and, therefore, are entitled to the fundamental protections of Denver’s wage ordinances,” reads an informational page from O’Brien’s office.

Read More »

Strike 3 Fires Back, Rejecting Meta’s “Personal Use” Claim in AI Battle

Meta logo

There’s a strange kind of déjà vu when a major tech company gets accused of dipping into the piracy pool — that uneasy feeling like we’ve been here before, even though each case somehow feels bigger than the last. That’s the energy swirling around the latest clash between Vixen Media Group owner Strike 3 Holdings and Meta, after Meta asked a federal judge in Northern California to toss Strike 3’s lawsuit.

Strike 3 didn’t pull any punches in its original complaint. They say Meta didn’t just accidentally stumble across their movies online — they accuse the company of deliberately scooping up VMG content from pirate sites to help train its AI systems. And they’re specific about it. BitTorrent, they say, is where those downloads came from. It’s the kind of accusation that makes you imagine a massive tech company hunched over a torrent client, pretending to be invisible.

Meta’s response in October? A very different story — one that almost sounds like a shrug. They told the court that if videos were downloaded, the timing and the number of those downloads “point solely to personal use, not AI training.” And then came the eyebrow-raising line:

“Meta contractors, visitors, or employees may have used Meta’s internet access over the years to download videos for personal use.”

In other words: Hey, people do stuff on our Wi-Fi. Not our problem.

Meta insisted the download activity — the volume, the pattern — just didn’t look like anything tied to large-scale AI development and “is plainly indicative of private personal use.”

They also brushed aside Strike 3’s attempt to link non-Meta IP addresses to the company, calling it a stretch.

Strike 3 fired back this week, and you can almost hear the disbelief between the lines. Their response to the “employees were just downloading porn” defense was blunt:

“Meta’s excuse that employees must be infringing Plaintiffs’ copyrights for ‘personal use’ does not fit the facts.”

What they say does fit the facts?

Something far more coordinated — something algorithmic. They point to what they call “unique patterns of Meta’s piracy,” arguing those patterns don’t look like a person casually searching for adult videos. Instead, Strike 3 claims the behavior “not only indicate[s] the use of a universal ‘script’ (likely developed, supplied, or encouraged by Meta), but also show[s] that the BitTorrent downloads are ‘for AI training data and not for personal use.’”

They bolster that argument with another recent copyright case: Kadrey v. Meta Platforms, Inc.

In that suit, Meta “admitted to using BitTorrent to obtain digital copies” of books for AI training — and Strike 3 alleges that the same IP addresses used in the book-related downloads also appear in their own infringement logs. According to them, Meta even acknowledged in that case that it masked its corporate IPs to avoid detection.

And then comes one of the more striking accusations (no pun intended):

“Plaintiffs were able to correlate Meta’s infringement through its Corporate IPs to six hidden data centers that were used to engage in massive infringement of both Plaintiffs’ and the Kadrey plaintiffs’ works as well. The scale of these infringements is staggering and are ‘beyond what a human could consume.’”

Strike 3 says activity from those masked IP addresses mirrors the activity from Meta’s known corporate IP blocks — a kind of digital fingerprinting that, in their telling, points to one conclusion: these downloads were never about personal viewing habits. “Those ends,” they write, “were to train Meta’s AI and not for the personal use of its employees.”

And if the court believes that interpretation?

Meta wouldn’t just be facing another infringement spat. The numbers involved are enormous — the complaint lists at least 2,396 allegedly infringed movies. Multiply that by the maximum statutory damages of $150,000 per work, and you land at a number that barely feels real: $359.4 million.
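As a back-of-the-envelope check, that headline figure is just the per-work statutory maximum scaled by the number of works listed in the complaint — a minimal sketch, using only the two numbers the filing provides:

```python
# Illustrative statutory-damages ceiling for the Strike 3 complaint.
# Inputs come from the filing: at least 2,396 allegedly infringed works,
# and the Copyright Act's $150,000-per-work maximum for willful infringement.
works = 2_396
max_per_work = 150_000  # dollars, statutory maximum per infringed work

max_exposure = works * max_per_work
print(f"${max_exposure:,}")  # $359,400,000
```

Since the complaint says “at least” 2,396 works, this is a floor on the theoretical maximum, not a prediction of any actual award.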

But the money, oddly enough, may not be the part that echoes the loudest. This lawsuit drops the adult industry directly into the center of the broader fight over whether training an AI model on copyrighted material counts as “fair use” — or whether it’s just a slicker version of old-school piracy wearing a futuristic badge.

No one really knows yet how courts will treat AI training that leans on protected works. Everyone’s waiting for the first big ruling — the one that will set the tone for all the cases piling up behind it. And maybe that’s the real story here: not just whether Meta downloaded some videos, but whether we’re watching the early days of a legal shift that’s going to redraw the boundaries of ownership, creativity, and machine learning.

Because if AI is the future, then this messy, uncomfortable question is coming along for the ride whether Silicon Valley likes it or not.

Read More »

Discord Confirms Data Breach Exposed Government ID Photos of 70,000 Users

Discord logo

SAN FRANCISCO — Discord, the popular chat and community platform, confirmed that one of its third-party vendors experienced a major data breach that exposed the personal information of about 70,000 users, including photos of government-issued identification cards.

The affected vendor was responsible for processing age-verification submissions and appeals on behalf of Discord. The company has not yet named the vendor but indicated that the breach was the result of a cyberattack exploiting a Zendesk instance, allegedly part of an extortion attempt targeting both the vendor and Discord.

Early reports suggested that roughly 1.5 terabytes of data were stolen—around 2.2 million images tied to age-verification records. However, Discord said the actual scope was smaller than initially claimed.

“This was not a breach of our internal systems,” a Discord spokesperson told The Verge. “The attack targeted a third-party service we use to support our customer service operations. Approximately 70,000 users may have had government-ID photos exposed, which the vendor used for age-related appeal reviews.”

The company added that all affected users have been notified. “We’ve secured the affected systems, ended our relationship with the compromised vendor, and continue to cooperate with law enforcement, data protection authorities, and external security experts,” the spokesperson said. “We take our responsibility to protect user data seriously and understand the concern this may cause.”

Discord also disclosed that other personal details—including names, usernames, email addresses, IP addresses, and the last four digits of some users’ credit cards—were included in the compromised data.

While Discord remains best known for its role in gaming culture and online communities, it has also become a hub for artists, streamers, and adult creators who use the platform to interact with fans and build digital communities. The service allows users over 18 to share adult-oriented material within designated, age-restricted spaces.

Read More »

Strike 3 Holdings Sues Meta Over Alleged Use of Porn Content in AI Training

Strike 3 Holdings, a company that describes its films as “high quality,” “feminist,” and “ethical” adult videos, has filed a lawsuit against Meta in federal court in California, accusing the tech giant of infringing its copyrights by using Strike 3’s content to train artificial intelligence models. The complaint, filed in July, claims Meta has been torrenting and seeding the company’s videos since 2018. Supporting exhibits and details were unsealed last week.

According to the lawsuit, Meta sought Strike 3’s content because it offered angles and extended uninterrupted scenes that are “rare in mainstream movies and TV,” allegedly giving Meta an edge in developing what CEO Mark Zuckerberg calls AI “superintelligence.”

“They have an interest in getting our content because it can give them a competitive advantage for the quality, fluidity, and humanity of the AI,” said Christian Waugh, an attorney for Strike 3.

The filing alleges Meta BitTorrented and distributed 2,396 of Strike 3’s copyrighted videos, making them accessible to minors since the BitTorrent protocol does not include age verification. The complaint further asserts that Meta used the adult videos “for distribution as currency to support its downloading of a vast array of other content necessary to train its AI models.”

The exhibits list not only Strike 3 titles but also mainstream television shows such as Yellowstone, Modern Family, The Bachelor, South Park, and Downton Abbey. They also include pornographic videos produced by others that appear to feature very young actors, with titles such as ExploitedTeens, Anal Teens, and EuroTeenErotica. In addition, the list contains files related to weapons (3D Gun Print, Gun Digest Shooter’s Guide to the AR-15) and political material (Antifa’s Radical Plan and Intellectual Property Rights in Cyberspace).

Using adult content as training data is “a public relations disaster waiting to happen,” said Matthew Sag, a professor of law specializing in artificial intelligence at Emory University. “Imagine a middle school student asks a Meta AI model for a video about pizza delivery, and before you know it, it’s porn.”

Strike 3 says it identified the alleged violations through infringement-detection systems it operates and traced activity to 47 Meta-affiliated IP addresses. The company is seeking $350 million in statutory damages.

Christopher Sgro, a Meta spokesperson, said: “We’re reviewing the complaint, but we don’t believe Strike’s claims are accurate.”

The lawsuit draws attention to Meta’s V-JEPA 2 “world model,” released in June, which the company says was trained on 1 million hours of “internet video,” a term the complaint highlights as vague. Zuckerberg has described Meta’s goal as putting “the power of superintelligence into people’s hands to direct it toward what they value in their own lives.”

According to the complaint, Meta executives deliberately approved the use of pirated material, with Zuckerberg’s sign-off. Nearly every major AI company faces similar copyright suits.

“The case being presented against Meta is perhaps the case of the century because of the sheer scope of infringement,” Waugh said, adding that the unsealed exhibits represent only “a thin slice of the pie.”

AI companies often defend themselves by claiming that their technologies are “transformative” and thus protected under fair use. President Donald Trump voiced support for this view in July, saying: “You can’t be expected to have a successful AI program when every single article, book, or anything else that you’ve read or studied you’re supposed to pay for.”

In June, U.S. District Court Judge Vince Chhabria ruled that Meta did not break the law in training its AI models on the works of 13 authors in a separate case, Kadrey v. Meta. However, he clarified that the decision “stands only for the proposition that these plaintiffs made the wrong arguments and failed to develop a record in support of the right one.”

That leaves the door open for Strike 3 to mount a stronger case. “The best version of their argument is: This is a fundamental problem because, by going to these pirate websites, you are undermining the market for access,” Sag explained.

Waugh argued that the dispute underscores a broader issue. “It doesn’t matter if it’s a four-sentence poem or adult entertainment. There is no appetite in this country for what AI companies appear to be doing, which is making money off the backs of rights holders who never gave permission for it.”

Read More »

Bad Orange Deplatformed by YouTube, Loses Bank Account Access

YouTube logo

SHERIDAN, Wyo. — Bad Orange, a creator of sensual audio stories designed for women, announced that its YouTube channel — which had close to 200,000 followers — was recently removed. In addition, the company’s bank account was closed. Bad Orange is pushing back against these moves, calling them both unfair and hypocritical.

“It is confusing, to put it politely, why we are being subjected to these arbitrary standards in this day and age,” said Larry, a classically trained actor who performs under the pseudonym Daddy Sounds.

He pointed to the broader media landscape to highlight what he views as inconsistent standards. “At a moment when the average issue of Slate contains advice letters about extreme kink and The Daily Beast runs ads for Lovehoney’s Advent calendars, which they call ‘an erotic journey of passion, play, and connection,’ complete with sex toys, it’s rather puzzling why our content is viewed as intolerable,” Larry said. “It’s as if they hate women or something.”

Larry compared Bad Orange’s situation to that of the gaming industry, where adult-oriented game creators have seen thousands of accounts shut down by payment processors, devastating their businesses.

“It’s not right. It’s hypocritical, and we’re going to take action,” he added.

Although Bad Orange managed to establish an alternative method for processing payments, the company described the transition as a time-consuming and difficult administrative process.

Sensual audio content has been gaining popularity among women seeking erotic experiences outside the realm of mainstream adult video.

“Voice content is far more intimate and engaging than video,” Larry explained. “Like radio, it stimulates the listener’s imagination and visualization abilities, which is usually vastly more interesting than anything a camera can record.”

For more details, visit the Bad Orange website.

Read More »

Visa Just Made Chargebacks Twice as Dangerous for OnlyFans Creators

Visa Logo

Read More »

Twitch Introduces Age Verification for UK Users

LOS ANGELES — Twitch has begun verifying the ages of users logging in from United Kingdom IP addresses, according to a Tuesday report from Dexerto.

The streaming platform is the latest to implement strict age verification procedures following the rollout of the Online Safety Act. Under the new rules, U.K. regulator Ofcom requires digital platforms to ensure users either pass a facial scan or submit personal information before gaining access.

“Twitch and k-ID (a third-party vendor we partner with to verify your age) do not store your face scan video selfies,” the company explained. “The video selfie used for facial age estimation is analyzed entirely on your device and will never leave it.”

The move has already drawn backlash. Some Twitch users described the measure as “dystopian,” while others suggested they might stop using the platform altogether.

Adult content creators have long used Twitch to expand their mainstream reach. One prominent example is Amouranth, who streams on both Twitch and Kick and is also an award-winning adult content creator.

Read More »

Colombian Court Rules in Favor of Esperanza Gómez in Instagram Suspension Case

Esperanza Gomez

BOGOTÁ, Colombia — Colombia’s Constitutional Court has ruled in favor of adult performer Esperanza Gómez in her dispute with Meta over repeated suspensions of her Instagram account.

The court determined that Meta failed to apply its standards equally, noting that Gómez’s profile had not been treated the same as others with similar content.

“If social media platforms use offline activities as criteria for content moderation, they must clearly state these criteria in their community standards,” the ruling stated. “Due process must also be allowed to reasonably challenge the social media platform’s decision.”

Gómez first filed her case in 2022, arguing that Instagram’s repeated deactivations cost her millions of followers and harmed her “right to work.”

In the decision, authored by Judge Natalia Ángel Cabo, the court ordered Meta to take three corrective measures: establish a visible electronic channel for judicial notifications in Colombia, ensure moderation policies are available in Spanish on a unified website, and revise Instagram’s terms of use and privacy policy so users have clear avenues to contest moderation actions.

Following the ruling, Gómez celebrated the outcome on X. “I continued without listening to the people who told me that I would never win a lawsuit against a giant, and today we are triumphing,” she wrote. “We must know how to defend our rights when they are violated.”

Read More »

Bluesky Implements Age Checks for Users in South Dakota and Wyoming

Bluesky logo

Bluesky confirmed Wednesday that it has begun implementing age verification for users in Wyoming and South Dakota, citing compliance with newly enacted state-level statutes on age checks and social media safety.

“Bluesky users in South Dakota and Wyoming can choose from multiple methods to verify their age. We believe this approach currently strikes the right balance,” the company said in a statement. “Bluesky will remain available to users in these states, and we will not need to restrict the app for everyone.”

The platform added that it intends to keep its community informed as it adapts to shifting regulatory requirements.

To manage compliance, Bluesky is utilizing Kids Web Services (KWS), an age assurance system developed by Epic Games, the company behind Fortnite.

The move follows last month’s announcement that users in Mississippi would be blocked from accessing Bluesky altogether due to legal challenges surrounding the state’s social media age verification law, which is currently the subject of federal litigation.

Bluesky has become a favored platform among adult creators and studios, in part because of its relatively permissive approach to nudity and explicit content.

Read More »