Business Attacks

GitHub Purges Adult Game Developers, Offers No Explanation


Something strange started rippling through a small, niche corner of the internet not long ago. Developers who build mods and plugins for hentai games and even interactive sex toys began waking up to missing repositories, locked accounts, and dead links. GitHub, the place many of them had treated like home base, had quietly pulled the rug out. No warning. No explanation. Just… gone.

From conversations within the community, the rough headcount quickly took shape: somewhere between 80 and 90 repositories, representing the work of roughly 40 to 50 people, vanished in a short window. Many of the takedowns seemed to cluster around late November and early December. A large number of the affected accounts belonged to modders working on games from Illusion, a now-defunct Japanese studio known for titles that mixed gameplay with varying degrees of erotic content. One banned account alone reportedly hosted contributions from more than 30 people across 40-plus repositories, according to members of the modding scene.

What made the situation feel especially surreal was the silence. Most suspended developers say they were never told which rule they’d broken—if any. Their accounts simply stopped working. Several insisted they’d been careful to stay within GitHub’s acceptable use guidelines, avoiding anything overtly explicit. The code was functional, technical, sometimes cheeky in naming, but never pornographic. At least, not in the way most people would define it.

“Amongst my repositories there were no explicitly sexual names or images anywhere in the code or the readme, the most suggestive naming would be on the level of referencing the dick as ‘the men thing’ or referencing the sex as ‘huffing puffing,’” one developer, Danil Zverev, told me. He makes plugins for an Illusion game called Koikatsu. Zverev said he’s been using GitHub for this purpose since 2024, but on November 18, his GitHub page was “completely deleted,” he said. “No notifications anywhere, simply a 404 error when accessing the page and inability to log in on the web or in the mobile app. Also it does not allow me to register a new account with the same name or email.”

The timing raised eyebrows. GitHub had updated its acceptable use policies in October 2025, adding language that forbids “sexually themed or suggestive content that serves little or no purpose other than to solicit an erotic or shocking response, particularly where that content is amplified by its placement in profiles or other social contexts.” The policy explicitly bars pornographic material and “graphic depictions of sexual acts including photographs, video, animation, drawings, computer-generated images, or text-based content.”

At the same time, the policy leaves room for interpretation. “We recognize that not all nudity or content related to sexuality is obscene. We may allow visual and/or textual depictions in artistic, educational, historical or journalistic contexts, or as it relates to victim advocacy,” GitHub’s terms of use state. “In some cases a disclaimer can help communicate the context of the project. However, please understand that we may choose to limit the content by giving users the option to opt in before viewing.”


Zverev didn’t bother appealing. He said writing to support felt pointless and chose instead to move on to another platform. Others tried to fight it—and found themselves stuck in limbo.

A developer who goes by VerDevin, known for Blender modding guides, utility tools, and plugins for the game Custom Order Maid 3D2, said users began reporting trouble accessing his repositories in late October. Oddly, he could still see his account when logged in, but not when browsing while logged out.

“Turned out, as you already know, that my account was ‘signaled’ and I had to purposefully go to the report section of Github to learn about it. I never received any notifications, by mail or otherwise,” VerDevin told me. “At that point I sent a ticket asking politely for clarifications and the proceedings for reinstatement.”

The response from GitHub Trust & Safety was vague and procedural: “If you agree to abide by our Terms of Service going forward, please reply to this email and provide us more information on how you hope to use GitHub in the future. At that time we will continue our review of your request for reinstatement.”

VerDevin replied the next day, agreeing to comply and offering to remove whatever GitHub considered inappropriate—despite still not knowing what that was. “I did not take actual steps toward it as at that point I still didn’t know what was reproach of me,” they said.

A full month passed before GitHub followed up. “Your account was actioned due to violation of the following prohibition found in our Acceptable Use Policies: Specifically, the content or activity that was reported included multiple sexually explicit content in repositories, which we found to be in violation of our Acceptable Use Policies,” GitHub wrote.

“At that point I took down several repositories that might qualify as an attempt to show good faith (like a plugin named COM3D2.Interlewd),” they said. GitHub restored the account on December 17—weeks later, and just one day after additional questions were raised about the ban—but never clarified which content had triggered the action in the first place.

Requests for explanation went unanswered. Even when specific banned accounts were flagged to GitHub’s press team, the response was inconsistent. Some accounts were reinstated. Others weren’t. No clear reasoning was ever shared.

The whole episode highlights a problem that feels painfully familiar to anyone who’s worked on the edges of platform rules: adult content policies that are vague, inconsistently enforced, and devastating when applied without warning. These repositories weren’t fringe curiosities—they were tools used by potentially hundreds of thousands of people. The English-speaking Koikatsu modding Discord alone has more than 350,000 members. Another developer, Sauceke, whose account was suspended without explanation in mid-November, said users of his open-source adult toy mods are now running into broken links or missing files.

“Perhaps most frustratingly, all of the tickets, pull requests, past release builds and changelogs are gone, because those things are not part of Git (the version control system),” Sauceke told me. “So even if someone had the foresight to make mirrors before the ban (as I did), those mirrors would only keep up with the code changes, not these ‘extra’ things that are pretty much vital to our work.”
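Sauceke’s point is worth unpacking: a standard `git clone --mirror` captures only the Git data itself (commits, branches, tags), while issues, pull requests, and release assets live in GitHub’s database and must be exported separately through GitHub’s REST API. As a minimal sketch (with hypothetical owner and repository names), these are the extra endpoints a backup script would need to hit:

```python
# GitHub data that a plain `git clone --mirror` does NOT capture has to be
# pulled from the REST API. Owner/repo names below are hypothetical.

API = "https://api.github.com"

def backup_endpoints(owner: str, repo: str) -> dict:
    """Return the REST API URLs needed to export non-Git repository data."""
    base = f"{API}/repos/{owner}/{repo}"
    return {
        "issues":        f"{base}/issues?state=all",  # tickets, incl. closed ones
        "pull_requests": f"{base}/pulls?state=all",   # PR history and discussion
        "releases":      f"{base}/releases",          # changelogs + asset URLs
    }

urls = backup_endpoints("example-owner", "example-mod")
print(urls["releases"])
```

A mirror made before a ban would keep the code history intact, but without periodically exporting these endpoints too, the “extra” things Sauceke describes are lost the moment the account disappears.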

GitHub eventually reinstated Sauceke’s account on a Tuesday—seven weeks after the original suspension—following renewed questions about the bans. Support sent a brief note: “Thank you for the information you have provided. Sorry for the time taken to get back to you. We really do appreciate your patience. Sometimes our abuse detecting systems highlight accounts that need to be manually reviewed. We’ve cleared the restrictions from your account, so you have full access to GitHub again.”

Even so, the damage lingers. In Sauceke’s account and others, including the IllusionMods repository, release files remain hidden. “This makes the releases both inaccessible to users and impossible to migrate to other sites without some tedious work,” Sauceke said.

Accounts may come back. Repositories might be restored. But for many developers, the trust is already gone—and that’s the kind of thing that doesn’t reinstall quite so easily.

GitHub isn’t just another code host—it’s the town square for open-source developers. For adult creators especially, who are used to being quietly shoved to the margins everywhere else, visibility there actually matters. It’s how people find each other, trade ideas, and build something that feels bigger than a solo side project. “It’s the best place to build a community, to find like-minded people who dig your stuff and want to collaborate,” Sauceke said. But if this wave of bans stretches beyond hentai game and toy modders, they warned, it could trigger a slow exodus. Some developers aren’t waiting around to find out, already packing up their repositories and moving them to GitGoon, a platform built specifically for adult developers, or Codeberg, a nonprofit, Berlin-based alternative that runs on a similar model.


Nevada’s Legal Sex Workers Claim They’re Being Muted on X


It didn’t happen slowly. It wasn’t subtle. Within the past month, legal Nevada sex workers have been hit with a sudden, sweeping wave of account suspensions on X, the platform once known as Twitter. Not for doing anything illegal. Not for soliciting crimes. These are licensed workers, operating in the only state where brothel-based sex work is legal — and yet their voices are vanishing from a platform that once wrapped itself in the language of free speech.

That promise came straight from Elon Musk himself when he set his sights on buying Twitter. At the time, he framed the platform as something almost sacred, saying:

“Free speech is the bedrock of a functioning democracy, and Twitter is the digital town square.”

He followed that with another line meant to reassure skeptics:

“By ‘free speech’ I simply mean that which matches the law.”

By that definition, Nevada sex work clearly qualifies.

Prostitution is legal in Nevada when it takes place inside licensed brothels, as outlined in Nevada Revised Statutes 201.354. Counties are empowered to license and regulate these brothels under Nevada Revised Statutes 244.345, while workers comply with additional health and safety standards set by the state. At present, six counties operate legal brothels. This isn’t a loophole or a gray area — it’s a fully regulated, lawful industry.

At first glance, it might look like X is simply enforcing broad rules around adult content. But the reality cuts deeper. When legal Nevada sex workers lose their accounts, they’re erased from public conversation — conversation that increasingly lives and breathes on platforms like X. What’s left behind is a so-called “digital town square” where only certain voices are allowed to stay standing.

Nevada sex workers understand exactly what’s at stake when they’re shut out. Not long ago, anti-brothel groups attempted to dismantle the legal system through ballot initiatives. When voters heard directly from sex workers, those efforts failed — decisively. In the 2018 Lyon County referendum, for example, nearly 80 percent of voters rejected a proposed brothel ban.

That wasn’t an accident. When sex workers are able to speak publicly, explain how the licensed system actually functions, and share their lived experiences, people listen. Voters learn about the safeguards, the structure, and why legal brothels exist in the first place — not from headlines or fear campaigns, but from the people inside the system.

Silencing those voices on X means the public hears less from those with firsthand knowledge. Anti-sex-work narratives remain visible, amplified, and largely unchallenged. The workers most affected by stigma and policy decisions fade into the background.

This isn’t just about clumsy algorithms sweeping up adult content. It’s about who gets to participate in conversations that can shape laws, livelihoods, and lives. Platforms don’t just host debate — they quietly curate it by deciding who stays and who disappears.

When licensed Nevada sex workers are removed from social media, the public square stops reflecting reality. The debate tilts. The story becomes one-sided. And the people whose livelihoods are on the line — most of them women — lose the chance to speak for themselves.

Maybe that’s the most unsettling part. If this can happen to a group operating legally, transparently, and within the law, it raises an uncomfortable question: who’s next when an algorithm decides a voice is inconvenient?


U.S. OCC Releases Preliminary Report on Debanking


Some mornings, the news hits you like a jolt of cold water — shocking at first, then oddly clarifying. That’s how it felt when the U.S. Office of the Comptroller of the Currency (OCC) dropped a preliminary report on debanking, finally calling out what so many in the adult industry have been living with for years. It’s strange to feel victorious over a problem you never should’ve had in the first place, but here we are, holding something that looks a lot like validation.

The OCC names nine major banks — the kind everyone’s parents told them were “safe,” including Chase, Wells Fargo, and Bank of America — for potentially violating the President’s Executive Order against discriminating against people engaged in “lawful business activities.” Reading it, I had one of those moments where you want to underline every other sentence because someone, somewhere in government, actually said the quiet part out loud. The OCC states that the adult industry, among others, was subjected to “inappropriate distinctions” when trying to access basic financial services:

“The OCC’s preliminary findings show that, between 2020 and 2023, these nine banks made inappropriate distinctions among customers in the provision of financial services on the basis of their lawful business activities by maintaining policies restricting access to banking services… For example, the OCC identified instances where at least one bank imposed restrictions on certain industry sectors because they engaged in ‘activities that, while not illegal, are contrary to [the bank’s] values.’ Sectors subjected to restricted access included oil and gas exploration, coal mining, firearms, private prisons, tobacco and e-cigarette manufacturers, adult entertainment, and digital assets.”

Seeing adult entertainment listed there — not as a punchline, not as an afterthought, but as a recognized target of discrimination — is surreal. It’s proof that the federal government isn’t just aware of the problem; it’s saying, pretty plainly, that the problem matters. That we matter. And for once, the burden shifts off the people running these businesses and onto the banks that have quietly punished them under the guise of “values.”

This marks the first time in a long time that the adult industry isn’t shouting into the void. The OCC has confirmed that we’re covered under the Executive Order. Banks now know that the old playbook — the one where they shut down accounts for “reputational risk” and shrug — might actually put them on the wrong side of federal policy.

There’s still a road ahead, of course. In the coming weeks and months, the OCC will move into the rule-making phase, and that’s where the shape of all this becomes real. We’ll learn more as they flesh out the details, and so will everyone who’s been denied a basic checking account simply for doing legal work that made some executive squeamish. But for the first time in years, there’s a crack of daylight. Maybe — just maybe — we’re watching the beginning of the end of a discrimination problem that never should’ve existed in the first place.

And honestly? It’s about time the people holding the money had to explain themselves.


Denver Says Strippers Are Workers — and Must Be Treated That Way


Sometimes a ruling hits the news and you can almost feel the collective exhale from people who’ve been waiting far too long to be treated like… well, workers. That’s what happened in Denver, where a district judge decided that strip club entertainers — dancers, performers, the people who carry the whole night on their shoulders — are employees. Real employees. Which means they’re covered by the city’s labor protections and the standard minimum hourly wage that everyone else in the consolidated city and county is supposed to receive.

The decision backs up what a Denver Auditor’s Office hearing officer had already determined while digging into wage theft claims at several strip clubs around town — including big names like the local Rick’s Cabaret branch, PT’s Showclub, PT’s Showclub Centerfold and the Diamond Cabaret. You know the places: neon lights, velvet corners, the kind of spots that thrive on performance yet somehow never wanted to acknowledge the performers as actual staff.

Attorneys for the clubs kept insisting that entertainers were exempt from Denver’s employment-law umbrella, as if the entire job existed in some hazy loophole. But the ruling cut through that fog. It confirmed what the Auditor’s Office has been saying for years: strippers and erotic performers deserve the same wage protections as anyone else clocking in across the city.

“Our office enforces wage theft laws for all industries and protects anyone performing work in Denver. Adult entertainment workers are no different, and we are pleased the courts agree,” said Timothy M. O’Brien, the elected city auditor who oversees Denver Labor. And honestly, hearing that felt like a long-overdue moment of someone naming the obvious.

Denver Labor, by the way, does its job through law enforcement, education and certified audits — the kind of behind-the-scenes work most people don’t think about until it hits the headlines or their paycheck.

“Entertainers are workers and, therefore, are entitled to the fundamental protections of Denver’s wage ordinances,” reads an informational page from O’Brien’s office.


Strike 3 Fires Back, Dismissing Meta’s “Personal Use” Claim in AI Battle


There’s a strange kind of déjà vu when a major tech company gets accused of dipping into the piracy pool — that uneasy feeling like we’ve been here before, even though each case somehow feels bigger than the last. That’s the energy swirling around the latest clash between Vixen Media Group owner Strike 3 Holdings and Meta, after Meta asked a federal judge in Northern California to toss their lawsuit.

Strike 3 didn’t pull any punches in its original complaint. They say Meta didn’t just accidentally stumble across their movies online — they accuse the company of deliberately scooping up VMG content from pirate sites to help train its AI systems. And they’re specific about it. BitTorrent, they say, is where those downloads came from. It’s the kind of accusation that makes you imagine a massive tech company hunched over a torrent client, pretending to be invisible.

Meta’s response in October? A very different story — one that almost sounds like a shrug. They told the court that if videos were downloaded, the timing and the number of those downloads “point solely to personal use, not AI training.” And then came the eyebrow-raising line:

“Meta contractors, visitors, or employees may have used Meta’s internet access over the years to download videos for personal use.”

In other words: Hey, people do stuff on our Wi-Fi. Not our problem.

Meta insisted the download activity — the volume, the pattern — just didn’t look like anything tied to large-scale AI development and “is plainly indicative of private personal use.”

They also brushed aside Strike 3’s attempt to link non-Meta IP addresses to the company, calling it a stretch.

Strike 3 fired back this week, and you can almost hear the disbelief between the lines. Their response to the “employees were just downloading porn” defense was blunt:

“Meta’s excuse that employees must be infringing Plaintiffs’ copyrights for ‘personal use’ does not fit the facts.”

What they say does fit the facts?

Something far more coordinated — something algorithmic. They point to what they call “unique patterns of Meta’s piracy,” arguing those patterns don’t look like a person casually searching for adult videos. Instead, Strike 3 claims the behavior “not only indicate[s] the use of a universal ‘script’ (likely developed, supplied, or encouraged by Meta), but also show[s] that the BitTorrent downloads are ‘for AI training data and not for personal use.’”

They bolster that argument with another recent copyright case: Kadrey v. Meta Platforms, Inc.

In that suit, Meta “admitted to using BitTorrent to obtain digital copies” of books for AI training — and Strike 3 alleges that the same IP addresses used in the book-related downloads also appear in their own infringement logs. According to them, Meta even acknowledged in that case that it masked its corporate IPs to avoid detection.

And then comes one of the more striking accusations (no pun intended):

“Plaintiffs were able to correlate Meta’s infringement through its Corporate IPs to six hidden data centers that were used to engage in massive infringement of both Plaintiffs’ and the Kadrey plaintiffs’ works as well. The scale of these infringements is staggering and are ‘beyond what a human could consume.’”

Strike 3 says activity from those masked IP addresses mirrors the activity from Meta’s known corporate IP blocks — a kind of digital fingerprinting that, in their telling, points to one conclusion: these downloads were never about personal viewing habits. “Those ends,” they write, “were to train Meta’s AI and not for the personal use of its employees.”

And if the court believes that interpretation?

Meta wouldn’t just be facing another infringement spat. The numbers involved are enormous — the complaint lists at least 2,396 allegedly infringed movies. Multiply that by the maximum statutory damages of $150,000 per work, and you land at a number that barely feels real: $359.4 million.
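The headline figure follows directly from the per-work statutory maximum; a one-line check of the multiplication:

```python
works = 2_396                # allegedly infringed movies listed in the complaint
max_per_work = 150_000       # maximum statutory damages per work, in dollars

exposure = works * max_per_work
print(f"${exposure:,}")      # $359,400,000 — i.e., $359.4 million
```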

But the money, oddly enough, may not be the part that echoes the loudest. This lawsuit drops the adult industry directly into the center of the broader fight over whether training an AI model on copyrighted material counts as “fair use” — or whether it’s just a slicker version of old-school piracy wearing a futuristic badge.

No one really knows yet how courts will treat AI training that leans on protected works. Everyone’s waiting for the first big ruling — the one that will set the tone for all the cases piling up behind it. And maybe that’s the real story here: not just whether Meta downloaded some videos, but whether we’re watching the early days of a legal shift that’s going to redraw the boundaries of ownership, creativity, and machine learning.

Because if AI is the future, then this messy, uncomfortable question is coming along for the ride whether Silicon Valley likes it or not.


Discord Confirms Data Breach Exposed Government ID Photos of 70,000 Users


SAN FRANCISCO — Discord, the popular chat and community platform, confirmed that one of its third-party vendors experienced a major data breach that exposed the personal information of about 70,000 users, including photos of government-issued identification cards.

The affected vendor was responsible for processing age-verification submissions and appeals on behalf of Discord. The company has not yet named the vendor but indicated that the breach was the result of a cyberattack exploiting a Zendesk instance, allegedly part of an extortion attempt targeting both the vendor and Discord.

Early reports suggested that roughly 1.5 terabytes of data were stolen—around 2.2 million images tied to age-verification records. However, Discord said the actual scope was smaller than initially claimed.

“This was not a breach of our internal systems,” a Discord spokesperson told The Verge. “The attack targeted a third-party service we use to support our customer service operations. Approximately 70,000 users may have had government-ID photos exposed, which the vendor used for age-related appeal reviews.”

The company added that all affected users have been notified. “We’ve secured the affected systems, ended our relationship with the compromised vendor, and continue to cooperate with law enforcement, data protection authorities, and external security experts,” the spokesperson said. “We take our responsibility to protect user data seriously and understand the concern this may cause.”

Discord also disclosed that other personal details—including names, usernames, email addresses, IP addresses, and the last four digits of some users’ credit cards—were included in the compromised data.

While Discord remains best known for its role in gaming culture and online communities, it has also become a hub for artists, streamers, and adult creators who use the platform to interact with fans and build digital communities. The service allows users over 18 to share adult-oriented material within designated, age-restricted spaces.


Strike 3 Holdings Sues Meta Over Alleged Use of Porn Content in AI Training

Strike 3 Holdings, a company that describes its films as “high quality,” “feminist,” and “ethical” adult videos, has filed a lawsuit against Meta in federal court in California, accusing the tech giant of infringing its copyrights by using Strike 3’s content to train artificial intelligence models. The complaint, filed in July, claims Meta has been torrenting and seeding the company’s videos since 2018. Supporting exhibits and details were unsealed last week.

According to the lawsuit, Meta sought Strike 3’s content because it offered angles and extended uninterrupted scenes that are “rare in mainstream movies and TV,” allegedly giving Meta an edge in developing what CEO Mark Zuckerberg calls AI “superintelligence.”

“They have an interest in getting our content because it can give them a competitive advantage for the quality, fluidity, and humanity of the AI,” said Christian Waugh, an attorney for Strike 3.

The filing alleges Meta BitTorrented and distributed 2,396 of Strike 3’s copyrighted videos, making them accessible to minors since the BitTorrent protocol does not include age verification. The complaint further asserts that Meta used the adult videos “for distribution as currency to support its downloading of a vast array of other content necessary to train its AI models.”

The exhibits list not only Strike 3 titles but also mainstream television shows such as Yellowstone, Modern Family, The Bachelor, South Park, and Downton Abbey. They also include pornographic videos produced by others that appear to feature very young actors, with titles such as ExploitedTeens, Anal Teens, and EuroTeenErotica. In addition, the list contains files related to weapons (3D Gun Print, Gun Digest Shooter’s Guide to the AR-15) and political material (Antifa’s Radical Plan and Intellectual Property Rights in Cyberspace).

Using adult content as training data is “a public relations disaster waiting to happen,” said Matthew Sag, a professor of law specializing in artificial intelligence at Emory University. “Imagine a middle school student asks a Meta AI model for a video about pizza delivery, and before you know it, it’s porn.”

Strike 3 says it identified the alleged violations through infringement-detection systems it operates and traced activity to 47 Meta-affiliated IP addresses. The company is seeking $350 million in statutory damages.

Christopher Sgro, a Meta spokesperson, said: “We’re reviewing the complaint, but we don’t believe Strike’s claims are accurate.”

The lawsuit draws attention to Meta’s V-JEPA 2 “world model,” released in June, which the company says was trained on 1 million hours of “internet video,” a term the complaint highlights as vague. Zuckerberg has described Meta’s goal as putting “the power of superintelligence into people’s hands to direct it toward what they value in their own lives.”

According to the complaint, Meta executives deliberately approved the use of pirated material, with Zuckerberg’s sign-off. Nearly every major AI company faces similar copyright suits.

“The case being presented against Meta is perhaps the case of the century because of the sheer scope of infringement,” Waugh said, adding that the unsealed exhibits represent only “a thin slice of the pie.”

AI companies often defend themselves by claiming that their technologies are “transformative” and thus protected under fair use. President Donald Trump voiced support for this view in July, saying: “You can’t be expected to have a successful AI program when every single article, book, or anything else that you’ve read or studied you’re supposed to pay for.”

In June, U.S. District Court Judge Vince Chhabria ruled that Meta did not break the law in training its AI models on the works of 13 authors in a separate case, Kadrey v. Meta. However, he clarified that the decision “stands only for the proposition that these plaintiffs made the wrong arguments and failed to develop a record in support of the right one.”

That leaves the door open for Strike 3 to mount a stronger case. “The best version of their argument is: This is a fundamental problem because, by going to these pirate websites, you are undermining the market for access,” Sag explained.

Waugh argued that the dispute underscores a broader issue. “It doesn’t matter if it’s a four-sentence poem or adult entertainment. There is no appetite in this country for what AI companies appear to be doing, which is making money off the backs of rights holders who never gave permission for it.”


Bad Orange Deplatformed by YouTube, Loses Bank Account Access


SHERIDAN, Wyo. — Bad Orange, a creator of sensual audio stories designed for women, announced that its YouTube channel — which had close to 200,000 subscribers — was recently removed. In addition, the company’s bank account was closed. Bad Orange is pushing back against these moves, calling them both unfair and hypocritical.

“It is confusing, to put it politely, why we are being subjected to these arbitrary standards in this day and age,” said Larry, a classically trained actor who performs under the pseudonym Daddy Sounds.

He pointed to the broader media landscape to highlight what he views as inconsistent standards. “At a moment when the average issue of Slate contains advice letters about extreme kink and The Daily Beast runs ads for Lovehoney’s Advent calendars, which they call ‘an erotic journey of passion, play, and connection,’ complete with sex toys, it’s rather puzzling why our content is viewed as intolerable,” Larry said. “It’s as if they hate women or something.”

Larry compared Bad Orange’s situation to that of the gaming industry, where adult-oriented game creators have seen thousands of accounts shut down by payment processors, devastating their businesses.

“It’s not right. It’s hypocritical, and we’re going to take action,” he added.

Although Bad Orange managed to establish an alternative method for processing payments, the company described the transition as a time-consuming and difficult administrative process.

Sensual audio content has been gaining popularity among women seeking erotic experiences outside the realm of mainstream adult video.

“Voice content is far more intimate and engaging than video,” Larry explained. “Like radio, it stimulates the listener’s imagination and visualization abilities, which is usually vastly more interesting than anything a camera can record.”

For more details, visit the Bad Orange website.


Visa Just Made Chargebacks Twice as Dangerous for OnlyFans Creators



Twitch Introduces Age Verification for UK Users

LOS ANGELES — Twitch has begun verifying the ages of users logging in from United Kingdom IP addresses, according to a Tuesday report from Dexerto.

The streaming platform is the latest to implement strict age verification procedures following the rollout of the Online Safety Act. Under the new rules, U.K. regulator Ofcom requires digital platforms to ensure users either pass a facial scan or submit personal information before gaining access.

“Twitch and k-ID (a third-party vendor we partner with to verify your age) do not store your face scan video selfies,” the company explained. “The video selfie used for facial age estimation is analyzed entirely on your device and will never leave it.”

The move has already drawn backlash. Some Twitch users described the measure as “dystopian,” while others suggested they might stop using the platform altogether.

Adult content creators have long used Twitch to expand their mainstream reach. One prominent example is Amouranth, who streams on both Twitch and Kick and is also an award-winning adult content creator.
