thewaronporn

The War on Porn was created in response to the long-standing assault on free speech in the form of sexual expression: porn and adult content.

New Federal Bills Target Repeal of Section 230


Something old — and foundational — is back on the chopping block in Washington. This week, members of Congress introduced two separate bills that would dismantle Section 230 of the Communications Decency Act, the legal backbone that protects interactive computer services — including adult platforms — from being held liable for user-generated content.

On Tuesday, Rep. Harriet Hageman of Wyoming introduced HR 6746, known as the Sunset to Reform Section 230 Act. The proposal doesn’t tinker or trim around the edges. It simply adds one stark sentence to the law: “This section shall have no force or effect after December 31, 2026.”

A day later, Sen. Lindsey Graham of South Carolina followed with his own bill, S 3546, which would also repeal Section 230 — this time, two years after enactment.

An Attempt to Gain Leverage

The delayed timelines aren’t accidental. Rather than calling for an immediate repeal, lawmakers appear to be using the threat itself as leverage — a ticking clock meant to force reluctant stakeholders to the negotiating table.

On the right, critics argue that platforms hide behind Section 230 while censoring conservative speech. Their goal is to limit platforms’ ability to moderate content as they see fit. On the left — and sometimes crossing party lines — lawmakers rail against “Big Tech” for profiting from illegal or harmful material, pushing for stricter moderation by making platforms legally responsible for what users post.

In a statement, Hageman warned that “outside interests” would try to block reform efforts.

“We must therefore find a way to force the issue through the reauthorization process,” she said.

Sen. Richard Blumenthal of Connecticut, one of several Democratic co-sponsors of Graham’s bill, echoed that sentiment. Supporting S 3546, he framed it as a pressure tactic designed to corner tech companies. The bill, he said, would “force Big Tech to the table with a bold demand: either negotiate sensible reforms now or lose your absolute immunity forever.”

Others backing the legislation — Graham included — spoke less about leverage and more about repeal as an end goal. Either way, the ripple effects would land hard on the adult industry.

Potential Consequences

Once Section 230 is opened up, even slightly, it becomes easy to imagine targeted carve-outs — the same way FOSTA/SESTA stripped protections from sites accused of “unlawfully promoting and facilitating” prostitution or sex trafficking.

Industry attorney Lawrence Walters didn’t mince words. “The modern adult industry is largely dependent on Section 230, which allows for operation of fan sites, cam sites and adult platforms,” he said. “If this bill is passed, or an adult industry carve-out is adopted, these business models are threatened. This frontal assault on Section 230 immunity should be a source of great concern to the adult industry and online freedom, generally.”

Those concerns aren’t new. Back in 2024, Free Speech Coalition Executive Director Alison Boden warned that altering or repealing Section 230 would unleash chaos.

“I think that it would cause a further crackdown on sexual content,” Boden said. “If there was a carve-out of Section 230 for ‘obscenity,’ the same way that FOSTA/SESTA carved out ‘human trafficking,’ that would have serious implications.”

And the odds of an adult-specific carve-out feel higher now than ever, given the broader political climate.

The Supreme Court has already signaled a shift, ruling in Free Speech Coalition v. Paxton that laws restricting access to adult content may be subject to a less rigorous standard of review. During his first term, President Trump attempted — unsuccessfully — to repeal Section 230 through an amendment to an unrelated bill. His FCC chair pick, Brendan Carr, has openly called for gutting Section 230 protections and previously helped author Project 2025’s “Mandate for Leadership,” which controversially asserted that pornography “has no claim to First Amendment protection.” Graham’s bill is also bipartisan, with co-sponsors that include influential Democrats like Dick Durbin and Amy Klobuchar.

A carve-out aimed at adult platforms would function as a de facto repeal of Section 230 for the industry. Any site hosting user-generated content would suddenly be exposed to legal liability — and a flood of civil lawsuits would almost certainly follow.

While the First Amendment protects legal speech, Section 230 has long acted as a shield against attempts to suppress that speech through litigation. Without it, unpopular expression — and adult content sits squarely in that category — becomes an easy target.

Industry attorney Corey Silverstein put it bluntly: losing Section 230 would be “catastrophic.”

“It would mean that internet service providers, search engines, and every interactive website could be left responsible for the actions of its users,” Silverstein said. “That is simply untenable, and these businesses would not be able to exist out of fear of being sued out of existence.”

And that’s the quiet reality behind the rhetoric. Strip away the politics, the soundbites, the threats meant to scare platforms into compliance — and what’s left is a question no one seems eager to answer: what happens to online speech when the shield disappears?


What Would Ethical Age Verification Online Actually Look Like?


Age-verification laws are spreading fast, and on paper they sound simple enough: if a website hosts explicit content — and sometimes even if it doesn’t — it has to check that visitors are over 18, usually by collecting personal data. Lawmakers say it’s about protecting kids. Full stop.

But scratch the surface and things get messy. Privacy experts keep waving red flags about what happens when sensitive personal data starts piling up on servers. And this year, several studies quietly dropped an uncomfortable truth: these laws don’t actually seem to stop minors from accessing porn at all.

So the uncomfortable question hangs in the air — is age verification, the way it’s currently done, ethical? And if not, what would ethical age verification even look like? When experts were asked, their answers kept circling back to the same idea: device-level filters.

Current age-verification systems

Right now, most laws — from state-by-state mandates in the U.S. to the UK’s Online Safety Act — put the burden on platforms themselves. Websites are expected to install age checks and sort it out. And, honestly, it hasn’t gone well.

“Age gating, especially the current technology that is available, is ineffective at achieving the goals it seeks to achieve, and minors can circumvent it,” said Cody Venzke, senior policy counsel for the ACLU.

A study published in November showed what happens next. Once these laws go live, searches for VPNs shoot up. That’s usually a sign people are sidestepping location-based restrictions — and succeeding. Searches for porn sites also rise, suggesting people are hunting for platforms that simply don’t comply.

The ethics get even murkier. Mike Stabile, director of public policy at the Free Speech Coalition, didn’t mince words. “In practice, they’ve so far functioned as a form of censorship.”

Fear plays a huge role here. When people worry their IDs might be stored, processed, or leaked — and we’ve already seen IDs exposed, like during October’s Discord hack — they hesitate. Adults back away from legal content. That same November study argued that the cost to adults’ First Amendment rights doesn’t outweigh the limited benefits for minors.

“Unfortunately, we’ve heard many of the advocates behind these laws say that this chilling effect is, in fact, good. They don’t want adults accessing porn,” Stabile said.

And for some lawmakers, that’s not a bug — it’s the feature. Project 2025, the blueprint tied to President Trump’s second term, openly calls for banning porn altogether and imprisoning creators. One of its co-writers, Russell Vought, was reportedly caught on a secret recording in 2024 calling age-verification laws a porn ban through the “back door.”

But there is another path. And it doesn’t start with websites at all.

An ethical age assurance method?

“Storing people’s actual birth dates on company servers is probably not a good way to approach this, especially for minors… you can’t change your birth date if it gets leaked,” said Robbie Torney, senior director of AI programs at Common Sense Media.

“But there are approaches that are privacy-preserving and are already established in the industry that could go a long way towards making it safer for kids to interact across a wide range of digital services.”

It also helps to separate two terms that often get lumped together. Age verification usually means confirming an exact age — showing ID, scanning documents, that sort of thing. Age assurance, Torney explained, is broader. It’s about determining whether someone falls into an age range without demanding precise details.

One real-world example is California’s AB 1043, set to take effect in 2027.

Under that law, operating systems — the software running phones, tablets, and computers — will ask for an age or birthday during setup. The device then creates an age-bracket signal, not an exact age, and sends that signal to apps. If someone’s underage, access is blocked. Simple. And notably, it all happens at the device level.
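The flow described above can be sketched in a few lines. This is purely illustrative: AB 1043 does not prescribe an API, and the function names and bracket boundaries below are assumptions, not anything from the statute.

```python
# Illustrative sketch of device-level age assurance (hypothetical API;
# AB 1043 does not specify function names or bracket boundaries).

def age_bracket(age: int) -> str:
    """Map the exact age entered at device setup to a coarse bracket.
    Only the bracket, never the birthday, is shared with apps."""
    if age < 13:
        return "under_13"
    elif age < 16:
        return "13_15"
    elif age < 18:
        return "16_17"
    return "18_plus"

def allow_adult_content(bracket: str) -> bool:
    """An app or site receiving the signal only needs a yes/no check."""
    return bracket == "18_plus"
```

The privacy argument lives in that first function: the device keeps the precise age to itself and emits only a range, so a leak downstream exposes far less than a stored ID or birth date would.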

That approach has been recommended for years by free-speech advocates and adult platforms alike.

“Any solution should be easy to use, privacy-preserving, and consumer-friendly. In most cases, that means the verification is going to happen once, on the device,” Stabile said.

Sarah Gardner, founder and CEO of the child-safety nonprofit Heat Initiative, agreed. “Device-level verification is the best way to do age verification because you’re limiting the amount of data that you give to the apps. And many of the devices already know the age of the users,” she said.

Apple already does some of this. Its Communication Safety feature warns children when they send or receive images containing nudity through iMessage and gives them ways to get help. The company recently expanded protections for teens aged 13–17, including broader web content filters.

So yes, the technology exists. And in 2027, at least in California, device makers will have to use it.

But there’s a catch. AB 1043 doesn’t apply to websites — including adult sites. It only covers devices and app stores.

“Frankly, we want AB 1043 to apply to adult sites,” Stabile said. “We want a signal that tells us when someone is a minor. It’s the easiest, most effective way to block minors and doesn’t force adults to submit to biometrics every time they visit a website.”

Last month, Pornhub sent letters to Apple, Google, and Microsoft urging them to enable device-level age assurance for web platforms. Those letters referenced AB 1043 directly.

Venzke said the ACLU is watching these discussions closely, especially when it comes to privacy implications.

Will device-level age assurance catch on?

Whether tech giants will embrace the idea is still an open question. Microsoft declined to comment. Apple pointed to recent updates around under-18 accounts and a child-safety white paper stating, “The right place to address the dangers of age-restricted content online is the limited set of websites and apps that host that kind of content.”

Google struck a similar tone, saying it’s “committed to protecting kids online,” and highlighted new age-assurance tools like its Credential Manager API. At the same time, it made clear that certain high-risk services will always need to invest in their own compliance tools.

Torney thinks the future probably isn’t either-or. A layered system, where both platforms and operating systems share responsibility, may be unavoidable. “This has been a little bit like hot potato,” he said.

No system will ever be perfect. That part’s worth admitting out loud. “But if you’re operating from a vantage point of wanting to reduce harm, to increase appropriateness, and to increase youth wellbeing,” Torney said, “a more robust age assurance system is going to go much farther to keep the majority of teens safe.”

And maybe that’s the real shift here — moving away from blunt tools that scare adults and don’t stop kids, toward something quieter, smarter, and a little more honest about how people actually use the internet.


British Adult Creator Bonnie Blue Back in the UK Following Bali Deportation


It still feels a little surreal when you read it out loud: British adult content creator Bonnie Blue is back in the UK after being deported from Bali and handed a 10-year ban from Indonesia. A police raid on a rented studio put an abrupt end to her trip, with the 26-year-old detained alongside a group of international travelers as authorities launched what they called a crackdown on potential pornography violations.

Blue isn’t exactly new to attention. She’s built an international following through social media and subscription platforms, leaning into controversy with the kind of provocative marketing that either hooks you instantly or makes you roll your eyes. Her so-called promotional tours—often aimed at young adult audiences—have always lived right on that thin line between spectacle and scandal. This time, the line snapped.

During the raid, Indonesian police seized cameras, vehicles, and other equipment. Badung Police alleged that Blue and the group were playing a sex game in which the “winner would sleep with Bonnie Blue.” Later, though, a local Bali outlet reported that “no pornographic activities or acts have been found in the collaborative content.” That gap between accusation and evidence has hung over the story ever since.

Authorities also accused Blue of promoting a “BangBus” tour in Bali, supposedly involving explicit content with barely legal Australian visitors. BangBus, for anyone who’s somehow missed it, is a long-running adult franchise built around mobile productions—vans, buses, the whole voyeur-on-wheels concept. It’s been copied, remixed, and rebranded so many times that it’s basically a genre now, which only added fuel to the fire.

Because Indonesia’s anti-pornography laws are notoriously strict, the case quickly went global. Headlines speculated about serious prison time. Cameras followed every move. In court, Blue didn’t exactly play the role people expected—laughing, chatting with spectators, even sucking on a lollipop. In the end, the punishment was almost absurdly small compared to the buildup: a fine she later described as $20, followed by deportation.

Back on British soil, Blue wasted no time addressing the drama. “I’m rich and have good lawyers – did you really think I’d face jail time?” she told reporters. She said she was “excited” to show people “what got me into all this trouble,” joking that she needed to “recuperate” her “huge losses” from the $20 fine. It was classic Bonnie—half bravado, half wink.

In her first Instagram video after returning home, Blue pointed the finger elsewhere. “The girl that organized this whole trip for me, she was like: ‘Oh, I’ll sort security, hotels, lawyer, flights, everything,’” Blue said. She claimed the organizer “charged me £75,000 ($150,000) – she has taken a big chunk of the money and then has reported me to the police.” If true, it turns the whole saga into something closer to betrayal than bad planning.

Blue says she plans to tell “the whole story” to her fans soon. Whether that ends up feeling like a confession, a clapback, or just another chapter in a career built on shock value is anyone’s guess—but one thing’s certain: this isn’t the last time people will be talking about her.


UK Lords Vote to Ban Depictions of ‘Choking’ in Adult Content


LONDON — There’s a particular hush that settles over a chamber when lawmakers realize they’re about to redraw a line that can’t easily be erased. That moment arrived in the House of Lords, where members agreed to amendments to the pending Crime and Policing Bill that would make depicting “choking” in pornography illegal and classify it as a “priority offense” under the Online Safety Act.

On Dec. 9, the Lords voted to approve Amendments 294 and 295, measures that would turn the possession or publication of “pornographic images of strangulation or suffocation” into a criminal offense. The language is blunt. Intentionally so.

If the bill becomes law with those amendments intact, the consequences are no small thing. Possessing “choking” material could carry a prison sentence of up to two years, while publishing it could result in as much as five years behind bars. The stakes, suddenly, feel very real.

Parliamentary Under-Secretary of State Baroness Alison Levitt, speaking for the government, told fellow Lords that the law would hinge on whether the strangulation or suffocation depicted is “explicit and realistic.” It wouldn’t need to be real, she said—just convincing enough.

“For example, it can be acted or posed,” she explained. “Or the image may be AI-generated — provided that the people in the image look real to a reasonable person.”

Designating choking content as a priority offense, Levitt added, “will oblige platforms to take the necessary steps to stop this harmful material appearing online.” In regulatory terms, that’s a heavy label—one that triggers active duties, not polite suggestions.

At the moment, the “priority offense” category is reserved for some of the darkest corners of the internet, including CSAM and terrorism-related material. Sliding choking depictions into that same bracket signals how seriously lawmakers want platforms to take this.

The push to ban portrayals of nonfatal strangulation didn’t materialize overnight. It picked up steam after the release of a pornography review in February that recommended prohibiting adult content deemed “degrading, violent and misogynistic.” By June 19, the government made its position unmistakably clear, confirming its intention to outlaw content involving strangulation.

Baroness Gabrielle Bertin, a Conservative peer who served as the independent lead reviewer on that pornography review, welcomed the amendments with unmistakable conviction.

“This is not just another amendment,” Bertin said. “It is a light-bulb moment, a recognition that what has been normalized for too long is neither safe nor acceptable.”

Not every proposed expansion made it through. The government rejected other amendments that would have criminalized additional forms of adult content, including a proposal to


House Panel Pushes Federal Age-Verification Bill Forward


WASHINGTON — The vote landed quietly, but the implications didn’t. A U.S. House subcommittee moved Thursday to amend the SCREEN Act, pushing it one step closer to becoming federal law and putting site-based age verification for adult content squarely on a national track. With the amendment approved, the bill now heads to the full Committee on Energy and Commerce for review.

Behind the scenes, the Subcommittee on Commerce, Manufacturing, and Trade has been busy stitching together a broader package of online safety legislation, with the Shielding Children’s Retinas from Egregious Exposure on the Net (SCREEN) Act sitting near the center of it all. This isn’t a sudden burst of concern—it’s been building, piece by piece, vote by vote.

When Republican Sen. Mike Lee of Utah and Rep. Mary Miller of Illinois first introduced the bill in February, the enforcement teeth were already baked in. Violations of the SCREEN Act would be treated as violations of the Federal Trade Commission Act, meaning they’d fall under the umbrella of unfair or deceptive acts or practices. In plain terms: civil penalties of up to $10,000 per violation. That kind of number tends to get people’s attention.

On Thursday, Rep. Craig Goldman of Texas framed his amended version as familiar territory. Speaking to the subcommittee, he said it “follows the same playbook as Texas,” a nod to HB 1181—the state age-verification law that ignited the Supreme Court case Free Speech Coalition v. Paxton.

“It updates the SCREEN Act to align it with the successful Texas statute, and federalizes it across the country,” Goldman said. “The protections that the courts have already upheld for children in Texas should not stop at our border. Every child in America deserves the same consistent standard of safety as a child in Texas has. We must protect children from harmful online content, and we can accomplish this better by updating the SCREEN Act.”

The amended version tweaks the bill’s language in several places, though not in ways that clearly mirror Texas’ law line for line. That distinction matters more than it might seem at first glance.

“This is not a mirror of the Texas law,” industry attorney Lawrence Walters observed. “It would likely render the Texas law unenforceable due to federal preemption.”

Federal law already tends to trump state law, but the amendment doesn’t leave much room for interpretation. It includes an explicit declaration of supremacy, stating, “No State or political subdivision of a State may prescribe, maintain, or enforce any law, rule, regulation, requirement, standard, or other provision having the force and effect of law, if such law, rule, regulation, requirement, standard, or other provision relates to the provisions of this Act.”

Right now, roughly half of U.S. states have their own age-verification laws on the books. If the SCREEN Act becomes law, those state rules wouldn’t just coexist—they’d be overridden.

For now, the amendment cleared the subcommittee by a voice vote, sending it onward to the full Committee on Energy and Commerce. It’s not the final word, but it’s a clear signal. The question isn’t whether this debate is going national—it’s how much ground will shift once it does.


Indiana Lawsuit Claims Aylo Failed to Stop VPN Access to Adult Sites


INDIANAPOLIS — There’s something oddly familiar about this moment. Indiana Attorney General Todd Rokita, a Republican, stepped forward this week to announce that his office has sued Aylo, its affiliated companies, and ownership group Ethical Capital Partners. The accusation: that they violated the state’s age-verification laws by failing to fully block users who access sites through virtual private networks, or VPNs.

Aylo, the parent company behind Pornhub and several other free and premium adult platforms, has already shut the door on Indiana entirely. The company has blocked all Indiana IP addresses, choosing to withdraw from the state’s digital landscape rather than implement the sweeping age-verification requirements that took effect earlier this year.

“We know for a fact, from years of research, that adolescent exposure to pornography carries severe physical and psychological harms,” Rokita said in a statement released by his office.

“It makes boys more likely to perpetrate sexual violence and girls more likely to be sexually victimized. Yet, despite such realities, these defendants seem intent on peddling their pornographic perversions to Hoosier kids,” Rokita continued, explaining why his office brought the lawsuit. The framing is dramatic—but it also sidesteps the core issue quietly doing the real work here.

In the legal complaint filed in state court, Rokita advances a theory that places responsibility on Aylo not just for blocking Indiana users, but for failing to block access even when users disguise their location through VPNs or proxy servers that make them appear to be outside the state.

In other words, the argument treats the existence of VPNs themselves as a kind of open tunnel—one that, in the state’s view, leads minors straight to content that would otherwise be unavailable under Indiana’s age-verification rules.
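The gap the state is pointing at is easy to demonstrate. The sketch below is a toy: real geoblocking queries a commercial geolocation database rather than a hardcoded table, and the IP addresses here are reserved documentation ranges, not real ones.

```python
# Illustrative sketch of why IP-based geoblocking misses VPN users.
# A real implementation would query a geolocation database; this toy
# mapping (with made-up documentation-range addresses) stands in for it.
GEO_DB = {
    "203.0.113.7": "IN",   # apparent Indiana residential address
    "198.51.100.9": "NL",  # VPN exit node in the Netherlands
}

def blocked_for_indiana(client_ip: str) -> bool:
    """Block only when the client's *apparent* location is Indiana."""
    return GEO_DB.get(client_ip) == "IN"
```

An Indiana user connecting through the Dutch VPN exit presents the second address, the lookup returns "NL," and the block never fires. The site sees exactly what the VPN shows it, which is why holding the site responsible for that traffic is the novel, and contested, part of the theory.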

Corey Silverstein, an attorney who represents adult-industry clients, said the lawsuit is unsurprising—and deeply troubling.

“It was just a matter of time before one of these state Attorneys General tested this theory,” Silverstein said. “We are going to monitor the case very carefully.”

He added, “I see substantial roadblocks for the government’s case, but, again, I’m not surprised because the states have been emboldened by the Supreme Court decision in Free Speech Coalition v. Paxton. Going after a VPN service provider would be a stretch, and Section 230 [of the Communications Decency Act of 1996] would stop it.

“That’s a dangerous concept, though, because what’s next? Power companies? Landlords that lease data center space?”

From a legal standpoint, the case itself feels thin—almost delicate in how much weight it tries to carry.

“As of the date of this filing, defendants’ websites (Aylo) identified above restrict access by users whose devices purport


Indonesia Orders Deportation of Performer Bonnie Blue, Imposes 10-Year Ban


It started the way these stories often do — with a sudden knock, a raid, and the kind of attention no one wants while working overseas. Indonesian immigration officials say adult film performer Bonnie Blue will be deported and barred from reentering the country for at least 10 years, after authorities concluded she violated her visa by producing commercial content in Bali. Immigration chief Heru Winarko said Blue and her three-person crew crossed a clear line between tourism and paid work.

Blue and her team, which includes Australian comedian Julian Woods, are expected to be deported once a separate police investigation wraps up. That probe centers on allegations involving the purchase of an unregistered vehicle and driving without the proper license — issues that, while mundane on paper, can snowball quickly when you’re a foreigner under scrutiny.

The situation escalated last week when the 26-year-old British performer was arrested alongside 17 male tourists during a raid on a studio in Badung. Fourteen Australian men were ultimately released without charge, but authorities continued examining whether Blue and three others had violated the terms of their residence permits.

Badung Police Chief Arif Batubara said the group entered Indonesia on tourist visas but used their time to create commercial content — a violation of what those visas allow. During the investigation, officials confiscated passports, a move that tends to make the reality of the situation sink in fast.

After two days of questioning, police said they found no pornographic material during the raid itself. Investigators added that everyone involved acknowledged taking part in the production of reality show content at the studio, a distinction that complicated the case but didn’t make it disappear.

Authorities said the investigation was sparked by public complaints about Blue’s activities during Australia’s “Schoolies” celebrations, which drew attention in a country known for its deeply rooted religious values. Indonesia enforces strict morality laws, including bans on public nudity and sexual activity outside of marriage, and public sensitivity around these issues runs high.

During the search, police seized Blue’s vehicle, cameras, and other production equipment. Had the original pornography allegations held, she could have faced penalties of up to 15 years in prison and fines nearing $541,000 — a reminder of how severe the consequences can be when laws collide with perception.

This isn’t Blue’s first run-in with immigration authorities abroad. She was deported from Fiji in November 2024 alongside another adult performer, after officials there cited the need to protect the country’s integrity.

Blue is scheduled to appear before the Denpasar District Court on December 12 to address the vehicle registration charges. No one involved has been criminally charged over the content creation allegations — at least not yet — leaving the case suspended in that uneasy space between enforcement and interpretation, where so many modern travel stories seem to end.


AVS Group Moves to Strengthen Age Verification After Costly Fine


It landed like a thud you could feel in your chest. An adult content network suddenly staring down a roughly $1.3 million penalty, and the quiet realization that the rules had shifted while everyone was still arguing about whether they ever would. In the wake of that fine, AVS Group has begun rolling out tougher age checks on some of its sites, responding to U.K. regulator pressure tied to the Online Safety Act.

A spokesperson for the regulator said the company has now put age-assurance tools in place on portions of its network that are “capable of being highly effective at correctly determining whether or not a user is a child.” That phrasing matters. It’s careful. And there’s a catch. The same spokesperson made it clear that further penalties are still on the table until the regulator is fully “satisfied” that changes have been made across every platform named in the investigation.

The fine itself followed an inquiry that found AVS Group had failed to implement robust age checks on 18 adult websites. According to the regulator, some of those sites had no age-assurance systems at all, while others relied on methods that simply didn’t meet the standard of being “highly effective.” It’s the kind of language that sounds dry on paper but carries real weight when money and access are on the line.

And the money isn’t the scariest part. Enforcement powers under the law stretch much further — fines that can climb to 18 million pounds or 10% of qualifying worldwide revenue, whichever hits harder. There’s also the nuclear option: court orders that could force payment providers or advertisers to walk away, or even compel internet service providers to block a site entirely within the U.K. That’s not a slap on the wrist. That’s existential.
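"Whichever hits harder" is just a maximum. A quick illustration of the arithmetic, assuming revenue is measured in pounds (the revenue figures below are invented for the example):

```python
def max_osa_fine(qualifying_worldwide_revenue_gbp: float) -> float:
    """Online Safety Act fine ceiling: 18 million pounds or 10% of
    qualifying worldwide revenue, whichever is greater.
    Illustrative arithmetic only, not legal guidance."""
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue_gbp)
```

For a company with £500 million in qualifying revenue the ceiling is £50 million; for anyone under £180 million in revenue, the flat £18 million figure is the binding cap.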

A request for comment has been sent to AVS Group’s parent company, TubeCorporate, but no response has been issued so far. Which leaves a lingering question hanging in the air: if this is what partial compliance looks like, what does “satisfied” actually mean — and who’s next to find out the hard way?


Australia Tells Search Engines to Blur Pornographic Images


Some policies don’t land with a thud so much as a slow, unsettling echo — the kind that makes you sit up a little straighter and wonder what pushed things this far. That’s how it felt when Australia’s eSafety commissioner, Julie Inman Grant, announced that search engines across the country will soon be required to blur pornographic and violent images. The rule kicks in on December 27, and you can almost sense the mix of urgency and inevitability behind it.

“We know that a high proportion of this accidental exposure happens through search engines as the primary gateway to harmful content, and once a child sees a sexually violent video, for instance, maybe of a man aggressively choking a woman during sex, they can’t cognitively process, let alone unsee that content,” Grant said.

There’s something chilling about that last part — the idea of “can’t unsee.” It’s true for adults, too, but with kids it lands differently. Grant continued: “From 27 December, search engines have an obligation to blur image results of online pornography and extreme violence to protect children from this incidental exposure, much the same way safe search mode already operates on services like Google and Bing when enabled.”

Blurred results aren’t new. They’ve been tucked inside safe search settings for years, hidden like a fire extinguisher behind a glass panel — available, but only if someone remembers to activate it. What’s shifting now is the weight of responsibility: it’s no longer on parents to flip a switch; the platforms have to build the guardrails themselves. And all of this is happening alongside another major change hitting Australia on December 10, when the country’s social media ban for under-16s comes into force.

Earlier this month, it was estimated that around 150,000 Facebook accounts and 350,000 Instagram accounts belonging to under-16s will disappear from the digital landscape, like entire classrooms quietly going offline. Grant framed the larger moment this way: “These are important societal innovations that will provide greater protections for all Australians, not just children, who don’t wish to see ‘lawful but awful’ content.”

That phrase — “lawful but awful” — hangs in the air. It’s complicated, a little subjective, and strangely honest. And maybe that’s the real story here: a country trying to redraw the line between what the internet allows and what people can realistically live with.


U.S. OCC Releases Preliminary Report on Debanking

OCC Debanking Report

Some mornings, the news hits you like a jolt of cold water — shocking at first, then oddly clarifying. That’s how it felt when the U.S. Office of the Comptroller of the Currency (OCC) dropped a preliminary report on debanking, finally calling out what so many in the adult industry have been living with for years. It’s strange to feel victorious over a problem you never should’ve had in the first place, but here we are, holding something that looks a lot like validation.

The OCC names nine major banks — the kind everyone’s parents told them were “safe,” including Chase, Wells Fargo, and Bank of America — for potentially violating the President’s Executive Order barring discrimination against people engaged in “lawful business activities.” Reading it, I had one of those moments where you want to underline every other sentence because someone, somewhere in government, actually said the quiet part out loud. The OCC states that the adult industry, among others, was subjected to “inappropriate distinctions” when trying to access basic financial services:

“The OCC’s preliminary findings show that, between 2020 and 2023, these nine banks made inappropriate distinctions among customers in the provision of financial services on the basis of their lawful business activities by maintaining policies restricting access to banking services… For example, the OCC identified instances where at least one bank imposed restrictions on certain industry sectors because they engaged in ‘activities that, while not illegal, are contrary to [the bank’s] values.’ Sectors subjected to restricted access included oil and gas exploration, coal mining, firearms, private prisons, tobacco and e-cigarette manufacturers, adult entertainment, and digital assets.”

Seeing adult entertainment listed there — not as a punchline, not as an afterthought, but as a recognized target of discrimination — is surreal. It’s proof that the federal government isn’t just aware of the problem; it’s saying, pretty plainly, that the problem matters. That we matter. And for once, the burden shifts off the people running these businesses and onto the banks that have quietly punished them under the guise of “values.”

This marks the first time in a long time that the adult industry isn’t shouting into the void. The OCC has confirmed that we’re covered under the Executive Order. Banks now know that the old playbook — the one where they shut down accounts for “reputational risk” and shrug — might actually put them on the wrong side of federal policy.

There’s still a road ahead, of course. In the coming weeks and months, the OCC will move into the rule-making phase, and that’s where the shape of all this becomes real. We’ll learn more as they flesh out the details, and so will everyone who’s been denied a basic checking account simply for doing legal work that made some executive squeamish. But for the first time in years, there’s a crack of daylight. Maybe — just maybe — we’re watching the beginning of the end of a discrimination problem that never should’ve existed in the first place.

And honestly? It’s about time the people holding the money had to explain themselves.
