
Big Tech Pushes Back Against New Colorado Rules

There’s something almost surreal about the idea of a phone interrupting a teenager’s late-night scrolling to say, Hey, maybe this isn’t great for your brain. That’s what Colorado’s new law is aiming for: gentle on-screen nudges when minors have been on a platform for more than an hour, or when they’re using it between 10 p.m. and 6 a.m. The warnings would have to reference research on brain development and repeat every thirty minutes. It’s the kind of thing that could become background noise — or maybe it could give someone a moment to pause. Hard to say. The provision is supposed to roll out on January 1.
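For a sense of how mechanical the requirement actually is, here is a minimal sketch, in Python, of the trigger logic a platform might implement, assuming the thresholds described above (an hour of continuous use, the 10 p.m. to 6 a.m. window, a thirty-minute repeat interval). The function names and structure are illustrative only; the law prescribes the warnings, not any particular implementation.

```python
from datetime import datetime, time, timedelta

# Thresholds taken from the description of the Colorado law above;
# everything else here (names, structure) is purely illustrative.
CONTINUOUS_USE_LIMIT = timedelta(hours=1)
REPEAT_INTERVAL = timedelta(minutes=30)
NIGHT_START = time(22, 0)   # 10 p.m.
NIGHT_END = time(6, 0)      # 6 a.m.

def in_overnight_window(now: datetime) -> bool:
    """True between 10 p.m. and 6 a.m. (the window spans midnight)."""
    t = now.time()
    return t >= NIGHT_START or t < NIGHT_END

def should_show_nudge(is_minor: bool,
                      session_start: datetime,
                      last_nudge: datetime | None,
                      now: datetime) -> bool:
    """Decide whether to surface a brain-development warning right now."""
    if not is_minor:
        return False
    triggered = (now - session_start >= CONTINUOUS_USE_LIMIT
                 or in_overnight_window(now))
    if not triggered:
        return False
    # Once triggered, repeat no more often than every thirty minutes.
    return last_nudge is None or now - last_nudge >= REPEAT_INTERVAL
```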

But before any pop-ups appear, the law is already tied up in court. NetChoice, a group representing major social media platforms, has filed a lawsuit arguing that Colorado can’t require companies to deliver the government’s message on their own services. Their point is that social media isn’t just a product — it’s a place where communication happens, where speech happens — and the First Amendment doesn’t let the government compel speech, even in the name of public health. They note that other media industries have voluntary systems for guidance — movie ratings, for example — and that forcing platforms to present specific warnings crosses constitutional lines.

Supporters of the law say the goal is simply to help protect kids, especially given widespread concern about youth mental health. They argue that many parents want something in place to help guard against endless scrolling or disrupted sleep. Opponents respond that the science around social media’s effects is still developing — not nonexistent, but varied, with impacts that differ depending on the individual. Some researchers point to small but measurable effects; others note that social media can be a source of support, identity, creativity, or connection. So the debate becomes less about whether social media is “good” or “bad” and more about who gets to decide how platforms communicate risk — the state, the companies, or families themselves.

Similar legal battles are unfolding in other states, and courts haven’t landed on one clear answer yet. Some laws have been paused, others allowed to proceed while the challenges continue. Colorado’s case will likely turn on some very old constitutional questions — about speech, autonomy, and the limits of state power — applied to a very modern situation. And maybe that’s what makes this moment feel so unsettled: the technology is new, the stakes feel personal, and the rules are still being written.


Why is Ofcom trying to censor Americans?

Spiked’s Adam Edwards opines on the Online Safety Act in the UK.

The story centers on U.S. lawyer Preston Byrne, who represents the message board 4chan and is openly defying the UK’s Online Safety Act. When the UK regulator Ofcom issued 4chan a £20,000 fine, Byrne publicly mocked the demand and argued that British law has no legal power over companies and citizens who have no operations or assets in the UK. He views the Online Safety Act as an overreaching censorship regime and says Ofcom is trying to enforce rules outside its jurisdiction by sending threatening letters instead of going through proper international legal channels.

The Online Safety Act requires any online platform accessed by UK users—regardless of where the company is based—to submit risk assessments, reports, and censorship plans, under threat of fines or even jail for executives. While 4chan has refused to comply and likely faces no real consequences because it has no UK presence, larger American companies like Meta and Google do have substantial assets in Britain, making potential enforcement far more serious. This has sparked broader questions about sovereignty, free speech, and whether a foreign government can compel U.S. companies to restrict or monitor content.

To counter the UK’s moves, Byrne has launched a legal challenge in U.S. federal court and proposed new U.S. legislation, the GRANITE Act, which would allow American companies to sue foreign regulators like Ofcom if they attempt to impose fines or censorship demands. If passed, it could effectively block foreign censorship attempts and even allow U.S. courts to seize foreign government assets in retaliation. Byrne argues that if the UK cannot force U.S. firms to comply, British lawmakers may eventually be forced to reconsider the Online Safety Act altogether.


Appeals Court Clears Tennessee to Begin Enforcing Age Verification Law

There are court decisions that land with a dull thud, and then there are the ones that feel like a door has quietly been locked behind you. This week’s move from the U.S. Sixth Circuit Court of Appeals sits firmly in the second category.
A three-judge panel just wiped away a lower court’s injunction that had been blocking Tennessee from enforcing its age verification law for adult sites. They didn’t do it with loud fanfare. They simply pointed to a recent Supreme Court decision and said, essentially, their hands are tied. And for anyone who works in digital adult spaces — creators, viewers, small platform operators — the shift is significant.
The law in question sits under the Protect Tennessee Minors Act (PTMA), which requires commercial adult platforms to verify the age of anyone trying to access sexual content. Tennessee lawmakers didn’t write this in a vacuum — they modeled it after Texas’ HB 1181, the same law that’s been bouncing through courts for over a year now. And with the Supreme Court’s ruling in Free Speech Coalition et al. v. Paxton, Tennessee suddenly found a legal green light where, a week earlier, there’d been a flashing red.
The Sixth Circuit judges put it plainly: “[The] Supreme Court has upheld a Texas statute that the Court described as ‘materially similar’ to the one at issue here.”
That line is the hinge. The before and after.
Once the highest court signed off on Texas’ law, Tennessee simply adjusted its own definitions to match. The panel noted that “the state has also since amended the PTMA, the Attorney General says, to track ‘the definitional language at issue in Paxton almost word for word.’”
Meaning: Tennessee rewrote its law to be legally bulletproof — or at least bullet-resistant — by mirroring the one that survived at the highest level.
And that’s where things get complicated, even surreal, depending on who you ask.
Earlier versions of the PTMA had been criticized for being vague, especially in how the state defined “obscenity” and “harmful to minors.” Those categories aren’t just slippery — they’re historically weaponized. What counts as obscene in Nashville might pass without a blink in New York, or Berlin, or literally any corner of the internet where adults talk to each other freely. Lawmakers in Tennessee tightened those definitions after the Supreme Court’s decision, lifting language almost directly from the Texas statute that the justices allowed to stand.
The result? A clearer legal blueprint — but one that still treats the adult internet like a gated amusement park, complete with bouncers and ID scanners.
The Sixth Circuit didn’t rule that age verification is harmless or good policy. They just said that, for now, the district court’s injunction can’t stand while appeals continue. In legal speak, the fight isn’t over. In reality, the pressure just shifted to the people who will have to comply immediately.
And that’s where the human story sits:
• The solo performer who runs her own website.
• The queer couple documenting intimacy as art.
• The niche platforms built by two founders and a rented server rack.
• The teenager who grew up online and suddenly hits a digital checkpoint that assumes danger instead of curiosity.
There’s a narrative underneath these rulings that rarely fits into the court filings:
This isn’t just about sex. It’s about who gets to speak, who gets to control, and who gets to define what is “harmful.”
Tennessee made its move quietly, in the wake of a louder battle in Texas. But the message travels. Other states are already watching, already drafting, already preparing to photocopy the language line by line.
Sometimes the law doesn’t roar.
Sometimes it whispers.
But the effect can be the same.
And the internet is listening.


Free Speech Coalition Rolls Out New Age Verification Toolkit for Adult Platforms

Something’s always shifting in the world of compliance — especially when it comes to how adult sites handle age verification. The Free Speech Coalition (FSC), the legal advocacy nonprofit that often ends up doing the industry’s heavy lifting, just rolled out an updated version of its Compliance With U.S. Age Verification Laws: A Toolkit for Adult Websites.

The group explained on its website that the updates were necessary, given how quickly the legal ground keeps moving. What used to be “best practices” a few months ago can suddenly look outdated once new state laws or attorney general actions land on the table.

“Today, we are releasing an updated edition reflecting new legal developments and feedback from stakeholders who’ve put the toolkit into practice,” the post read.

And it’s not a light revision, either. “The key updates in this version include the final language and analysis of the Missouri age verification regulation taking effect November 30th and inclusion of recently-filed attorney-general actions, regulatory notices, and litigation related to age-verification laws,” FSC added.

That sense of urgency runs through every line of the update. “FSC’s guidance to our members continues to develop as state requirements and enforcement actions evolve,” said Alison Boden, executive director of the Free Speech Coalition. “Staying up-to-date is vital as companies make decisions about compliance.”

For those who actually need to keep their sites out of trouble (and their users protected), the new version of Compliance With U.S. Age Verification Laws: A Toolkit for Adult Websites is available for download at FreeSpeechCoalition.com/toolkit


France Launches Investigation Into Shein, Temu, and AliExpress Over Youth Porn Exposure

It’s one thing for a brand to go viral for its prices — it’s another to land under government investigation. France’s Finance Minister, Roland Lescure, just put fast-fashion giant Shein on notice after a watchdog found “child-like” sex dolls being sold on the site. Reuters broke the story, and Lescure didn’t mince words: he called the products “illegal.”

He went even further. “For terrorist acts, drug trafficking, and child pornography, the government has the right to request banning access to the French market,” he said. The threat hit just as Shein opened its first permanent retail store — a glitzy shop in central Paris inside the historic BHV department store. It’s the kind of irony that writes itself.

But this isn’t just about one brand. French regulators are now investigating whether Shein — along with Temu, AliExpress, and Wish — has allowed minors to access pornographic content through their platforms. That’s not just scandalous; it’s illegal. The country’s age-verification laws are strict, and these platforms may have crossed the line.

France’s consumer watchdog, the Directorate-General for Competition, Consumer Affairs and Fraud Control, issued an advisory explaining that “the e-commerce site Shein was selling child-like sex dolls.” They didn’t sit on it — the listings were reported to a public prosecutor.

The agency added that “these activities have been reported to ARCOM, the competent regulatory body in this area, and, in agreement with the public prosecutor, a report has been filed with the platform, urging it to implement appropriate measures promptly.” In other words, this is no warning shot — it’s an official escalation.

Quentin Ruffat, Shein’s head of public affairs in France, tried to strike a cooperative tone when speaking to local radio, as reported by Reuters. He said the company was sharing information with investigators, including the names of vendors and buyers.

“We are in the process of sacking all the offending vendors from the platform,” Ruffat said. Meanwhile, Lescure confirmed he’d submitted a report to ARCOM, noting that Shein qualifies as a “very large online platform” under the European Union’s Digital Services Act — meaning, yes, it’s squarely in the regulators’ crosshairs.

AliExpress, another major e-commerce player, isn’t escaping scrutiny either. It’s being investigated for allegedly distributing pornographic images or depictions of minors — a charge that can lead to five years in prison.

It’s worth remembering that these platforms — Shein, AliExpress, Temu — are backed by massive Chinese corporations, while Wish belongs to a Singaporean parent company. They’ve built empires on accessibility and affordability. But as France is reminding them now, there’s a line between disruption and disregard — and crossing it can get very expensive.


Quick Look: The Status of Age Verification Laws Across the U.S. by Morely Safeword

Since you’re reading this site, you’re likely already aware that recent years have brought a proliferation of new state laws across the United States requiring adult websites to verify users’ ages before displaying any content that may be deemed “harmful to minors.”

After the recent Supreme Court decision in Free Speech Coalition v. Paxton, in which the court upheld the age verification mandate passed by the Texas legislature, similar laws in other states are now clearly enforceable. With Missouri’s law poised to take effect later this month, it’s a good time to remind ourselves of the states that have (and haven’t, yet) passed similar laws.

The states with active age verification mandates in place include Alabama, Arizona, Arkansas, Florida, Georgia, Idaho, Indiana, Kansas, Kentucky, Louisiana, Mississippi, Montana, Nebraska, North Carolina, North Dakota, Ohio, Oklahoma, South Carolina, South Dakota, Tennessee, Texas, Utah, Virginia and Wyoming. And as mentioned earlier, Missouri will soon join this list.

Quite a few states have not yet passed age verification laws, at least to date. Those states include Alaska, California, Colorado, Connecticut, Delaware, Hawaii, Illinois, Iowa, Maine, Maryland, Massachusetts, Michigan, Minnesota, Nevada, New Hampshire, New Jersey, New Mexico, New York, Oregon, Pennsylvania, Vermont, Washington, West Virginia and Wisconsin.

Several of the states that haven’t passed age verification laws have considered such proposals in the past and may do so again, of course. Doubtless, there are at least some legislators in every state who favor these measures and are likely to introduce new bills at some point.

States that haven’t passed age verification laws but have debated them at some point include Colorado, Hawaii, Illinois, Iowa, Maryland, Michigan, Minnesota, Nevada, New Mexico, New York, Oregon and West Virginia.

For much more information on age verification laws around the country – and to keep track of new bills that would establish them in additional states – check out the Age Verification section of the Free Speech Coalition website.


Italy Orders Age Checks on Porn Sites, Signaling a New Crackdown on Digital Access

Rome has always been a city of contradictions — history and chaos, beauty and bureaucracy — and now it’s adding another to the list: sex and regulation.

This week, Italy’s media regulator, AGCOM, dropped a quiet but seismic announcement. Starting November 12, every platform that hosts adult content will be required to implement age verification systems — a digital checkpoint meant to keep users under 18 out of explicit spaces. On paper, it sounds simple: protect minors. In reality, it’s a bureaucratic earthquake waiting to happen.

The penalties for noncompliance? Up to €250,000. That’s not a slap on the wrist; that’s a knockout punch for smaller operators who barely make that much in a year. For the giants, it’s more of a warning shot — but one they can’t afford to ignore, especially in a country where digital privacy and moral politics are always in a tug-of-war.

AGCOM didn’t stop there. It also released a preliminary list of 45 adult content providers required to comply — a who’s who of the internet’s most-visited destinations. The message was clear: this isn’t theoretical. It’s happening. The list, the regulator says, will evolve based on how quickly platforms adopt the new rules. Translation? The watchdog is watching — and waiting to see who blinks first.

But what does this actually look like for users? Italy, like many countries flirting with digital ID systems, hasn’t laid out a clear method. Will people have to upload documents? Link to government-issued IDs? Use third-party verification apps that track their age (and maybe more)? No one knows yet, and AGCOM isn’t saying.

And maybe that’s the most Italian part of all this — the gray area between rules and reality. The intention is noble, sure: protect the young. But every time regulators try to police the internet’s most intimate corners, there’s collateral damage — privacy risks, data collection nightmares, and the quiet exodus of users to VPNs and underground mirrors of the same sites they’re trying to block.

For now, everyone’s waiting — platforms scrambling, lawyers reading fine print, users rolling their eyes. Because in the eternal theater that is Rome, even adult sites have to play their part in the latest act of digital morality.

Come November 12, the curtain rises. And whether this new performance turns out to be a tragedy, a farce, or a step toward something better — well, that depends on who’s still watching.


A 10% “Titty Tax”? Pennsylvania’s Strange New Plan to Profit From Porn

Pennsylvania lawmakers want to slap a 10 percent tax on porn.

The proposal targets “subscriptions to and one-time purchases from online adult content platforms.” Add that to the state’s existing 6 percent sales tax, and you get what the Free Speech Coalition’s Mike Stabile called the “tiddy tariff.” It’s a catchy name for something that sounds like a moral statement wrapped in a fiscal policy.

Here’s the strange thing — almost nobody pays for porn anymore. The internet made sure of that. So taxing paid porn feels like setting up a toll booth on an abandoned road. You can’t collect money from traffic that’s already gone.

And if this plan actually discourages people from paying for porn, it could end up doing the opposite of what lawmakers claim to want. Paying for porn isn’t just about access — it’s about ethics. When viewers pay creators or production companies, they’re supporting people who work legally and consensually. They’re also helping make sure performers are paid and protected.

Platforms that allow direct payments to performers give sex workers something rare in this business — control. They decide what to shoot, how to do it, and where it goes. Reputable studios verify age and consent. All that takes structure and funding. Make it harder to earn money from ethical content, and you push people toward the shady, unregulated side of the web.

The bill comes from state senators Marty Flynn (D–Scranton) and Joe Picozzi (R–Philadelphia). “In the near future, we will be introducing legislation to impose an additional 10% tax on subscriptions to and one-time purchases from online adult content platforms,” they wrote in an October 15 memo. “This tax will be applied in addition to the Commonwealth’s existing 6% sales and use tax, ensuring that Pennsylvania captures revenue from this rapidly growing sector of the digital economy.”
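For a rough sense of scale, here is a back-of-the-envelope illustration, assuming for the sake of argument that both taxes are applied to the pre-tax price and passed straight through to the consumer; the memo itself does not spell out how the two would stack.

```python
# Illustration only: assumes both taxes apply to the pre-tax price and
# are passed on to the buyer, which the Flynn/Picozzi memo does not specify.
price = 9.99                     # hypothetical monthly subscription
sales_tax = price * 0.06         # existing 6% sales and use tax
adult_tax = price * 0.10         # proposed additional 10% tax
total = price + sales_tax + adult_tax
print(f"${total:.2f}")           # -> $11.59, an effective 16% markup
```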

What’s unclear is who would actually pay. Would it hit consumers directly, or the platforms and creators? Either way, the pain rolls downhill. Platforms pass costs to users. Users buy less. Creators — often independent and working without safety nets — earn less.

The money would go into the state’s general fund, supposedly to make these “platforms contribute their fair share.” That line always makes me raise an eyebrow. “Fair share” of what?

Maybe Flynn and Picozzi imagine this hitting only the big companies — the nameless giants raking in cash. But that’s not how the modern porn economy works. Much of it now comes from small creators: individuals or couples filming at home, uploading content, building communities, and surviving off direct sales. They’re entrepreneurs, not conglomerates.

So while the state gets a symbolic win and a few extra dollars, the people who actually make the content — the ones they’re claiming to regulate — will take the hit.

Taxing porn isn’t just about numbers. It’s about how we treat speech and labor we find uncomfortable. And no matter how you spin it, this tax looks less like fairness and more like judgment dressed up as revenue.


Welcome to the Dumbed-Down Internet Era by Stan Q. Brick

I was browsing the membership area of an adult site earlier this week, having “verified my age” during a previous visit, when I came across a curious scene. Halfway down the main page of the membership area was a row of banner ads for other sites, a row of ads I’ve scrolled past so many times the messages on them hardly register anymore.

But on this day, the look of this section was quite different than before. Instead of ads for other porn sites, two of the six ads were displaying messages telling me that they couldn’t show the content of the ads, due to the age-verification laws now in effect in my home state.

This was bizarre, frankly. It was a little like being asked to show my ID at the front door of a nightclub to gain entry, then having to show it again when I reached the bar, only instead of showing it to the bartender, I’d probably need to show it to the beer distributor.

Compliance by adult sites with the age verification law my home state has passed has been very inconsistent thus far. One thing I’ve noticed is that the more prominent and high-profile the brand, the more likely it is that the company is either requiring its users to go through the age verification process or outright blocking traffic from the state.

The converse also appears to be true: the less well-known (and less likely to be legitimate) the adult site, the less likely it is to be complying with the state age verification laws proliferating around the United States.

Put another way, state governments are unintentionally (one hopes it’s unintentional, at least) funneling traffic to adult sites that are on the more questionable end of the legal spectrum, whether the laws being flouted are age verification requirements, intellectual property laws, revenge porn laws or all of the above.

Meanwhile, for those of us who don’t find our porn by blindly browsing whatever free porn site crosses our path, the age verification requirements are a repeated inconvenience and irritation as we merely try to make the most of subscriptions that were active before these laws were even cooked up.

Look, I’m not against age verification. I don’t mind the idea of making people show ID to access porn at all. That’s how things have worked in the offline world for ages, after all. What I’m against is the reality of how age verification is being handled.

What these age verification laws have handed us is a dumbed-down internet, one where in the interest of (ineffectively) “protecting children,” everyone is being treated like a child – at least where porn is concerned. If what you’re after is extreme violence or hate speech, there’s no age verification barrier to worry about, because apparently that sort of content doesn’t harm kids at all.

This special focus on porn might not last, though. And I wish the reason for the change were that legislatures around the country are going to come to their senses and stop trying to tame the internet, a vain quest that even Don fucking Quixote would know to be utter folly.

Instead, what you can expect are more laws like the “Texas App Store Accountability Act,” which is currently being challenged in court by students who think maybe it’s not reasonable to require them to get permission from their parents before they download any app.

“Texas has passed a law presumptively banning teenagers – and restricting everyone else – from accessing vast online libraries of fully protected speech,” the complaint argues.

Sounds familiar, eh?

Even if you believe age verification laws for porn sites are a good idea, do you really want to see them spread to cover everything online that might potentially be bad for kids to access?

Give that one some real thought before you answer. Unfortunately, that’s something our elected representatives are unlikely to do.


Russian Politicians Push to End Anonymous Access to Adult Sites

There’s a certain irony in watching a country famous for its secrets start talking about taking anonymity away.

Two Russian lawmakers are pushing for a new system that would force citizens to prove who they are before viewing adult content online. Yevgeny Masharov, a member of the Civic Chamber’s Commission for Public Review of Bills, says adult material “distorts behavior patterns” in young people — and that the only fix is to make everyone show ID before clicking “enter.” Passports, driver’s licenses, even bank data — all fair game, apparently — to prove you’re not a minor.

It’s a bold vision, if not a little unsettling. Because once a government starts asking for your personal documents just to browse the internet, where does that end?

Andrei Svintsov, another official from the State Duma’s Committee on Information Policy, predicts that online anonymity in Russia won’t last more than five years anyway. In his words, “Every internet user will register with some specialized identifier.” Translation: everyone’s digital life, tied neatly to their real identity. No masks, no aliases, no shadows left to hide in.

A third voice, Deputy Anton Nemkin, doesn’t completely disagree but sounds more cautious — maybe even uneasy. He admits that protecting minors and creating a “safer digital environment” are important goals, but warns that the cure shouldn’t create new diseases. Leaking personal data, strangling online businesses, or making life miserable for ordinary users could easily be part of the fallout if this rushes ahead without strong safeguards.

He’s right to worry. Systems like these rarely arrive fully secure or transparent. And once a government has a database connecting citizens to what they watch online, the line between protection and surveillance gets awfully thin.

The idea might start as a shield for children. But if history’s any guide, shields have a way of turning into nets.
