Political Attacks

Italy’s ‘Porn Tax’ Just Got Real — And Now It Applies to Every Creator


ROME — There’s that quiet moment every creator dreads — the one where your phone buzzes with news you didn’t even know you should be worried about yet. In Italy right now, that buzz is coming courtesy of the nation’s tax authority, which has ruled that the country’s 25% “ethical tax” on income from adult content applies not just to major studios or big websites, but also to smaller, independent creators working online.

The so-called “ethical tax” dates back to 2005 and targets any net income earned through the production, distribution, sale, or performance of pornographic material, along with content said to incite violence. When it was introduced, the idea of bedroom lighting setups, DIY ring lights, and subscription fan platforms wasn’t even part of the conversation, so the burden landed almost entirely on traditional adult companies.

A recent ruling from the Agenzia delle Entrate’s Taxpayers Division, Central Directorate for Small and Medium Enterprises, now makes it official: creators using premium social media and fan platforms fall under the same tax obligation — even if they operate under Italy’s flat-tax system for small businesses and freelancers.

That system, called the regime forfettario, covers freelancers reporting annual income of €85,000 or less. Which means tens of thousands of Italian creators who thought they were safely tucked into that bracket could suddenly see 25% of their earnings siphoned off if the revenue agency classifies their content as pornographic. A quarter of their income — gone. And you can almost hear the collective pause across the creator community as people do that math in their heads, wondering how something written two decades ago just reached straight into their modern, very personal livelihoods.


Missouri Becomes the Latest State to Treat Online Adults Like Children – By Stan Q. Brick


Citizens of Missouri who frequent adult websites will find the internet has changed for them when they wake up this Sunday morning, towards the end of the long Thanksgiving weekend.

Why will the internet be different for citizens of Missouri as of that morning? Because Sunday is November 30, the day the state’s new age-verification mandate begins for websites covered by the “Missouri Merchandising Practices Act.”

Under the law, websites on which a “substantial portion” of the content (33% or more) is deemed “pornographic for minors” must employ a “reasonable age verification method” to assure anyone accessing such content is an adult.

On its face, requiring adult sites to verify the age of their visitors may not seem like such an unreasonable proposition. But, as the saying goes, “the devil is in the details.”

For starters, making adults jump through hoops to enter a brick-and-mortar adult video store, or requiring people to show ID when purchasing a porn mag at a convenience store, is one thing; storing and cross-referencing their personally identifying information is quite another.

When a clerk at an adult shop or any store that sells age-restricted materials checks your ID, they look at it, they look at you, they check the date of birth listed on the ID document and then you both get on with your lives. Minutes later, that same clerk probably couldn’t tell you much about the customer they’d just served, other than “I checked his ID, it looked legit and he’s 55 freaking years old, dude.”

When I scan my ID at the behest of an age-verification provider…who the fuck knows what happens to that data? Sure, some of these state laws prohibit vendors from storing and sharing that data, but do you trust them to follow the law? How many times do we need to haul tech companies before Congress (or watch them get fined by the FCC) to admit they interpreted the law in some “nuanced” way that permits them to hold on to and use our personal data before we get wise to their sneaky ways?

The data collected by age-verification services is valuable to them. They aren’t going to abstain from using it in every profitable way possible, regardless of what the law says. They will find ways to interpret the law such that they can sell, rent out or permit third-party cross-referencing of the data, mark my words. Some of these companies won’t be domiciled in the United States – and they will give just about as big a shit about U.S. law as any other business located outside U.S. jurisdiction does.

Of course, none of this will bother the politicians who pass these laws, because this isn’t about protecting kids – and it sure as hell isn’t about protecting the privacy of adults who like to watch porn. This is about a larger antipathy towards adult entertainment and a desire to discourage anyone and everyone from looking at porn, not just minors.

Consider what Missouri Attorney General Catherine Hanaway had to say in September about the new law in her state: “We are holding powerful corporations accountable, respecting women and victims of human trafficking, and helping ensure that minors are shielded from dangerous, sexually explicit material.”

Notice that the bit about “helping ensure that minors are shielded” comes last on the list? That’s not a coincidence.

Someone also needs to explain to me how making people show ID at the door when they watch porn is in any way helping “women and victims of human trafficking.” Let’s assume a person has been trafficked for the purpose of performing in porn (something that truly doesn’t happen often at all, despite a constant stream of political rhetoric to the contrary); how does making viewers confirm they’re old enough to watch legal porn help anyone who has been forced into making illegal porn?

The word “trafficking” doesn’t appear in the text of Missouri’s new law. What does appear there is the claim that “nothing in this proposed rule limits the ability of adults to view sexually explicit material online,” which is technically true, so long as one doesn’t consider an age-verification requirement a “limit” for the adults who would prefer not to hand over their personally identifying information to God-knows-who.

When the Supreme Court ruled in favor of Texas in the challenge to that state’s age-verification mandate, Cecillia Wang, the national legal director of the American Civil Liberties Union, said something that strikes me as being just as true with respect to the Missouri law:

“The legislature claims to be protecting children from sexually explicit materials, but the law will do little to block their access and instead deters adults from viewing vast amounts of First Amendment-protected content.”

She’s right – and the list of adults deterred by such laws is only going to get longer as these laws proliferate.

Welcome to the dumbed-down internet. Please be mindful of the language you use herein; some of your readers might be children!


Canada Exempts Adult Content From National Programming Quotas


OTTAWA—Canada’s broadcast regulator has confirmed that “adult programming” won’t be subject to the Canadian culture quota system required under the Online Streaming Act, effectively removing erotic and pornographic content from the law’s promotional framework rooted in national identity and cultural support.

The exemption follows testimony given a couple of years ago by Aylo’s ownership group, Ethical Capital Partners (ECP), which advocated for this very outcome. The Ottawa-based private equity firm acquired Aylo — the company once known as MindGeek — in 2023; Aylo remains headquartered in Montreal, Québec.

“Adult programming will not be certified as Canadian programs,” states the official regulation issued by the CRTC on Nov. 18. The commission explains that “adult programming is an element of the production industry that does not require regulatory support for its overall economic stability. […] Further, the certification or non-certification of this type of programming does not impact the Canadian production industry associated with the creation or availability of such content.”

Under the regulator’s interpretation, adult content simply doesn’t meet the criteria for official “Canadian content,” a designation based on the participation of Canadian performers and the use of domestic locations. In practice, most adult productions — including many produced within Aylo’s premium studio network — are filmed in the United States, particularly in California and Florida.

“By recognizing the contributions of a wider range of creators, we are supporting Canadians who help bring our stories to the screen,” said Vicky Eatrides, chairperson and chief executive officer of the CRTC, in a press release announcing the finalized rules governing Canadian content quotas.

“Our decision promotes Canadian talent, encourages new partnerships, and helps keep our creative industries strong for the future,” Eatrides added.


Congress to Weigh Federal AV Mandate in Upcoming Child Protection Session


There’s a certain kind of political announcement that feels like the temperature in the room suddenly drops — and next week’s hearing in the House does exactly that. A subcommittee is gearing up to review a stack of bills aimed at protecting minors online, including the SCREEN Act, which would make age verification for adult content a federal requirement. It’s the kind of proposal that sounds straightforward until you start asking the obvious questions lurking underneath: Who verifies? How? And at what cost to everyone else’s privacy?

On Tuesday, Congressman Brett Guthrie of Kentucky, who chairs the House Committee on Energy and Commerce, and Congressman Gus Bilirakis of Florida, who leads the Subcommittee on Commerce, Manufacturing, and Trade, announced the Dec. 2 hearing, pointedly titled “Legislative Solutions to Protect Children and Teens Online.”

“One week from today, this Committee will begin advancing a suite of online safety bills to address the challenges facing our kids in the digital age,” Guthrie and Bilirakis said in a joint statement. “Parents and lawmakers both agree on the importance of enacting meaningful protections that can stand the test of time, so we look forward to this important first step.” You can hear the bipartisan weariness in that last part — everyone wants to “protect kids,” yet nobody can agree on the mechanics.

The hearing will take up 19 different bills covering everything from privacy protections to gaming, messaging, algorithms, bots, and artificial intelligence. It’s a wide net, the kind lawmakers cast when they’re trying to grab the whole digital universe at once. The Kids Online Safety Act — still carrying the scent of its earlier “duty of care” controversies — sits near the top of the list, freshly softened after years of pushback.

But the bill with the sharpest edge for the adult industry is the one with the dramatic, almost sci-fi name: the Shielding Children’s Retinas from Egregious Exposure on the Net (yes, really) — the SCREEN Act.

When Republican Sen. Mike Lee of Utah introduced it earlier this year, the proposal echoed the age-verification mandates spreading through various states like a chain reaction. Under this bill, failing to comply with AV requirements would count as a violation of the Federal Trade Commission Act’s ban on unfair or deceptive practices — a move that opens the door to civil penalties up to $10,000 per violation. Just imagine the math on a busy website; it’s enough to make any operator dizzy.
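To make that math concrete, here is a minimal back-of-the-envelope sketch. The traffic figure is invented, and treating each non-compliant visit as a separate violation at the statutory ceiling is our assumption for illustration, not language from the bill.

```python
# Hypothetical exposure under the SCREEN Act's FTC Act enforcement route.
# Assumptions (ours, for illustration only): 50,000 non-compliant visits per
# day, each counted as a separate violation at the statutory ceiling.

PER_VIOLATION_CAP = 10_000   # USD, the civil penalty ceiling cited in the bill
daily_violations = 50_000    # assumed traffic, not a real figure

daily_exposure = daily_violations * PER_VIOLATION_CAP
print(f"Theoretical daily exposure: ${daily_exposure:,}")  # $500,000,000
```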

Lee has long been one of Congress’ most outspoken anti-porn crusaders, consistently pushing measures to revive obscenity prosecutions and criminalize all forms of sex work. His SCREEN Act has drawn support from religious and conservative groups — and, interestingly, from the Age Verification Providers Association, whose members would undeniably benefit from turning age checks into a global business model.

Earlier this year, industry attorney Corey Silverstein wrote a detailed breakdown of the legislation, pointing out that similar laws have already collided with constitutional limits and lost. History has a funny way of whispering warnings if you’re willing to hear them.

“If it is deemed too broad or restrictive, courts may invalidate it as unconstitutional,” Silverstein noted. “However, if the law is perceived as narrowly tailored — e.g., focusing only on commercial porn sites and clear, workable verification methods — it might survive such challenges.”

And that’s the tension that keeps repeating itself: the desire to shield children, the fear of government overreach, and a legal system forever balancing on the thin edge between protection and intrusion. The coming hearing won’t settle that — but it might reveal just how far lawmakers are willing to push the boundary this time.


Germany Temporarily Halts Effort to Block Pornhub, YouPorn


Sometimes the quietest rulings carry the loudest echoes. That’s what happened in Düsseldorf, where the Administrative Court stepped in and hit pause on an order that would’ve forced telecom giants to cut off access to Aylo-owned sites Pornhub and YouPorn.

The “network ban,” first floated back in July, was meant to push the platforms into complying with Germany’s strict age-verification demands — the kind that insist on IDs, facial scans, and other methods that feel more sci-fi than practical for most adults just trying to go about their digital lives.

But on Nov. 19, the Administrative Court said not so fast. The judges ruled that the regional media authority can’t pressure ISPs to block the sites while the appeals are still winding their way through the Higher Administrative Court. In other words: everyone needs to sit tight until the bigger legal questions get sorted.

And those questions are heavy. The court pointed to recent decisions from the European Court of Justice suggesting that Germany’s Youth Media Protection Interstate Treaty might clash with overarching EU law. Essentially, the EU’s rules around the free movement of digital services — especially for companies legally based in other member states, like Aylo in Cyprus — can’t be tossed aside unless strict conditions are met. And Germany’s framework may not pass that test anymore.

Jurisdictional Confusion

If you’ve followed the debate around protecting minors online in Europe this past year, you know it’s become a labyrinth of overlapping rules, clashing authorities, and awkward international finger-pointing.

Take Luxembourg, for example. Late in 2024, French officials tried to get their neighbors to help enforce France’s SREN law by going after webcam platform LiveJasmin. Luxembourg didn’t bite. “We cannot circumvent EU rules just because it is maybe a highly sensitive topic,” an Economy Ministry official said — a line that, honestly, deserves to be engraved somewhere in stone.

Around the same time, the European Commission wrapped up its official guidelines on protecting minors under the Digital Services Act. They even rolled out a “white label” age-verification app, something like a template that sites can adopt to meet DSA requirements. The idea sounds tidy on paper; the reality is… well, still unfolding.

France’s media regulator has also been tangled in debates over whether it can enforce its age-verification rules on companies based elsewhere in the EU. Arcom asked Czech regulators for support, but those agencies pushed back, saying they simply don’t have the legal room to apply French law on their home turf.

That particular dispute revolves around WebGroup Czech Republic, the company behind XVideos.com, and NKL Associates, which operates XNXX.com. Both companies appealed to France’s Council of State, arguing that Arcom can’t compel foreign-based sites to comply with French rules without violating the EU’s “country of origin” principle — a core idea in the Directive on Electronic Commerce.

And then there’s the nonbinding opinion dropped in September by an advocate general at the EU’s Court of Justice, suggesting that France can require foreign-based porn sites to apply French age-verification rules. It was the legal equivalent of throwing gasoline on an already lively fire.

XVideos and XNXX aren’t alone. Several platforms have been called out for failing to meet France’s age-verification requirements under the SREN law. If they don’t comply, Arcom has made it clear it’s ready to follow the German playbook and move toward blocking and delisting.

All of this makes the Pornhub/YouPorn litigation in Germany more than just another case file. It’s shaping into a test of which rules will ultimately win out — national laws drafted in response to rising political pressure, or Europe-wide principles baked into the very idea of a digital single market. And whatever the courts decide, it’s going to ripple far beyond one country or a pair of websites.


Aylo Pushes Tech Giants to Adopt API-Driven Device Age Verification


Something interesting happens when big tech companies get a polite nudge from a company they usually keep at arm’s length. That’s exactly the nudge Aylo — the parent company of Pornhub — just delivered. The company asked Google, Apple, and Microsoft to open the door to API signals that would let platforms verify a user’s age at the device or operating-system level. The goal? Keeping minors off porn. It’s a request that feels both obvious and strangely overdue, considering how much of the internet already runs through those devices.

Wired revealed last week that Anthony Penhale, Aylo’s chief legal officer, sent separate letters on Nov. 14 to the relevant executives at each company. Those letters were later confirmed by Aylo, whose spokesperson provided them for review.

Aylo has been steadily pushing the idea that age verification should happen at the device level — not slapped awkwardly onto individual sites through clunky pop-ups and ID uploads. It’s a stance that puts the company at odds with most state and regional age-gating laws in the U.S. and E.U., which still rely on site-level verification. Meanwhile, Google, Apple, and Microsoft have been sending mixed signals about how far they’re willing to go with device-based checks.

Most recently, California’s governor, Gavin Newsom, signed a bill requiring age verification in app stores. Google, Meta, and OpenAI endorsed the measure, while major film studios and streaming platforms pushed back, calling the law a step too far.

“We strongly advocate for device-based age assurance, where users’ age is determined once on the device, and the age range can be used to create an age signal sent over an API to websites,” Penhale wrote in his letter to Apple. “Understanding that your Declared Age Range API is designed to ‘help developers obtain users’ age categories’ for apps, we respectfully request that Apple extend this device-based approach to web platforms.”

“We believe this extension is critical to achieving effective age assurance across the entire digital ecosystem and would enable responsible platforms like ours to provide better protection for minors while preserving user privacy,” he added.

Penhale’s letters to Alphabet and Microsoft echoed the same ask: allow website operators — not just app developers — access to the age-related API tools each company already uses within its own ecosystem.

“As a platform operator committed to user safety and regulatory compliance, Aylo would welcome the opportunity to participate in any technical working groups or discussions regarding extending the current age signal functionality to websites,” Penhale wrote in the letter sent to Microsoft.

A Google spokesperson told Wired that Google Play doesn’t “allow adult entertainment apps” and that “certain high-risk services like Aylo will always need to invest in specific tools to meet their own legal and responsibility obligations.” In other words, Google’s not eager to widen the gates.

Developer documentation shows that Apple now turns on content controls by default for new devices registered to under-18 users. Microsoft, for its part, has leaned heavily toward service-level verification — meaning platforms should handle their own age checks rather than relying on the device.
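For context, here is a rough sketch of the kind of web-facing age signal Aylo’s letters describe. No such API exists for websites today (that absence is the point of the letters), so the token format, field names, and signing scheme below are all invented for illustration.

```python
# Hypothetical sketch: a website consuming a device-issued "age signal."
# Nothing here is a real OS-vendor API; the names, fields, and signing
# scheme are invented to illustrate the flow described in Aylo's letters.

import base64
import hashlib
import hmac
import json

VENDOR_KEY = b"key-published-or-shared-by-the-os-vendor"  # placeholder

def verify_age_signal(token: str) -> str | None:
    """Return the attested age range (e.g. '18+') if the signal checks out."""
    try:
        payload_b64, sig_b64 = token.rsplit(".", 1)
        expected = hmac.new(VENDOR_KEY, payload_b64.encode(), hashlib.sha256).digest()
        if not hmac.compare_digest(expected, base64.urlsafe_b64decode(sig_b64)):
            return None  # tampered or forged signal
        payload = json.loads(base64.urlsafe_b64decode(payload_b64))
        # Data minimization: the site learns an age range, never an identity.
        return payload.get("age_range")  # e.g. "under13", "13-17", "18+"
    except (ValueError, AttributeError):
        return None

# A site gating adult content would branch on the range alone, e.g.:
# serve() if verify_age_signal(header_value) == "18+" else block()
```

The design point worth noticing is data minimization: the site receives an age bracket and nothing else, which is what separates this model from the ID-upload and face-scan systems Aylo is criticizing.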

All of this is unfolding while Aylo continues to argue that site-level age verification doesn’t work. The company has pointed to real-world examples of how these systems push users off regulated sites and into murkier, unmonitored corners of the web.

Internal data shows that traffic from the U.K. to Aylo’s platforms dropped more than 77 percent after Ofcom began enforcing new rules under the Online Safety Act. Related documents reviewed privately indicate that users didn’t disappear — they simply migrated to non-compliant, unregulated sites.

At the same time, a court in Germany just offered Aylo a temporary lifeline. On Nov. 19, the Administrative Court of Düsseldorf put a hold on new regulations requiring ISPs to block Pornhub and YouPorn entirely.

The blocking orders would have forced ISPs like Deutsche Telekom, Vodafone, and O2 to bar access to the sites over Germany’s age-verification laws. For now, those orders are on pause while the Higher Administrative Court of North Rhine-Westphalia works through appeals on the original network bans.

Interestingly, the Düsseldorf court pointed out that Germany’s enforcement approach under the Youth Media Protection Interstate Treaty contradicts the European Union’s Digital Services Act, which outlines a different vision for age verification.

Aylo is still fighting its designation as a “very large online platform” under the DSA — a label that brings intense regulatory scrutiny and a long list of compliance demands. The company’s push for device-based age checks is part of that bigger battle, and it’s hard not to notice the irony: the company everyone expects to resist regulation is the one asking for the kind that might actually work.


FSC Warns: North Carolina Law Voiding Certain Contracts Takes Effect Dec. 1


CHATSWORTH, Calif.—Adult industry trade organization the Free Speech Coalition (FSC) issued the following advisory Monday morning concerning North Carolina’s recently passed record-keeping law for adult websites:

North Carolina’s Prevent Exploitation of Women and Minors Act requires operators of adult websites to remove content upon request of a performer, even if the performer signed a valid model release. It also requires producers to collect “explicit written evidence of consent for each act” performed, and claims to apply to content created before the law goes into effect on December 1, 2025.

The law is broad in its application, confusingly written, and appears to contradict existing federal law and long-established legal principles. This post breaks down key provisions of the law, but FSC recommends reviewing the specific language of the law with your legal advisor to evaluate your company’s potential exposure and compliance strategies.

Requirements for content producers
Content producers of “actual or feigned sexual activity” must obtain written consent from each person depicted that includes:

• consent to each sex act depicted in the content.

• a statement giving consent to distribute the content.

• a statement explaining the state’s definition of “coerced consent” and notifying the performer that they may withdraw consent at any time.

While the law doesn’t explicitly mandate the creation of these documents, they are required in order to publish the content online.

Documentation required before upload
Online entities that distribute or publish “pornographic” content are required to obtain the documentation listed above, as well as the age and identity verification records for each performer.

Take Down Provisions
Platforms are required to display a prominent notice giving instructions for how a person can request that content be taken down and:

• Remove any pornographic content on their platform at the request of an individual depicted in it, their authorized representative, or law enforcement within 72 hours of receiving a request.

• If any other individual requests content be taken down, platforms are required to review records related to that content within 72 hours and remove it if it does not meet the requirements of the law regarding documentation and consent.

• Any content that is taken down (including edited or altered versions) must be prevented from being republished.

Enforcement
The state attorney general or an individual depicted in a piece of content can file a lawsuit against parties alleged to have violated the law.

• Penalties accrue on a per-day and per-image basis.

• Private plaintiffs can sue the platform or the uploader for actual damages or $10k per day the image remained on the platform after the 72-hour window, whichever is greater (see the sketch after this list).

• If the attorney general notifies a platform that they are in violation of the requirement to post instructions for taking down content, they have 24 hours to add it before fines of $10k per day begin to accrue.

• If the attorney general notifies a platform that they must take down content, they have 24 hours to remove it before fines of $5k per day begin to accrue.
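As arithmetic only (not legal advice), here is the private-plaintiff formula from the bullets above sketched in Python. Rounding partial days up to whole days is our assumption; the statute’s actual accrual rules may differ.

```python
import math

# Sketch of the private-plaintiff damages described above: the greater of
# actual damages or $10,000 per day the content stayed up past the 72-hour
# takedown window. Whole-day rounding is an assumption, not statutory text.

def private_damages(actual_damages: float, hours_up_after_request: float) -> float:
    days_late = max(0.0, (hours_up_after_request - 72) / 24)
    statutory = 10_000 * math.ceil(days_late)  # assumes partial days count whole
    return max(actual_damages, statutory)

# Removed 120 hours after the request: 48 hours (2 days) past the window.
print(private_damages(actual_damages=2_500, hours_up_after_request=120))  # 20000
```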

Effective Date
The law goes into effect on December 1, 2025 and is effective retroactively (“applies to acts or omissions occurring before, on, or after that date”).

This blog post is a resource provided for informational purposes only. It does not constitute legal advice and should not replace the advice of an experienced legal professional.

FSC members have access to an Industry Professional Directory that includes a list of industry-friendly attorneys.


Are Adults Allowed to Take Informed Risks, Or Not? – By Stan Q. Brick


The question, to me at least, truly is as simple as the headline above frames it.

Growing up as a kid in the 1970s, there were a great many things I wasn’t allowed to do yet, but I was told I would be permitted to do once I was old enough to make my own decisions in life. Things like smoking cigarettes, drinking booze, driving cars and yes – shudder – looking at pornographic images.

In most areas of life, the promise I was given as a kid has held up. I don’t happen to smoke, but I could do so, even knowing it’s terrible for me. I do drink occasionally, although ironically enough, a lot less often now that I’m allowed to do so than I did when I was a teenager doing something forbidden by the law.

And yes, I’m still allowed to look at porn. But as I look around the world and contemplate the circumstances in which many other adults currently find themselves, I can’t help but think they live in countries where the government is functioning like parents who can’t face having their kids move out of the house and become fully formed adults.

Earlier this month, it was reported that in the United Kingdom, “online pornography showing strangulation or suffocation is to be made illegal, as part of government plans to tackle violence against women and girls,” as the BBC put it.

Don’t get me wrong; I’m no fan of choking in porn. In fact, seeing it actively makes me cringe and recoil from the screen. (I’m not too fond of people spitting on each other, either, but that’s a whole other kettle of saliva.)

I also understand that choking someone to the point they almost pass out isn’t good for that person or her/his brain. That’s another reason why people arguably shouldn’t engage in “breath play” during sex (or at any other time, for that matter) – but it’s still not a reason for the government to ban it in porn.

Consider this: It’s illegal to do a lot of what we see inside the cage during an Ultimate Fighting Championship match if it takes place outside such a competition. If you get into a bar fight with someone, put them in an armbar and break their freaking arm, you’re likely to be prosecuted for aggravated assault. Do the same thing inside the octagon and you’re the winner!

Is it good for kids to watch people break each other’s arms? Probably not. But do you know what sort of depictions aren’t age-restricted in a way that would lead to a criminal (or civil) penalty for those broadcasting the event, or any adult who allows a kid to watch it? You guessed it – mixed martial arts (MMA) fights.

MMA also doesn’t prevent the adults who participate in it from choking each other unconscious, breaking each other’s limbs, giving each other concussions, or otherwise doing grievous bodily harm to each other.

Why isn’t choking in MMA illegal in the UK, but choking in a porn context is about to become so? My guess is it has a lot to do with paternalism and eons-old double standards regarding men and women.

If a man chokes out another man in a competitive context, or a woman does so to another woman, well, that’s just sport, right? If a man chokes a woman to the edge of consciousness in a pornographic context, well obviously there’s an imbalance of power and he’s abusing her – even if she’s lying there demanding “choke me, choke me!”

(That said, it’s not like the UK choking depiction ban will have an exception for choking in scenes pairing women, so maybe it’s as simple as “sport good, sex bad.”)

You also probably won’t see calls in the UK for depictions of choking in a fictional context to be banned. Game of Thrones fans may recall Stannis Baratheon seizing the Red Woman by the throat and choking her after her perceived failures in helping him seize the Iron Throne; I don’t expect we’ll hear calls for that scene to be deleted from UK-facing platforms, in part because both characters have their clothes on throughout this evidently harmless bit of violence against women.

We also likely won’t hear many people complaining that the NFL games played annually in the UK are too violent, too filled with concussions, or somehow unsuitable for viewing by children. After all, they want to sell out those stadiums and create future generations of fans to come support American football. The NFL’s money talks and the typical, paternalistic logic about “protecting children” walks right out the door.

I could go on and on, because the examples are nearly endless. Everywhere you look, sexually explicit depictions are subjected to restrictions that other forms of expression and entertainment simply are not, even though the same reasoning, applied consistently, would seem to demand restricting those other forms as well.

Again, don’t get me wrong: I don’t want any of the things I’ve listed above to be banned, or even more strictly regulated. I just want consistency in allowing adults to be adults, permitting us to take informed risks and go about our lives with minimal government intrusion — and I don’t want that consistency to come in the form of a comprehensive prohibition on the whole lot.

After all, if we try to restrict people to doing only what’s safe and not “bad for them,” do you know what none of us should be doing? Driving to fucking work.


Missouri Age-Verification Regulation Takes Effect November 30th


Missouri’s age-verification regulation, 15 CSR 60-18, kicks in on Sunday, November 30. It arrives quietly, almost like a new rule taped to the front door of the internet—one most people won’t notice until they run into it.

Under Missouri’s rule, any site where at least 33⅓% of content is considered harmful to minors must verify a visitor’s age before letting them in. The state signs off on methods like digital IDs, government-issued identification, or other systems that confirm age through transactional data. If a platform thinks it has a better solution, it can pitch its own—so long as it proves it works just as well.
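To illustrate the coverage test, here is a minimal Python sketch. The genuinely hard part, deciding which items count as harmful to minors, is reduced here to an assumed per-item flag count.

```python
from fractions import Fraction

# Minimal sketch of the 15 CSR 60-18 coverage test: a site is covered when
# at least one-third (33 1/3%) of its material is deemed harmful to minors.
# Classifying each item is the contested part; here it's just a flag count.

def must_verify_age(flagged_items: int, total_items: int) -> bool:
    """True if the 'substantial portion' threshold is met."""
    if total_items == 0:
        return False
    return Fraction(flagged_items, total_items) >= Fraction(1, 3)

print(must_verify_age(34, 100))  # True  -> age verification required
print(must_verify_age(33, 100))  # False -> just under the threshold
```

Exact fractions sidestep the floating-point edge cases that sit precisely at the 33⅓% line.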

Violating the rule isn’t just a slap on the wrist. The state treats it as “an unfair, deceptive, fraudulent, or otherwise unlawful practice” under the Missouri Merchandising Practices Act. If regulators decide a violation was done “with the intent to defraud,” it escalates into a Class E felony. Each visit to a non-compliant site counts as a separate offense, with penalties capped at $10,000 per day. There’s no option for private lawsuits; this is the state’s show.

For businesses, the message is simple but unsettling: if you might fall under the rule, read the fine print, understand the liability, and protect yourself. The consequences aren’t theoretical—they’re baked in. And as laws like this multiply, compliance is becoming less about checking a box and more about navigating a moving target with stakes that touch real people and their privacy.

Because once the government decides how adults must prove their age online, the question stops being, Can you follow the rules?

It becomes, What do those rules change about the way we experience the internet at all?


FSC Unveils Updated Toolkit to Help Sites Navigate Age-Verification Laws


Earlier this year, a toolkit dropped from the Free Speech Coalition that was supposed to help adult websites navigate the chaos of U.S. age verification laws. On paper, it was about compliance. In reality, it spoke to something bigger—how to follow the law without sacrificing privacy, free expression, or basic human dignity in the process. The updated version arrives after months of legal whiplash and real-world testing, refined by feedback from the people actually living with these requirements. It’s not just a rulebook; it’s a survival guide for an industry being legislated into a corner.

And honestly, it couldn’t have come at a better time.

Laws regulating sexual content online aren’t slowing down. They’re spreading. States are experimenting with different enforcement mechanisms like they’re swapping cocktail recipes—ID uploads here, age-estimation scans there, endless demands for personal data everywhere. What counts as compliance in one state can trigger fines in another. Platforms are stuck either bending to every new rule or blocking entire populations just to avoid liability.

Some people call that safety. Others see it as the invention of a digital checkpoint system where adulthood must be proven over and over again.

The updated toolkit tries to offer a middle path: protect minors without building a surveillance state. That means emphasizing privacy-preserving verification methods, data minimization, and safeguards against turning porn sites into honeypots for identity theft. When your sexual curiosity can be cross-referenced with a government database, it’s not hard to imagine how badly that could go.

But this isn’t just about porn. It’s about how much of yourself you should have to reveal simply to access legal content. If a state can require ID to watch an adult video, why couldn’t it do the same for BDSM forums, queer education sites, or reproductive health information? The slope may not be slippery—it might already be greased.

There’s also the uncomfortable truth that “protecting kids” has become a political Swiss Army knife. Behind the moral language are groups who openly want to make adult content inaccessible altogether, not just to minors. Age verification becomes the first domino rather than the final safeguard. When lawmakers start treating porn the way others treat fentanyl, it’s worth asking who gets to define harm — and who gets punished in the process.

Meanwhile, the people enforcing these laws rarely understand how the internet works. The burden falls on smaller platforms, independent creators, and marginalized workers who already operate under scrutiny. Sex workers were dealing with censorship long before age-verification laws existed. Now, they’re being folded into legislation written by people who’ve never considered how someone pays rent by selling a video clip.

The irony? The more governments tighten restrictions, the faster users migrate to unregulated foreign sites where consent and safety checks don’t exist at all. The “protection” ends up exposing people to worse content, not preventing it.

If lawmakers truly cared about reducing harm, they would fund education, promote ethical production standards, and support platforms that actually moderate content responsibly. Instead, the system encourages the exact opposite: drive traffic to the shadows, then blame the shadows for being dark.

The toolkit is trying to hold the line—compliance without capitulation. It’s a reminder that safety and privacy don’t have to be adversaries. They can coexist, but only if laws are written by people who understand what’s at stake for users and creators.

Because asking adults to prove who they are before they can access legal sexual content isn’t just a technical requirement. It’s a worldview. One where the state sits in the bedroom doorway holding a clipboard, deciding who gets to come inside.

And once that door closes, it rarely opens back up.
