Commentary

Alabama and North Carolina Laws Spark Bans on Creators and Content Across Adult Platforms

Censorship

It started with a post on X that felt less like an announcement and more like a warning shot. Krystal Davis shared that one of her platforms would no longer accept adult content tied to Alabama or North Carolina—and suddenly a lot of creators across the U.S. were scrambling to figure out what this meant for them.

Some Adult Platforms Are Banning Content and Creators From Alabama and North Carolina

The notice laid out the new rules in blunt terms. The platform will reject:

Any productions shot in Alabama or North Carolina.

Any productions featuring talent who legally reside in those states.

Any productions featuring talent whose ID documents were issued by those states.

And it’s not just some vague future plan. The policy is attached to specific launch dates:

Alabama: applies to content shot on or after October 1, 2024.

North Carolina: applies to content shot on or after December 1, 2025.

Why Adult Platforms Are Banning Content From Alabama and North Carolina

Krystal Davis said her notice came from Adult Empire, and another creator reported getting a similar notice from Adult Time. It wouldn’t be surprising if more platforms quietly follow the same route. In a way, it feels like another one of those “small changes” that’ll end up reshaping the industry before anyone has time to react.

But why this move? Why now?

Both states recently passed sweeping laws regulating adult content online—laws that carry enough legal risk that platforms appear to be choosing exclusion over compliance. Instead of building new legal infrastructure, they’re just geoblocking the problem.

So let’s unpack the laws behind the panic.

Alabama’s HB164: A Strict Age-Verification and Consent Law With Heavy Penalties


HB164 went into effect October 1, 2024, packaged as “consumer protection.” On paper, it reads like safety policy. In practice, it puts massive responsibility on platforms hosting adult content.

1. Mandatory Age Verification for All Adult Sites

Any commercial entity that “knowingly and intentionally publishes or distributes sexual material harmful to minors” must verify users are 18+ using a “reasonable age-verification method.”

And those verification services? They must be designed so they can’t retain user data.

If platforms screw up, they’re exposed to:

Civil lawsuits

Up to $10,000 per violation

Penalties under deceptive trade laws

2. Strict Written-Consent Requirements for All “Private Images”

Before publishing any “private image,” platforms need written, notarized consent from every person depicted—and those records have to be stored for five years.

3. Mandatory Warning Labels on Every Page

Not subtle ones either. We’re talking big, government-scripted warnings like:

“Pornography is potentially biologically addictive…”

“Pornography increases the demand for prostitution, child exploitation, and child pornography.”

4. A 10% Tax on Pornography Produced or Sold in Alabama

Section 10 slaps a 10% gross-receipts tax on memberships, subscriptions, and any material produced or sold in the state.

Why Platforms Are Responding by Blocking Alabama

If you’re a platform, you’re staring at:

High legal liability

Restrictions on data handling

Constant compliance demands

A tax on any content tied to the state

And lawsuit exposure for every alleged violation

At some point, it stops being a legal puzzle and starts being a cost-benefit analysis. And Alabama isn’t worth the math.

North Carolina’s HB805: Extremely Broad “Pornographic Image” Verification Rules


HB805 takes effect December 1, 2025, for adult-content provisions, and while the bill covers everything from school libraries to “biological sex” definitions, the part that matters to creators is Article 51A.

This isn’t just strict; it’s procedural overkill.

1. Age and Consent Documentation for Every Person in Every Pornographic Image

Before publishing a pornographic image, platforms must verify:

The person was at least 18 at the time of creation

Written consent exists for each sex act performed

Written consent exists specifically for distribution

And, crucially: consent for performance does not equal consent to distribute

Platforms must collect:

A full consent form with personal details

A matching government ID

2. Mandatory Removal System With 72-Hour Deadlines

If a performer requests removal, platforms must comply within 72 hours—even if consent was properly documented.

If consent is questioned, content must be pulled down temporarily.

Re-uploads? Permanently banned.

3. Massive Civil Penalties

The Attorney General can impose:

Up to $10,000 per day per image for failure to remove

Up to $5,000 per day for publishing violations

Performers can also sue for $10,000 per day per image.

Why Platforms Are Banning North Carolina Content

HB805 basically forces platforms to:

Re-document performers from NC

Handle disputes more aggressively

Maintain permanent blocks on re-uploads

Maintain 1:1 traceable consent for every act in every piece of content

That’s not a tweak—it’s an entirely new compliance department.

You may also notice the bans include things like:

Talent living in those states

Talent whose IDs originate from those states

Content filmed in those states

This is because the laws follow the people and the production location—not just where the content is uploaded. That means:

An NC resident filmed in Las Vegas? Still a risk.

A performer who moved out of Alabama but still has an AL ID? Risk.

A scene shot in Alabama and uploaded from New York? Still covered.

The jurisdiction sticks like glue.

Adult platforms aren’t banning performers because they suddenly want to. They’re doing it because Alabama and North Carolina have created legal terrains where one clerical oversight could turn into six-figure penalties.

Alabama’s HB164 demands notarized consent, strict age verification, no data retention, warning labels, and a 10% tax.

North Carolina’s HB805 requires different consents for each act, ID verification, rapid takedowns, and crushing per-day fines.

Faced with that, some companies are choosing the path of least resistance: eliminating content tied to those states entirely. Will others follow? Probably. Not because they want to—because compliance costs more than creators do.

The laws don’t just restrict porn; they quietly redraw who gets to participate in the industry at all.


Porn and Politicians: Still Reliable Clickbait by Stan Q. Brick


And here I thought being outraged by the things politicians say and do had become passé. Apparently not, if your transgressions include (please, those with delicate sensibilities, cover your eyes) following OnlyFans models and escorts on social media.

“‘Devout Christian’ Dem caught following prostitutes, OnlyFans models on social media,” proclaimed the New York Post, which, as a publication that backs Donald Trump, clearly demands a higher standard of social media decorum than following an OnlyFans model. As I’m sure the discerning editors of the Post would tell him, James Talarico should stick to more family-friendly online activities, like sharing videos of well-informed patriots who can inform us of important, well-established facts – like that Osama bin Laden is still alive.

Naturally, Talarico’s publicity flacks had to deny any meaningful personal interest in the eyebrow-raising follows on the part of the good would-be Senator. After his timeline became news, they quickly fired off a statement explaining that the campaign’s “social media team – including James – follows back and engages with supporters who have large followings and does not investigate their backgrounds.”

To be fair, that explanation is plausible enough, so far as these things go. I can’t help but wonder, though; is part of the reason some of our elected officials are so inclined to support laws restricting and regulating all manner of sex-related things a need to distance themselves from their own desires? Or maybe more to the point, a need to be seen as standing against certain “immoral behaviors,” regardless of whether they truly are against those behaviors?

If you’re James Talarico, I suppose you must put out a statement like the semi-denial offered by his team. Your name isn’t Donald Trump, so you can’t simply say “Those social media follows were planted on my timeline by the Deep State” and expect half the country to believe you.

And I suppose if you’re the New York Post, you can’t go around taking the position it doesn’t matter if some Democrat from Texas likes and/or follows an OnlyFans model, just because the guy you endorsed for President paid hush money to a porn star. The same can be said for the rest of the media that seized on the story; they all have bills to pay, and sex-related scandals are reliable eyeball magnets.

But would the world (or the truth) truly suffer if we were to give a story like Talarico’s little social media snafu the sort of mundane headline it arguably deserves? Would people miss the momentary rush of self-righteous glee that accompanies such a story if we crowned it with something like: “Semi-Famous Texan Likes Pictures of Attractive Women”? Or how about “Guy Who Believes in God Sometimes Also Thinks About Sex”? Or perhaps “Stunner: Would-Be Senator Has Actual Blood in Veins”?

Either way, maybe James Talarico should look at the bright side: at least he isn’t a politician in the UK who was, say, following someone who made porn that depicts strangulation, or he could have much bigger problems on his hands.


Open The Age Verification Floodgates!

Ben Suroeste opines on a new age verification service. Here’s a summary:

Brady Mills Agency just rolled out AgeWallet, their shiny new age-verification system they say will help small adult-sector merchants survive the avalanche of new compliance rules. They’re leaning hard on the idea that the Supreme Court’s FSC v. Paxton ruling kicked the industry into fast-forward, and honestly, they’re not wrong—tools like this are going to keep popping up like mushrooms after rain.

What stands out, though, is the quieter warning baked into the announcement: the beginning of a serious crackdown on VPNs. AgeWallet claims it can detect proxies, masked locations, all of it—and force users to re-verify if anything looks off. Great for merchants trying to stay legal. Not so great if you’re living under a government that already decides what you’re allowed to read or watch. For those of us who remember the scrappy, boundary-free internet, this feels less like progress and more like another brick in the wall.



Safe Bet: Soon, They’ll Try to Ban VPN Use by Stan Q. Brick


Over on Forbes.com right now, there’s an article making the point that when you read somewhere that traffic from the UK to Pornhub is down 77%, you might want to take that figure with a grain of salt. Or maybe a pillar of the stuff.

Writing for Forbes, Zak Doffman goes further still, suggesting “you can completely ignore” such a claim because “it’s not true.”

“What’s actually happening is that U.K. adults are turning to VPNs to mask their locations,” Doffman writes. “Just as residents of U.S. states affecting bans now pretend to be someplace else. Pornhub makes this as easy as possible.”

The article goes on to cite (perhaps accurately – I’m certainly no expert on VPNs) a variety of reasons why this sudden expansion in VPN use may not be a good thing, including the eye-catching assertion that “VPNs are dangerous.”

“You are trusting all your content to a third-party provider who can see where you are and the websites you visit,” Doffman writes. “At a minimum. There are plenty of reports of rogue VPNs doing much worse than that. In particular, you must avoid free VPNs and Chinese VPNs. Stick to bluechip options.”

Doffman is probably right, and his advice on sticking to the name-brand VPNs probably makes good sense. But as a guy who misses the era of what people call the “open internet,” my concern isn’t so much rogue VPN operators as it is rogue legislators.

As I read Doffman’s piece, I couldn’t help but imagine some elected official somewhere reading the same piece and saying to himself/herself: “OH. MY. GOD. This VPN thing MUST be stopped, whatever it is.” The manner of legislation that follows this sort of epiphany typically tries to solve one problem by creating another. Or maybe several others.

The thing is, it’s not Doffman’s warning about the potential dangers of VPN use that will drive the concern of my hypothetical legislator – not the potential security threat, not the nefarious actors out there offering free VPNs.

No, what will get the legislators all fired up and ready to wield their pens again will be the part about the ease of using VPNs to get around their precious, legally mandated age verification walls.

I don’t expect too many legislators will seek to ban VPN use altogether, although doubtless there will be some bright bulb somewhere who proposes exactly that. More likely, what they’ll do is add something to an existing age verification statute that prohibits adult sites from “facilitating the use of technology to circumvent state law,” or that mandates adult sites do what a lot of paywalled sites already do for their own reasons: try to detect and defeat VPN use.

As Doffman notes, websites can “look at your browser settings or cellular settings or recognize you from previous visits…. That’s why it’s harder to watch live sports from your usual provider when you’re away from home, their market restrictions try to catch you out. Porn sites do not.”
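For readers curious what that kind of check actually involves, here is a minimal, purely illustrative sketch of one common heuristic: comparing the timezone the browser reports against the timezone implied by the visitor’s IP address. The IP addresses and lookup table below are invented stand-ins; real sites rely on commercial geo-IP databases and many more signals.

```python
# A naive location-mismatch heuristic of the kind described above.
# Hypothetical IP-to-timezone table standing in for a real geo-IP database;
# the addresses come from the reserved documentation ranges.
IP_TIMEZONES = {
    "203.0.113.7": "Europe/London",
    "198.51.100.9": "America/New_York",
}

def looks_masked(client_ip: str, browser_timezone: str) -> bool:
    """Flag a session when the geo-IP timezone disagrees with the
    timezone the visitor's browser reports."""
    ip_tz = IP_TIMEZONES.get(client_ip)
    if ip_tz is None:
        return False  # unknown IP: no evidence either way
    return ip_tz != browser_timezone

# A "London" IP paired with a New York browser timezone looks suspicious.
print(looks_masked("203.0.113.7", "America/New_York"))  # True
```

A mismatch alone proves nothing – travelers trip this check constantly – which is exactly why sites that care combine it with cookies, device fingerprints and commercial proxy-detection lists.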

For the sake of adults in the UK and elsewhere who would rather not hand over their sensitive personal information to a third party just to exercise their right to look at sexually explicit images, here’s hoping porn sites aren’t soon forced to do what they’re currently choosing not to do.


Mandatory Age Verification Is Creating a New Security Crisis by John Johnson – Cybersecurity Expert


There’s a quiet rule that’s floated around cybersecurity circles for years: don’t hold onto more data than you’re capable of protecting. Simple, elegant, almost parental in its logic — if you can’t safeguard it, don’t collect it.

But the world doesn’t care about that rule anymore.

Laws around identity and age verification are spreading fast, and they’re forcing companies—whether they’re ready or not—to gather and store the most intimate, high-risk documents a person can hand over. Passports. Driver’s licenses. National IDs. All the things you’d rather keep in your own pocket, not scattered across the servers of whoever happens to run the website you’re using.

And then something like the Discord breach happens.

In early October 2025, news broke of a data breach involving Discord. Not Discord’s internal systems—one of the third-party partners handling its customer support. Hackers got access to support-ticket data: names, emails, IP addresses, billing info, conversation logs… the usual mess. But tucked inside that mess was something far more sensitive: government-issued IDs.

These were collected for one reason: to prove a user was old enough to be there. To appeal an underage ban. And suddenly, the private documents people reluctantly handed over “just to get their account back” were sitting in someone else’s hands entirely.

The Trap These Laws Create

Discord didn’t wake up one day deciding it wanted a folder full of driver’s licenses. Companies aren’t hungry for that kind of liability. But regulators have been ramping up age-verification mandates, and the penalties for non-compliance are steep enough to make anyone comply.

You can see the logic in the laws. Protect kids. Keep platforms accountable. Reasonable goals.

But look closely at the side effects:

We’ve built a system where organizations must stockpile some of the most breach-sensitive personal data in existence — even when they have no business storing it, no infrastructure built to protect it, and no desire to be holding it at all.

The old rule of “collect as little as possible” dies the moment a legal mandate requires collecting everything.

One Breach Becomes Everyone’s Problem

And once a company becomes responsible for storing IDs, the risk spreads. Healthcare portals, schools, banks, e-commerce shops, SaaS platforms — anyone providing service to the general public could end up in the same situation.

Every new database of passport scans is a future headline waiting to happen.

And when it happens, the fallout isn’t just personal. It’s financial. Legal. Reputational. You lose customer trust once — and you don’t get it back.

For small companies, one breach can simply end the business.

The MSPs Get Pulled Into the Storm

Managed service providers—MSPs—don’t get to sit this one out. They inherit the problem from every client they support. One MSP breach doesn’t just hit one organization. It hits all of them at the same time.

And the typical MSP environment? It’s a patchwork quilt of tools stitched together over time:

  • One for backups

  • One for endpoint protection

  • Another for vulnerability scanning

  • A different one for patching

  • Another for monitoring

  • And maybe one more to try and tie it all together

Every tool is another doorway. Another password. Another integration that can fail silently. Another shadow corner where data can slip unencrypted or unmonitored.

In an age when MSPs are being asked to guard government IDs, medical files, financial records, and entire networks—you can’t afford those shadows.

The Fix Isn’t “More Tools” — It’s Fewer

The only real path forward is simplification.

Not by removing security controls, but by merging them. Consolidation. Native integration. One platform where backup, protection, monitoring, and recovery exist inside the same ecosystem, speaking the same language, managed from the same place.

When everything runs through a single agent with one control plane:

  • There are fewer gaps.

  • There are fewer weak handoffs.

  • There are fewer places for attackers to slip in unnoticed.

  • And the attack surface shrinks dramatically.

You trade chaos for clarity.

You trade complexity for protection.

The New Reality

That old cybersecurity rule—don’t collect more data than you can protect—wasn’t wrong. It’s just not optional anymore.

The Discord breach isn’t a one-off story. It’s a preview. A warning shot.

Organizations are being legally pushed into storing the exact type of data that attracts attackers the most. And MSPs are being put in charge of securing it at scale.

So the question shifts:

If you no longer get to choose how much data you collect…

you have to be very deliberate about how you protect it.

And that means rethinking the entire structure of how we secure systems—not by addition, but by alignment.

Because now the stakes aren’t abstract. They are literal: your identity, my identity, everyone’s identity.

And someone is always watching for the first loose thread.


How to Stay Legally Protected When Policies Get Outdated

Adult Attorney Corey Silverstein talks about how to stay legally protected as an adult website owner. Here’s a summary of the article:

It feels like the adult industry just hit a hard reset. Age verification laws are no longer theoretical — they’re real, enforced, and expensive to ignore. And because every region wants something slightly different, the once-standard “one policy fits everywhere” approach is basically dead. If a site can’t explain exactly how it keeps minors out, it’s already behind.

At the same time, regulators and payment processors are demanding proof that every bit of content is consensual and monitored. The Aylo case didn’t accuse anyone of new wrongdoing, but it sent a clear message: it’s not enough to say you have safeguards — you need documentation, systems, records, and the ability to show them working. Old blanket model releases aren’t enough anymore. Consent now has to be specific, traceable, and ongoing.

And hanging over all of this is data privacy — the silent one that can shut a company down overnight. GDPR and CPRA require clear deletion rights, consent controls, and minimal data collection. Most adult sites still haven’t updated. The takeaway is simple: the old shortcuts are now risks. The companies that survive will be the ones who update before they’re forced to — not after.


Why is Ofcom trying to censor Americans?

Spiked’s Adam Edwards opines on the Online Safety Act in the UK.

The story centers on U.S. lawyer Preston Byrne, who represents the message board 4chan and is openly defying the UK’s Online Safety Act. When the UK regulator Ofcom issued 4chan a £20,000 fine, Byrne publicly mocked the demand and argued that British law has no legal power over companies and citizens who have no operations or assets in the UK. He views the Online Safety Act as an overreaching censorship regime and says Ofcom is trying to enforce rules outside its jurisdiction by sending threatening letters instead of going through proper international legal channels.

The Online Safety Act requires any online platform accessed by UK users—regardless of where the company is based—to submit risk assessments, reports, and censorship plans, under threat of fines or even jail for executives. While 4chan has refused to comply and likely faces no real consequences because it has no UK presence, larger American companies like Meta and Google do have substantial assets in Britain, making potential enforcement far more serious. This has sparked broader questions about sovereignty, free speech, and whether a foreign government can compel U.S. companies to restrict or monitor content.

To counter the UK’s moves, Byrne has launched both a legal challenge in U.S. federal court and proposed new U.S. legislation called the GRANITE Act, which would allow American companies to sue foreign regulators like Ofcom if they attempt to impose fines or censorship demands. If passed, it could effectively block foreign censorship attempts and even allow U.S. courts to seize foreign government assets in retaliation. Byrne argues that if the UK cannot force U.S. firms to comply, British lawmakers may eventually be forced to reconsider the Online Safety Act altogether.


Quick Look: The Status of Age Verification Laws Across the U.S. by Morley Safeword


As you’re likely aware, since you’re reading this site, in recent years there’s been a proliferation of new state laws across the United States that require adult websites to verify the age of users before displaying any content that may be deemed “harmful to minors” to those users.

After the recent Supreme Court decision in Free Speech Coalition v. Paxton, in which the court upheld the age verification mandate passed by the Texas legislature, similar laws in other states are now clearly enforceable. With Missouri’s law poised to take effect later this month, it’s a good time to remind ourselves of the states that have (and haven’t, yet) passed similar laws.

The states with active age verification mandates in place include Alabama, Arizona, Arkansas, Florida, Georgia, Idaho, Indiana, Kansas, Kentucky, Louisiana, Mississippi, Montana, Nebraska, North Carolina, North Dakota, Ohio, Oklahoma, South Carolina, South Dakota, Tennessee, Texas, Utah, Virginia and Wyoming. And as mentioned earlier, Missouri will soon join this list.

Quite a few states have not yet passed age verification laws, at least to date. Those states include Alaska, California, Colorado, Connecticut, Delaware, Hawaii, Illinois, Iowa, Maine, Maryland, Massachusetts, Michigan, Minnesota, Nevada, New Hampshire, New Jersey, New Mexico, New York, Oregon, Pennsylvania, Vermont, Washington, West Virginia and Wisconsin.

Several of the states on the list of those that haven’t passed age verification laws have considered such proposals in the past and may do so again in the future, of course. Doubtlessly, there are at least some legislators in every state who favor the measures and are likely to introduce new bills at some point in the future.

States that haven’t passed age verification laws but have debated them at some point include Colorado, Hawaii, Illinois, Iowa, Maryland, Michigan, Minnesota, Nevada, New Mexico, New York, Oregon and West Virginia.

For much more information on age verification laws around the country – and to keep track of new bills that would establish them in additional states – check out the Age Verification section of the Free Speech Coalition website.


Welcome to the Dumbed-Down Internet Era by Stan Q. Brick


I was browsing the membership area of an adult site earlier this week, having “verified my age” during a previous visit, when I came across a curious scene. Halfway down the main page of the membership area was a row of banner ads for other sites, a row of ads I’ve scrolled past so many times the messages on them hardly register anymore.

But on this day, the look of this section was quite different than before. Instead of ads for other porn sites, two of the six ads were displaying messages telling me that they couldn’t show the content of the ads, due to the age-verification laws now in effect in my home state.

This was bizarre, frankly. It was a little like being asked to show my ID at the front door of a nightclub to gain entry, then having to show it again when I reached the bar – only instead of showing it to the bartender, I’d probably need to show it to the beer distributor.

Compliance by adult sites with the age verification law my home state has passed has been very inconsistent thus far. One thing I’ve noticed is that the more prominent and high-profile the brand, the more likely it is the company is either requiring its users to go through the age verification process or outright blocking traffic from the state.

The converse also appears to be true; the lesser known (and less likely to be legitimate) the adult site, the less likely it is to be complying with the state age verification laws proliferating around the United States.

Put another way, state governments are unintentionally (one hopes it’s unintentional, at least) funneling traffic to adult sites that are on the more questionable end of the legal spectrum, whether the laws being flouted are age verification requirements, intellectual property laws, revenge porn laws or all of the above.

Meanwhile, for the adults among us who don’t find their porn by blindly browsing whatever free porn site crosses our path, the age verification requirements are a repeated inconvenience and irritation as we merely try to make the most of subscriptions that were active before these laws were even cooked up.

Look, I’m not against age verification. I don’t mind the idea of making people show ID to access porn at all. That’s how things have worked in the offline world for ages, after all. What I’m against is the reality of how age verification is being handled.

What these age verification laws have handed us is a dumbed-down internet, one where in the interest of (ineffectively) “protecting children,” everyone is being treated like a child – at least where porn is concerned. If what you’re after is extreme violence or hate speech, there’s no age verification barrier to worry about, because apparently that sort of content doesn’t harm kids at all.

This special focus on porn might not last, though. And I wish the reason for the change was that legislatures around the country are going to come to their senses and stop trying to tame the internet on the sort of vain quest that even Don fucking Quixote would know to be utter folly.

Instead, what you can expect are more laws like the “Texas App Store Accountability Act,” which is currently being challenged in court by students who think maybe it’s not reasonable to require them to get permission from their parents before they download any app.

“Texas has passed a law presumptively banning teenagers – and restricting everyone else – from accessing vast online libraries of fully protected speech,” the complaint argues.

Sounds familiar, eh?

Even if you believe age verification laws for porn sites are a good idea, do you really want to see them spread out and cover everything online that might potentially be bad for kids to access?

Give that one some real thought before you answer. Unfortunately, that’s something our elected representatives are unlikely to do.


The Big Chill: If You Can Get People to Self-Censor, You Don’t Need a Ban by Morley Safeword


A couple years back, when many of the state laws requiring age verification on the part of adult websites were merely proposals floating around in state legislatures, a state legislator from one of the few states that already had such a law on its books publicly noted that Pornhub had begun blocking all traffic from the state, which meant the law was “already working” as intended.

I remember seeing a clip online of the legislator saying this and wondering if she knew just how right she was. The law she was talking about may have been presented as a measure to deter kids from accessing online porn, but in truth, it was about making it harder for anyone to access online porn.

While you might not realize it if you were to review some of the blatantly unconstitutional stuff they dream up, most state legislators do know they can’t simply ban speech they don’t like. They know this is true whether the speech they dislike is sexually explicit material like porn, speech intended to “annoy” or “offend” people who share their sensibilities or speech from Jimmy Kimmel.

Legislators, censorious activists and other vile creatures of darkness also know the next best thing to a ban is an effective effort to cow people into silence, whether by criminal law or civil sanction.

When attorneys and scholars who are familiar with the First Amendment and free speech issues discuss these things, you’ll often hear them reference the “chilling effect,” which in the context of legislation refers to “government unduly deterring free speech and association rights through laws, regulations or actions that appear to target activities protected by the First Amendment,” as the Free Speech Center at MTSU puts it.

When the Supreme Court recently upheld the Texas law requiring websites that offer over a certain amount of content deemed to be “harmful to minors” to verify the age of users before displaying any such content, the court effectively said the chilling effect of the Texas law is insubstantial, easily outweighed by the government’s “compelling interest” in deterring minors from accessing pornography.

A casual observer might think: “Good! Why shouldn’t adult sites be required to do the same thing the store on the corner has to do before selling someone porn?” But the casual observer maybe hasn’t thought this one all the way through.

The casual observer probably hasn’t considered the difference between briefly flashing your ID at a bored store clerk who likely didn’t give it much of a look anyway before waving you onward, and uploading a copy of your government-issued ID, with your home address on it, to some third-party age verification service about which you know nothing.

The casual observer also may not have thought much about the websites that aren’t even arguably “porn sites,” but could still host enough content deemed “harmful to minors” to be covered by the law.

Worse still, this chilling effect is attached to a measure that isn’t particularly effective at its stated purpose. When Louisiana’s law was first passed, it took a matter of minutes for users to find a workaround – one that didn’t even require knowing what the acronym “VPN” signifies.

As a beloved old teacher of mine used to say about things of dubious value, these laws appear to be “worth their weight in sawdust” – except when it comes to their chilling effect.

As Hannah Wohl, an associate professor of Sociology at the University of California, Santa Barbara, put it in a post for The Hill, age verification laws “fail to protect minors and threaten the free speech of all Americans in ways that go far beyond pornography.”

“This chilling effect is by design,” Wohl added. “Pornography has long been the canary in the coal mine for other restrictions against free speech. Indeed, some conservatives have acknowledged that age verification laws are a back door to fully criminalizing pornography.”

No, age verification laws are not “blocks” or “bans” on pornography. But they certainly are barriers – and we’re not supposed to go around erecting barriers to protected speech in America, willy-nilly. There’s supposed to be a damn good reason (that “compelling interest” of the government’s), some indication that the law serves its purpose in furthering that interest, and an assurance that, in restricting minors’ access to speech, the law doesn’t burden too much speech that’s legal for adults to consume.

Or, as Vera Eidelman, senior staff attorney with the ACLU Speech, Privacy and Technology Project, put it when talking about the Supreme Court’s decision on Free Speech Coalition v. Paxton: “With this decision, the court has carved out an unprincipled pornography exception to the First Amendment. The Constitution should protect adults’ rights to access information about sex online, even if the government thinks it is too inappropriate for children to see.”

Unfortunately, to many of those who support and advocate for these laws, the chilling effect bemoaned by their critics is a design feature, not a bug.
