Commentary

Age-verification is hurting sex educators and sex workers, studies suggest

Experts have warned that age verification laws could affect access to online content and related work. Preliminary research suggests those effects may be occurring.

Age verification laws vary by state and country, but generally require individuals to provide proof of age, such as a facial scan or government-issued identification, to access certain content. Since 2022, several U.S. states have implemented these laws. Other countries, including the UK, have introduced similar requirements under legislation such as the Online Safety Act.

The sexual freedom nonprofit Woodhull Freedom Foundation reported that nearly one in five sex educators (18 percent) say these laws have already affected their work. Among sex educators working in states with age verification requirements, one in three (33 percent) reported an impact.

Approximately 60 respondents completed the survey last month. While the sample is small, the findings point to early effects of age verification measures.

“Age-verification laws are already impacting sex education in the U.S.,” said Woodhull president and CEO Ricci Joy Levy in a press release.

The majority of surveyed sex educators, 73 percent, said they are concerned these laws will affect their work. Additionally, 76 percent said they are concerned the laws could be used to restrict access to sex education and related resources. According to Boston University, 37 percent of U.S. states require school sex education to be medically accurate.

“Again and again, we were told this was only about keeping minors from accessing porn,” Levy said. “Woodhull warned these vague and overly broad policies would also result in censorship of vital, non-explicit information about sex and gender, and the data bear this out. The current age-verification protocols are ripe for abuse, and educators are right to be scared.”

Separate research from the adult industry research firm SWR Data reported similar findings regarding adult content creators. Nearly half, 45.2 percent, of 500 respondents surveyed last fall said their income from adult work decreased over the past year. Nearly two-thirds, 63 percent, said it became more difficult to earn money during that period.

There are several possible explanations for these trends, including broader economic conditions. However, 98 percent of creators who reported decreased income said they experienced challenges related to what they described as the “War on Porn.”

The term “War on Porn” is used to describe a range of efforts to limit or remove adult content online, including age verification laws. Project 2025, a policy proposal associated with President Donald Trump’s second term, includes provisions calling for a ban on pornography and legal penalties for creators. In 2024, Russell Vought, a co-author of Project 2025 and director of the Office of Management and Budget, described age verification as a “back door” to a broader ban.

Most surveyed creators who reported income loss also cited increased social media restrictions and limitations on the content they can sell. Some also reported that users have had difficulty accessing their content.

Difficulties with access were reported particularly among creators operating in U.S. and UK markets, according to SWR Data. The research also noted ongoing issues with piracy, suggesting that some users may be bypassing age verification systems.

Two separate studies conducted last year found that age verification laws have not consistently prevented minors from accessing adult content. Factors cited include the use of virtual private networks and access to non-compliant websites. The findings indicate that while age verification measures are in place, their effectiveness and broader impact continue to be evaluated.

Trump and the Pledge: The Shot Not (Yet) Fired in the War on Porn by Stan Q. Brick

It’s hard for me to believe 2016 was only 10 years ago. In the years since, we’ve had a pandemic, a handful of Olympic Games and 1.25 Trump Administrations. We’ve also had a few wars, as well as one “short-term excursion” that bears a strong resemblance to a war, at least to my untrained eye.

In a less literal sense of the word war, we’ve also seen an escalation in the War on Porn, at least with respect to state laws in the United States and international laws and regulations globally.

Interestingly, the one force that hasn’t fired the sort of shot it certainly could squeeze off in the War on Porn has been the federal government. While various bills that would impact the adult entertainment industry are being considered in the House and Senate, what we haven’t seen is an effort on the part of the current Department of Justice to prosecute federal obscenity crimes, other than cases involving Child Sex Abuse Material (“CSAM”).

This brings me back to 2016. That was the year then-presidential candidate Donald Trump signed the “Children’s Internet Safety Presidential Pledge,” an oath authored by the anti-porn activism group Enough is Enough.

Under the terms of the pledge, Trump promised that if he was elected, he would “uphold the rule of law by aggressively enforcing existing federal laws to prevent the sexual exploitation of children online, including the federal obscenity laws,” and appoint an Attorney General “who will make the prosecution of such laws a top priority in my administration and give serious consideration to appointing a Presidential Commission to examine the harmful public health impact of internet pornography on youth, families and the American culture.”

I call out the phrase “including the federal obscenity laws” because it points to obscenity prosecutions involving content created by and for adults, not CSAM. (There are always lots of prosecutions for crimes involving CSAM, as there should be.)

During the first Trump Administration and the first 14 months of the second one, the DOJ has initiated zero obscenity prosecutions of that sort, so far as I’m aware. I can’t speak to whether Trump ever gave “serious consideration” to appointing a Presidential Commission like the one the pledge describes, but no such Commission has been established.

On the one hand, the fact that Trump hasn’t followed through on the pledge he signed is unsurprising. Like every politician, Trump is happy to make promises in one moment and forget all about them in the next. On the other hand, if the current polls are to be believed, he could use a boost – and making a move that appeals to a big segment of his support base might have more appeal than it did previously.

I’m not saying I expect the Trump Administration to suddenly start indicting American pornographers for alleged violation of federal obscenity laws. All I’m saying is it wouldn’t shock me if they did, even if only to bask in the glow of positive publicity coming from the socially conservative ‘Christian Right’ after the press conference at which the prosecutions are announced.

I can almost hear that press conference now:

“These guys we indicted for obscenity are the worst of the worst. They made filthy, disgusting pornography at a level nobody has ever seen before, not even my super smart uncle who taught at MIT and whose great brain was almost as big as mine. And now we’ve totally obliterated their ability to make any more of their sick, demented, often foreign pornography – unless they agree to a deal, which they really should, because we have so much evidence. I’m telling you, we have so much evidence, the jury will be tired of all the evidence we have. And that’s why we have to get rid of paper ballots and mail-in voting.”

I sure hope I’m wrong. Among other things, I’m sick of hearing about his uncle. Except the whole lying about his uncle’s connection to the Unabomber thing. That was awesome.

Being Opposed to Age Verification Measures Doesn’t Mean You Hate Kids by Morley Safeword

Over the years, I’ve had many conversations with friends and peers who, unlike me, do not work in the adult entertainment industry, and who have expressed confusion over why the adult industry opposes things like age verification mandates for porn websites.

Who could be against preventing kids from accessing online porn, right? Isn’t it just sensible and reasonable to impose the same limitations we have in the brick-and-mortar world on the internet, when it comes to accessing pornography?

Frankly, if verifying a person’s age online were as simple and straightforward as checking a patron’s ID as they walk into an adult shop, I doubt many (if any) folks in the adult industry would be against it. The problem is, checking someone’s ID on the internet isn’t remotely the same as doing so in person – and the circumstances requiring a merchant or service provider to check that ID aren’t nearly as straightforward as they are in the physical world, either.

As state level age verification mandates continue to proliferate across the U.S. and the globe, it’s fair to say most legislators are firmly in favor of these measures. And while plenty of people are noting some of the downsides of age verification statutes, including emerging evidence that such regulation tends to drive users to darker, less safe corners of the web, there seems to be no slowing the momentum of the drive to impose age verification in a global fashion.

As governments continue to adopt these rules, it falls to those of us who must live under them to at least try to keep those governments honest, to assure that if they must pass these laws and establish these regulations, they at least do so under a set of clear, easily understood and narrow definitions.

Spoiler alert: Staying within clear, easy to understand definitions typically isn’t a strong suit of governments or legislators.

In critiquing social media age restrictions, which are supported by arguments that closely track those used in favor of age verification mandates for adult sites, the Electronic Frontier Foundation notes that in the UK, the process for deciding which platforms and sites are subject to the rules “is devoid of checks or accountability mechanisms as ministers will not be required to demonstrate specific harms to young people, which essentially unravels years-long efforts by Ofcom to assess online services according to their risks.”

“And given the moment the UK is currently in, such as refusing to protect trans and LGBTQ+ communities and flaming hostile and racist discourses, it is not unlikely that we’ll see ministers start restricting content that they ideologically or morally feel opposed to, rather than because the content is harmful, as established by evidence and assessed pursuant to established human rights principles,” adds Paige Collings for EFF.

Collings adds that we already know from actions taken in other jurisdictions, including the U.S., “that legislation seeking to protect young people typically sweeps up a slew of broadly defined topics.”

“Some block access to websites that contain some ‘sexual material harmful to minors,’ which has historically meant explicit sexual content,” Collings notes. “But some states are now defining the term more broadly so that ‘sexual material harmful to minors’ could encompass anything like sex education; others simply list a variety of vaguely defined harms. In either instance, this bill would enable ministers to target LGBTQ+ content online by pushing this behind an under-18s age gate, and this risk is especially clear given what we already know about platform content policies.”

In other words, these regulations are going to be applied in ways the sponsors of age verification legislation never copped to when crafting or debating the laws. In some cases, the people who wrote the laws may not even have foreseen the problem created by their loose approach to statutory construction.

At this point, we probably can’t stop the forward march of the age verification mandate trend. More of these laws will be written, more will be passed and – given the Supreme Court’s ruling in Free Speech Coalition v. Paxton – more will survive court scrutiny.

What we can do, as citizens of the jurisdictions covered by these age verification mandates, is make our voices heard on the more problematic aspects of these laws. We can contact our legislators, lobby for changes to the laws, lobby for better definitions and support candidates who show a willingness to listen. After all, a law surviving court scrutiny doesn’t mean we have to like it – or that we must stop telling our elected officials we don’t like it.

The bottom line is, porn continues to be popular, whether the people who would like to ban it (or effectively regulate it into a dark corner) like it or not. If people who enjoy adult entertainment are willing to speak up, we may not be able to strike a decisive blow in the War on Porn, but we can at least mitigate the collateral damage.

Beware Opportunists in Superhero Capes by Stan Q. Brick

Some folks who favor suppression of sexually explicit materials are more forthright about what gives life to their censorious zeal than others. Say what you will about the old “Morality in Media” brand, back when the organization went by that moniker, everybody knew where they were coming from just by reading the sign on their door.

Perhaps because the folks at Morality in Media perceived they were limiting their demographic reach with the judgy-sounding, clunky old name, they opted for a rebrand back in 2015, becoming the National Center on Sexual Exploitation. Suddenly, with the flip of a logo, they sounded less like angry Bible thumpers out to cancel your favorite sitcom and more like a serious nongovernmental agency out to prevent real harm.

You know what didn’t change when MIM became NCOSE? The president of the organization. Patrick A. Trueman ran the joint on both sides of the rebrand, from 2010 to 2023. Before that, Trueman was a prosecutor at the U.S. Department of Justice during the administration of George H.W. Bush, which also happens to be the last time federal prosecutors aggressively enforced the nation’s obscenity laws. Trueman remains the President Emeritus of NCOSE to this day.

Just as I doubt Trueman lost his zest for cleaning up American media when his organization rebranded, I don’t buy that a lot of the organizations most strenuously supporting various age verification mandates at the state and federal level are really in it to protect minors from harmful materials online – unless one happens to define “harmful” the same way they do, of course.

Referencing remarks recently made by Rep. Leigh Finke, a transgender member of the Minnesota Legislature who has criticized elements of her state’s proposed age verification law, Rindala Alajaji, Associate Director of State Affairs at the Electronic Frontier Foundation (EFF), and Molly Buckley, one of the organization’s legislative analysts, call attention not only to the impact of the Supreme Court’s ruling in Free Speech Coalition v. Paxton, but the nature of the organizations supporting Texas in the case.

“The Paxton case, and the coalition behind it, illustrates exactly how these laws can be weaponized,” Alajaji and Buckley write. “They weren’t there just to stand up for young people’s privacy online—they were there to argue that the state has a compelling interest in shielding minors from material that, in practice, often includes LGBTQ content. Ultimately, these groups would like to age-gate not just porn sites, but also any content that might discuss sex, sexuality, gender, reproductive health, abortion, and more.”

Alajaji and Buckley add that the “coalition of organizations that filed amicus briefs in support of Texas’s age verification law tells us everything we need to know about the true intentions behind legislating access to information online: censorship, surveillance, and control.”

“After all, if the race to age-gate the internet was purely about child safety, we would expect its strongest supporters to be child-development experts or privacy advocates,” the authors note. “Instead, the loudest advocates are organizations dedicated to policing sexuality, attacking LGBTQ+ folks and reproductive rights, and censoring anything that doesn’t fit within their worldview.”

The thing about appealing to people’s desire to protect children is that it works – and for a good reason. It’s a good thing to want to protect your kids. God knows they need protection, including from themselves. Parents should do all the reasonable, rational, normal things they can do to protect their kids.

But if you’re denying a gay or trans kid access to information from people who have been through the same things that kid is going through and can offer guidance, support and maybe a little solace for the kid, you’re not protecting that kid; you’re stifling, aggravating and alienating that kid. Shit, you might be killing that kid – even if you earnestly believe you’re helping.

I can also understand why the idea of age-gating the internet might sound good to people, especially frightened people who are raising kids who are online much more than their parents. But fear is a state of mind that can make people suggestible – and that’s when opportunists don their superhero capes and make a dramatic entrance, promising to make the world (wide web) a safer, better place for you and your kids—without really mentioning the part about how they’re actually in this to keep The Gays from enacting their Sinister Agenda, or whatever it is that animates some of these zealots.

I guess what I’m saying is this: You can’t save your kid from drowning by throwing someone else’s kid into the deep end of the pool with lead boots on. And some of the people promising to provide your kid a life jacket are heavily invested in lead.

Australia’s Porn Age-Verification Law Sparks Debate Over Safety and Shift to “Darker Corners”

Something changed overnight — not just on adult sites, but in how people moved through the internet itself.

When major porn platforms began blocking Australians from accessing their sites, it didn’t stop there. X also started requiring age checks before users could view adult content. And for some, that meant something far more intrusive: being asked to submit a video selfie just to look at a single post.

“Almost every post on my alt account has a content warning and asks me [for a] selfie for age verification,” one Australian porn consumer, Joe*, said. “It’s maddening.”

Others described pulling back entirely, choosing to walk away rather than comply.

“I’m honestly no longer engaging with any of the sites and platforms I used to use because not only is the verification process really invasive, but some of them even give you the option to sign in with Google … and that’s the last platform I’d trust with any sensitive data,” Jethro said.

“The choices are: link your perversions to your government ID, or submit your face into the AI slop machine,” Chris* said.

It’s still early days. Aside from several Aylo-owned sites like RedTube blocking Australians outright, and Pornhub limiting access to safe-for-work content for users who aren’t logged in, most of the top free adult platforms have yet to fully implement age verification.

Data from the SEO firm Semrush suggests only one site in the country’s top 20 — Thisvid — had complied so far. But with potential fines reaching $49.5 million for violations, more platforms are expected to follow. Users have already begun to react.

Search interest in porn-related terms has climbed to its highest level since pandemic lockdowns ended in 2022. At the same time, searches for virtual private networks — tools that allow users to appear as though they’re browsing from outside Australia — have surged to levels not seen since 2015, when website blocking laws targeting piracy were introduced.

Sex workers say none of this is surprising. For years, they warned that regulations developed between the eSafety commissioner and industry stakeholders could drive users away from regulated spaces and into less controlled environments.

“We’ve already warned that these laws will funnel traffic away from platforms that do have moderation safeguards in place and towards sites that profit from non-consensual and stolen porn, including the unpaid work of sex workers,” said Mish Pony, chief executive of Scarlet Alliance.

“So driving people off mainstream services, such as Pornhub, does not stop porn consumption, it just pushes it into darker corners of the internet. It makes it harder to address real harms.”

Andy Conboi, an OnlyFans creator based in Sydney, said he has already seen the effects firsthand. Engagement on his posts has dropped.

“People don’t really want to send a photo of themselves or their licence or whatever to these platforms, particularly Twitter [X],” he said.

“In the group chats I do have with creators, people are just frustrated and annoyed, their engagement is down [and] it’s much more difficult to put stuff out there and be seen a lot of the time.”

Some creators, he added, are pivoting. They’re shifting toward safe-for-work content on platforms like Instagram and TikTok just to maintain visibility — a move he described as ironic, given the presence of underage users on those services.

For longtime opponents of pornography, however, the changes mark a milestone.

After earlier attempts at internet filtering fell short under previous governments, and opt-out filtering proposals were abandoned before the 2013 election, regulators have gradually expanded their authority over online content. The eSafety commissioner’s role has grown significantly over the past decade.

Advocacy groups that have campaigned for tighter controls welcomed the developments.

“This day was hard fought for,” said Melinda Tankard Reist, movement director for Collective Shout. “Collective Shout and our partners and allies worked hard to bring it to fruition.”

“It is a relief to know proof-of-age protections are now in place as one obstacle in the way of young people being exposed to rape porn, torture porn, incest porn and extreme violence and degradation of women.”

The Australian Christian Lobby also supported the outcome.

“The fact that P*rnhub have ceased operating in Australia is already proof of its effectiveness,” said chief executive Michelle Pearse.

Questions remain about whether those outcomes will hold — or simply shift behavior elsewhere.

Researchers studying similar laws in parts of the United States found that when major sites restricted access, users didn’t necessarily stop searching. They redirected.

“We saw very large substitution effects for search traffic for XVideos, which is the second largest porn website in the states,” said David Lang, a Stanford University researcher and lead author of the report.

“It’s a sufficiently large change that the No 2 site is now the No 1 site in states that passed those laws.”

Tracking VPN use proved more difficult, researchers noted, since users often disappear from local data once they connect through external servers.

For digital rights advocates, the concern isn’t just where people go — it’s what they leave behind.

Tom Sulston, head of policy at Digital Rights Watch, warned that age-verification systems could create centralized pools of highly sensitive personal data.

“It would be absolutely trivial for a criminal to set up porn sites as honeytraps to capture Australians’ identities and sexual interests; and then use that material for blackmail, similar to existing sextortion schemes,” Sulston said.

“Foreign intelligence services looking to trap Australian targets could easily do the same. The age-verification regime puts Australians at greater risk of harm, not less.”

And that’s the uneasy part of it all. The behavior doesn’t disappear — it just moves.

Utah’s Proposed Porn Tax Raises Major Civil Liberties Concerns

SALT LAKE CITY — Utah lawmakers are again stepping into the middle of the long-running debate over how far governments should go when regulating online adult content. This time, the focus is a proposed tax on pornography purchased through digital platforms.

Senate Bill 73, introduced earlier this year by Republican lawmakers in the Utah Legislature, would impose what the bill calls a “material harmful to minors” tax on revenue generated from the sale of online pornography. The rate is currently set at 2 percent, after originally being proposed at 7 percent.

After several amendments, the measure passed the state Senate with broad support and now awaits further consideration in the House of Representatives. If approved there, it would head to the desk of Gov. Spencer Cox, who has publicly supported policies aimed at restricting access to online pornography.

The legislation was introduced by Republican state Sen. Calvin R. Musselman and state Rep. Steve Eliason, both of whom have supported previous efforts in Utah to regulate online adult content.

Under the proposal, revenue generated by the tax would be directed toward several state programs. The bill specifies that funds could be used to support enforcement efforts tied to Utah’s existing age verification laws for social media and adult websites, among other regulatory initiatives.

During the legislative process, lawmakers added language addressing virtual private networks (VPNs) and similar technologies used to bypass location-based restrictions. The revised bill would make it illegal to intentionally circumvent content blocks implemented by platforms as part of age verification compliance, with violations subject to civil penalties.

The measure also includes provisions aimed at limiting how websites communicate with users in Utah about these tools. Specifically, the bill states that platforms covered by age verification requirements may not provide instructions or guidance that would allow users to bypass those restrictions.

The current version of Senate Bill 73 states:

“A commercial entity that operates a website that contains a substantial portion of material harmful to minors may not facilitate or encourage the use of a virtual private network, proxy server, or other means to circumvent age verification requirements, including by providing: (a) instructions on how to use a virtual private network or proxy server to access the website; or (b) means for individuals in this state to circumvent geofencing or blocking.”

Measures regulating adult content have appeared in several states in recent years. Alabama, for example, enacted legislation that imposes a 10 percent tax on pornography-related revenue generated within the state, alongside additional legal requirements for adult performers involving notarized consent documentation.

Utah’s proposal does not include those record-keeping provisions, but it does expand the scope of enforcement mechanisms connected to age verification and online access controls.

The tax itself would function similarly to what policymakers often describe as a “sin tax,” a type of levy commonly applied to products such as alcohol, tobacco and gambling. In this case, the tax would apply to companies that generate revenue from online adult content through methods including clip sales, subscriptions and fan-based platforms.

Under the proposal, entities meeting the bill’s definition of “covered entities” would calculate the portion of revenue generated from Utah-based users and pay the 2 percent tax to the state on an annual basis.
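
The per-entity arithmetic described above is simple. As a minimal sketch — assuming a flat 2 percent rate applied only to Utah-attributed revenue, with function and field names that are illustrative rather than drawn from the bill text:

```python
# Hypothetical sketch of the SB 73 calculation: a covered entity attributes
# its annual revenue by state and remits a flat 2 percent on the Utah-based
# portion. Names here are illustrative, not taken from the bill.
def utah_tax_owed(revenue_by_state: dict[str, float], rate: float = 0.02) -> float:
    """Return the annual tax owed on Utah-attributed revenue."""
    utah_revenue = revenue_by_state.get("UT", 0.0)
    return round(utah_revenue * rate, 2)

# Example: $250,000 attributed to Utah users out of $1.25M total revenue.
print(utah_tax_owed({"UT": 250_000.00, "CA": 1_000_000.00}))  # 5000.0
```

The compliance burden the article notes for smaller companies lies less in this multiplication than in the attribution step: determining which revenue counts as Utah-based in the first place.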

If the measure becomes law, larger online platforms would likely absorb the additional compliance costs with relative ease. For smaller companies operating in the adult content market, however, the administrative and regulatory requirements could prove more challenging.

The bill’s future now depends on the outcome of deliberations in the Utah House. Should it pass there and receive the governor’s signature, the measure would add Utah to a growing list of states experimenting with new approaches to regulating digital adult content.

Whether the proposal ultimately reshapes how online platforms operate — or instead becomes the subject of courtroom challenges — may become clear only after the legislative process runs its course.

The Web Used to be the “Information Superhighway”; it’s Becoming a Low-Speed School Zone by Stan Q. Brick

Back in the late 90s and early aughts, it was commonplace to hear the internet referred to as “The Information Superhighway,” a term that for many of us connoted not just speed of transfer, but the relatively unfettered regulatory environment surrounding what was then an emerging network for communications and commerce.

Fast forward to 2026 and those heady days of rapid growth and regulatory permissiveness are gone. Some might say “good riddance,” but I can’t help but wonder what we’re losing as we grope for ways to make the web ‘safer’ for a population that arguably shouldn’t be using it at all.

During an adult industry trade event over 20 years ago, an attorney friend of mine posed a good question: If the web is the “information superhighway,” who in their right mind would want to build a playground for children in the median of such a thoroughfare?

The answer, then and now, is: “Far too many people.” Crucially, a significant subset of those people are legislators, national, state and local. And these days, every time you turn around, one of them is sponsoring, writing or endorsing a measure like the Kids Internet and Digital Safety (KIDS) Act, or the Innocence Act, or some manner of tax directed specifically at adult websites.

I can’t speak for the populations of other countries, but here in the U.S., what I’ve noticed over the decades is that many people look to the government to handle jobs they probably ought to be doing themselves – or indeed, jobs that only they can do for themselves.

Look, I get it; it’s hard raising kids. But the difficulty of being a parent is not a new thing – and it certainly isn’t limited to the internet era. When I was a kid, way back in the early 1970s, once I left the immediate vicinity of my parents’ home, they had almost no way of knowing what I was up to – a worrying fact for a lot of parents, especially during times when panics over child abductions and general “stranger danger” were in full swing.

Was it easier for my parents to watch me walk off to catch the school bus back when I couldn’t text to confirm my arrival at school than it is for parents these days to do the same, when their kids have dozens of options for checking in or marking themselves “safe”? I think that’s a tough argument to make.

Yes, largely because of the internet and related technologies, kids today have easier access to things like porn than I did when I was a kid. Guess what? Even in the days when we had to go digging through our fathers’ sock drawers to find porn, we still managed to find it. (Where there’s a hormone-fueled will, there is always a way.)

Of course, the impulse to restrict and regulate access to content deemed unsuitable for kids is a lot older than the internet, too. They seem almost quaint now, but broadcast decency standards have been around for decades. Does anyone believe those standards have prevented kids from hearing “profane language” or being exposed to content that is “patently offensive” but does not rise to the level of being “obscene” under federal law? If so, I have a healthy store of bridges on hand to sell to these poor, credulous souls.

Yes, the internet is filled with problematic content. But if your concern about what kids stumble across online is limited to “obscene” or “indecent” content, then you’re either ignorant of what lurks online, or the nature of your concern says more about you than it does the internet.

One thing about the internet has not changed since the days when it was common to call it the Information Superhighway: It remains an enormous network of independently operated computers, on which virtually anyone can publish virtually anything. Mixed into that ‘everything’ is a long list of things that are potentially “harmful to minors.”

Are sites that promote racial hatred less damaging to minors than pornography? How about sites that disseminate misinformation and disinformation? Are false medical claims something we want kids to be perusing with no guidance or guardrails? How about deepfake videos of a war in progress?

Don’t get me wrong: Not for one minute am I suggesting all those things listed above should be subject to governmental blocking, censorship or over-regulation to prevent their spread. What I’m suggesting – and what I’ve been telling my less-wired friends for literal decades – is simply this: The internet isn’t for children, and it simply can’t be made “safe” for them, try as we might.

The difficult fact is, even if every proposed measure to limit kids’ access to “harmful” content currently under consideration is passed and vigorously enforced, the internet will remain as I described it above – “an enormous network of independently operated computers, on which virtually anyone can publish virtually anything.” To make it ‘safe’ will require fundamentally altering the nature of that network and siloing it to a degree where it will no longer be recognizable as the internet.

And guess what? Even if we do that, you’ll still have to parent your kids. You’ll still have to shepherd them through their early years – and you’ll still have to let go of being a shepherd when they become adults. The internet age didn’t change any of that, either.

If you believe the answer will come from the government, if you believe legislation like the KIDS Act or the Innocence Act will make the world (or even just the internet) a substantially safer place, knock yourself out. Write to your representatives and demand that they pass those laws – and then see what happens.

I’ll tell you what isn’t going to happen: Your job as a parent isn’t going to get easier. The sooner you accept that and get on with the difficult business of raising a child, the better.


There’s a “Porn Lesson” to Take from Lindsey Vonn’s Olympic Experience (No, Really) by Stan Q. Brick

Lindsey Vonn

When champion skier Lindsey Vonn suffered a terrible crash on what turned out to be her final run in the women’s downhill skiing event at the Winter Olympics in Milan earlier this month, maybe there were a few people out there thinking she shouldn’t have been permitted to take the risk of running the race, given that she was already skiing with a torn ACL in her left knee. But if a significant number of people felt that way, they seem to have kept it to themselves, for the most part.

Instead, the dominant reaction to Vonn’s knowing acceptance of added risk rightfully has been to praise her bravery, determination and champion spirit. As Madison Chapman wrote for Newsweek, “Winner or not, Vonn is the ideal Olympic champion. Her grit and resilience helped me shed my own fear of risk and learn to see myself as a champion over adversity after my cancer treatment and subsequent knee injury. She may not have clinched gold, but Lindsay Vonn reminded us all how to live.”

I’ve always been fascinated by the way people view the act of taking a physical risk, be it in the context of competitive skiing, climbing a mountain or something as fundamental as managing one’s personal health. I’ve long believed that the question of whether something is safe to do is a different question than whether one ought to be allowed to do it. As I see it, it’s not complicated; adults should be allowed to take informed risks – including a litany of risks I would never take, myself.

Doubtlessly, one reason Vonn found so much support for her decision is the competitive context. She was attempting to win a gold medal, an achievement for which there’s a very limited window of opportunity, one that only comes around every four years – and only for so many cycles in an athlete’s career.

Make no mistake, though; the reason Vonn’s decision, the Olympic Games themselves and Vonn’s injuries are global news is because sports are popular entertainment – and big business.

In other words, while we support Vonn’s chosen form of risk taking because competition is deemed a worthy enterprise by a significant portion of the human population, we also support it because we accept, at least in the context of sport, that people have a right to risk bodily harm in the process of entertaining other people.

We’re not consistent about this acceptance of risk for entertainment’s sake, of course. The response to people taking risks in the context of porn is less enthusiastic. Sometimes it inspires proposals specifically designed to deter people from plying their trade in adult entertainment.

I’m not saying I think social media should light up with words of encouragement every time a porn star gets nominated for an award, or when an adult content creator releases a new clip (although that would be nice). But maybe, if society can applaud people for risking grievous bodily harm while competing on the Olympic stage, society can at least also manage to avoid shaming people and subjecting them to paternalistic government regulation when the risks they take involve other, less celebrated forms of entertainment.


Debanking Explained: How Politics, Policy, and Perception Shape Account Closures

A board with the word debanking.

The Cato Institute has published a report on debanking. You can find it here: https://www.cato.org/policy-analysis/understanding-debanking-evaluating-governmental-operational-political-religious

Here’s what the report is all about:

Debanking can be a frustrating and deeply unsettling experience. One day everything seems fine, and the next, a notice arrives giving just 30 days to withdraw funds and find a new financial institution. Confusion quickly turns into anxiety. Bills were paid, nothing appeared out of the ordinary — so what changed? A call to the bank rarely brings clarity, and the response is often the same: no further details can be provided. Customers are left with more questions than answers.

On the other side of the conversation, bankers are frequently constrained by strict confidentiality requirements. Even frontline staff may not have access to the underlying reasons for an account closure. Financial institutions operate within a framework of anti–money laundering, know your customer, and countering the financing of terrorism regulations — commonly referred to as AML, KYC, and CFT. While these rules are standard practice within the industry, they remain largely invisible to the public, creating a disconnect that fuels frustration on both sides.

For those looking to address the growing concern around debanking, some argue that meaningful change will require greater transparency. That could mean reconsidering the confidentiality that surrounds account closures, removing reputational risk as a regulatory factor, and reevaluating the Bank Secrecy Act framework that effectively places financial institutions in the role of investigative gatekeepers.


Good News: Sometimes Adult Businesses DO Get Treated Like Everyone Else by Stan Q. Brick

Judge's gavel

In a country where it seems like lawsuits get filed at the drop of a hat – particularly if the hat is quite hard, quite heavy and falls on someone’s toes, causing both physical injury and extreme emotional distress – the fact that our courts do make plaintiffs jump through at least a minimal set of hoops can be something of a comfort.

For example, if I get into a fender bender with someone in California, but that person lives in New York, they probably can’t haul me into court in New York to make me face a lawsuit there, simply because New York happens to be the plaintiff’s state of domicile. They’d likely have to sue me in California, due to the way the courts handle the question of personal jurisdiction.

As you may have heard, a district court in Kansas applied this logic in dismissing a couple of lawsuits filed against companies that operate adult websites, in which a plaintiff alleged those sites were not complying with the state’s age verification requirements for adult sites.

Among other things, the judge in the case, U.S. District Judge Holly Teeter, wrote in her decision dismissing a lawsuit against Titan Websites that “merely intending that users accessing its content be able to do so from a wide geographic area is not the same as purposefully directing one’s activities at a forum.”

“Technical steps taken to make a universally accessible website easier for all users to access no matter where they are located is no more purposeful direction than the act of setting up the website in the first place,” Teeter added. “And just like the act of setting up a website, were the indiscriminate use of a CDN or other technologies to indiscriminately facilitate content delivery enough, ‘then the defense of personal jurisdiction, in the sense that a State has geographically limited judicial power, would no longer exist.’”

Teeter also wrote that her reasoning “does not mean that a website owner’s use of a CDN is never relevant” and “does not mean that a website owner’s use of a CDN could never show purposeful direction.”

“It does mean that more is needed to determine how the CDN is used and whether the CDN is being used to target a forum or an immediate region of which the forum is a part,” Teeter wrote. “The Court need not dissect the contours to resolve this case. Here, Plaintiff simply alleges that a CDN is being used and that the CDN has servers near the forum because logically it must. Defendant responds with evidence that it uses a third-party web-hosting service and that it does not know or care where the CDNs are located. This record is not enough to carry Plaintiff’s admittedly light burden.”

The dismissal of this case, as well as Teeter’s decision dismissing an identical case against a company called ICF Technology, is certainly good news for other adult businesses that might find themselves haled into court over alleged violations of a state’s age verification law. These rulings are not, of course, the end of the story.

The plaintiff is likely to appeal these decisions, whereupon the matter will go to the Tenth Circuit Court of Appeals. I’m no lawyer and I don’t have much to offer in terms of a prediction as to how the Tenth Circuit might ultimately rule. I just know that I don’t have much confidence in how the next court up the chain, the U.S. Supreme Court, might rule, should they take up the question.

Having found the age verification law passed in Texas to be constitutional, it wouldn’t surprise me one bit if SCOTUS decided that merely being accessible in a state creates a sufficient “minimum contact” with any given state for a court there to assert personal jurisdiction.

Still, at least for the time being, Teeter’s decisions represent something of a victory for the porn side of the War on Porn. Whether that victory is lasting or ephemeral remains to be seen. Fingers crossed.
