
Origins of the War on Porn: The Comstock Act by Morley Safeword

Over the course of nearly 30 years working in the adult entertainment industry, I’ve lost count of the number of times I’ve written about and discussed the nature of American obscenity laws. In this new series of articles, “Origins of the War on Porn,” we’ll examine some of the roots of the long-running effort by elements of American society to stamp out pornography, including key pieces of legislation dating back to the late 19th century.

One of the most significant of these laws is the Comstock Act of 1873, which criminalized the use of the United States Postal Service to transmit obscene materials. Bear in mind, this was long before the existence of the “Miller Test,” the modern definition of obscenity established by the Supreme Court; as such, much of the material that constituted a violation of the Comstock Act would strike modern viewers as quite tame.

The Comstock Act was named for Anthony Comstock, a staunch Christian who was born in rural Connecticut, then moved to New York after serving in the Civil War. Comstock was shocked by the city, which seemed to him a place “teeming with prostitutes and pornography,” as PBS put it in profiling Comstock.

Determined to shape the city’s sexual mores to his liking, Comstock began supplying police information on local prostitution operations to assist in their anti-vice efforts. Comstock was also taken aback by ads for contraception devices, so he soon adopted the contraceptive industry as another source of societal ills.

In 1872, Comstock began lobbying in Washington for the passage of an anti-obscenity bill that the determined activist had penned himself and that included a ban on contraceptives. He succeeded in his lobbying, and the Comstock Act was attached as a rider to the Post Office Consolidation Act of 1872.

Codified largely at 18 USC §1461 and 1462, the Comstock Act has been amended many times over the decades – as have the legal definitions of terms like “obscene” and “indecent,” which are peppered throughout the statutes. Still, even after these amendments, the core principles of the Act remain in place.

18 USC §1461 still prohibits the use of the U.S. mail to send any “obscene, lewd, lascivious, or filthy book, pamphlet, picture, motion-picture film, paper, letter, writing, print, or other matter of indecent character; or any obscene, lewd, lascivious, or filthy phonograph recording, electrical transcription, or other article or thing capable of producing sound; or any drug, medicine, article, or thing designed, adapted, or intended for producing abortion.” (In 1958, the law was amended to replace “preventing conception” with “producing abortion” in the last line quoted above.)

Comstock himself might be gratified to learn that his namesake law still survives, but he’d likely be aghast at how watered down it has become in its application and definitions. Credit for that reduction in scope and efficacy goes in part to a very different American activist, Margaret Sanger, the founder of Planned Parenthood who successfully challenged the Comstock Act when she opened the first birth control clinic in the country.

If you are familiar with Sanger, you likely know her legacy as an advocate for women’s rights is complicated by her beliefs on eugenics, which social conservatives have in recent years used to attack the organization she founded. Planned Parenthood has disavowed Sanger’s stated beliefs, noting as it did so that “today, anti-reproductive rights activists continue to attack Sanger as a strategy to undermine the crucial services Planned Parenthood currently provides.”

As both Comstock and Sanger demonstrate, the battles that underpin the War on Porn (as well as what many people have termed the “War on Women”) have their roots in debates and disagreements far older than anyone reading this article. Over 150 years after the Comstock Act was established, the issues that animated it are as contentious as they were in Comstock’s time.

Proof of the continuing influence of the Comstock Act and the man for whom it is named can be found in articles like this one from the Kaiser Family Foundation, titled “The Comstock Act: Implications for Abortion Care Nationwide,” published in 2024. Among other things, the article notes the Comstock Act “could be used by a future presidential administration opposed to abortion rights to sharply restrict abortion nationwide.” (President Trump’s declining to do exactly that angered some of his supporters not long after the KFF article was published.)

Echoes of the Comstock Act also can be found in the ongoing effort to restrict access to online porn, or in more extreme cases, ban porn altogether. As these efforts demonstrate, Anthony Comstock might be long gone, but there are many folks happy to walk in his footsteps – and to become the next foot soldiers for their side of the War on Porn.


U.K. Enacts Ban on Strangulation Content, Adult Depictions of Minors

LONDON — Lawmakers in the United Kingdom have approved the Crime and Policing Bill, with final passage secured Monday, establishing new criminal restrictions covering depictions of “non-fatal strangulation” and certain sexual content involving adults portraying minors.

The measure was first introduced in the House of Commons in February 2025 and has since been revised through an extended period of parliamentary debate and amendment.

It will become law following Royal Assent from King Charles III, a procedural step that is expected to proceed without issue.

Restrictions on Strangulation Content

The legislation designates the possession and distribution of “pornographic images of strangulation or suffocation” as priority offenses under the Online Safety Act. Individuals found in possession may face prison terms of up to two years, while those responsible for distribution could face sentences of up to five years.

Support for prohibiting this category of content increased after the release of a government-commissioned “pornography review” in February 2025. That report recommended banning material it described as “degrading, violent and misogynistic.” Industry groups and free speech advocates raised objections at the time to proposals targeting so-called “extreme” material.

Depictions of Minors by Adult Performers

The law also makes it an offense to publish or possess sexual content in which adults portray individuals under the age of 16.

It specifies that “sound or information associated with the image” will be used to determine whether a performer is representing someone under 16. This replaces earlier draft language that would have allowed visual cues such as costume or setting to be considered as evidence.

The legislation states, “A person is not to be taken as pretending to be under 16 if it is fanciful that they are actually under 16 in the way pretended.” A related government memorandum adds that the provision is “not intended to criminalize a pornographic image of someone who is clearly an adult where the only marker of childhood is the fact that he or she is in school uniform.”

Provisions Not Included in Final Version

Two proposals discussed during the bill’s passage were not adopted: a blanket prohibition on “step” content and a measure addressing consent in performer agreements.

While the law prohibits depictions of incest involving blood relatives, it applies to “step” scenarios only in cases where a performer is portraying someone under 18. According to the government, this is intended to align enforcement with conduct that would be unlawful in real life.

An amendment introduced in the House of Lords that would have allowed performers to withdraw consent to publication and distribution at any time, regardless of prior contracts, was also not included in the final text.

Instead, a Commons amendment directs the Secretary of State to review age and consent verification practices used by websites and to report findings to Parliament within one year. The law also grants the Secretary authority to intervene and regulate those practices without requiring additional legislation.

Next Steps and Ongoing Policy Work

The government’s Violence Against Women and Girls (VAWG) strategy includes the formation of a “joint pornography team” tasked with reviewing the issues identified in the 2025 pornography review and evaluating evidence to guide future policy.

The team includes representatives from the Home Office, the Department for Science, Innovation and Technology, the Ministry of Justice, and the Department for Culture, Media and Sport. It began work in December 2025, and its findings are expected to influence future legislative proposals and regulatory actions by Ofcom.


FSC Notes Additional TAKE IT DOWN Act Provisions Set to Take Effect May 19

In a statement released today, the Free Speech Coalition reminded stakeholders across the adult industry that key provisions of the TAKE IT DOWN Act — legislation that “created a federal criminal prohibition on the nonconsensual publishing of intimate images (including AI-generated ‘deepfakes’) and requires covered platforms to establish a notice-and-removal process for such content within 48 hours of a valid request” — will take effect May 19, 2026.

While the ban on nonconsensual imagery went into effect immediately after the law was signed, the notice-and-removal requirements begin on that date.

As outlined in the FSC statement, the law applies to two categories of content: “authentic intimate visual depictions published without consent” and “digital forgeries.” The latter includes “AI-generated or otherwise computer-manipulated intimate images of an identifiable individual that a reasonable person would find indistinguishable from authentic depictions.”

FSC stated that “any person who knowingly publishes such content using an interactive computer service” may be held liable under the law, adding that enforcement is directed at “the individual uploader/publisher, not the platform.”

Under the notice-and-removal provisions taking effect May 19, “websites, online services, online applications, or mobile applications that serve the public and primarily provide a forum for user-generated content (including messages, videos, images, and audio)” must comply.

“Covered platforms must establish a process by which an individual (or their authorized representative) can submit a removal request,” FSC said. “The request must include a signature, identification of the content, a good faith statement that it was published without consent, and contact information.”

As for timing, once a valid request is received, platforms “must remove the content as soon as possible, but no later than 48 hours after receipt,” FSC explained. “Platforms must also make reasonable efforts to identify and remove known identical copies.”

The law also requires platforms to “post a clear, conspicuous, plain-language notice of their removal process and how to submit a request.”

“Failure to comply with the notice-and-removal obligations is treated as an unfair or deceptive act or practice under the FTC Act, enforced by the Federal Trade Commission,” FSC added.

FSC noted that the definition of “covered platform” is broad and may apply to “most sites that host user-generated content.”

“Platforms that host any user-uploaded content should assume they are covered and consult with counsel,” FSC said.

FSC also emphasized that under the law “consent to create an intimate visual depiction does not equal consent to publish it.”

Addressing what qualifies as a valid request, FSC explained that submissions must be made in writing and include the following:

  • a physical or electronic signature of the requestor (or their representative)
  • identification of, and information sufficient for the platform to locate, the offending content
  • a statement of the requestor’s good-faith belief that the depiction was not consensual
  • the requestor’s contact information
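As a rough illustration of that checklist, a platform’s intake logic could confirm all four elements are present before treating a request as valid and starting its removal clock. This is only a sketch; the field names below are illustrative assumptions, not terms drawn from the statute or from FSC’s guidance.

```python
# Hypothetical intake check for a TAKE IT DOWN Act removal request.
# The four required elements mirror the list above; the dictionary keys
# are assumed names, not statutory language.
REQUIRED_FIELDS = (
    "signature",              # physical or electronic signature
    "content_identification", # info sufficient to locate the content
    "good_faith_statement",   # belief the depiction was nonconsensual
    "contact_info",           # requestor's contact information
)

def is_facially_valid(request: dict) -> bool:
    """True only if every required element is present and non-empty.
    Under the Act, the 48-hour removal window runs from receipt of a
    valid request."""
    return all(request.get(field) for field in REQUIRED_FIELDS)
```

A request missing any one element would not be facially valid under this reading, though as FSC notes, the law is silent on how platforms should handle erroneous or fraudulent requests.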

FSC also noted that the law “includes no provisions that address how platforms can or should deal with erroneous or fraudulent removal requests.”

The full statement is available on the FSC website.


MakeLoveNotPorn Denies Allegations of Violating U.K. Online Safety Law

LOS ANGELES — Cindy Gallop, founder and chief executive officer of the “real-world” porn platform MakeLoveNotPorn (MLNP), denied claims that her platform is not compliant with the United Kingdom’s Online Safety Act after allegations surfaced that the site exposed minors to age-restricted content without proper age assurance measures.

The allegations emerged after Gallop announced a public education partnership with Samantha Niblett, a Labour Party MP for South Derbyshire in England’s East Midlands. The two are promoting a “Summer of Sex” campaign focused on comprehensive sex education across the United Kingdom. A British tabloid reported that MakeLoveNotPorn was allegedly “breaching” age verification requirements under the law, which is enforced by U.K. digital regulator Ofcom.

Security correspondent Abul Thaer reported that MLNP, described as an interactive sexual education platform, “only allows a 12-second preview before requiring payment, during which viewers can already see explicit visuals.” Gallop disputed that characterization in a statement.

Gallop wrote, “We had a technical glitch with our site, but I’m happy to confirm we were and are fully compliant with the Online Safety Act. We had a temporary disruption due to a technical issue we addressed asap, and can confirm full compliance.”

“It is this Labour government that is implementing the Online Safety Act in order to prevent children from seeing content that is not age-appropriate,” Niblett said. “I was assured MLNP was Ofcom-compliant, having only ever signed in myself using my own age to access the site.”

Niblett has described the “Summer of Sex” initiative as part of a broader effort tied to the government’s approach to addressing harmful online content, including recent measures affecting certain categories of adult material.


Warning: Freedom of Expression May Be Harmful to Your Mental Health by Stan Q. Brick

As you might have read recently, the Tennessee legislature is considering a law that would require “adult oriented establishments” to post warning signs on their premises. These signs would warn prospective patrons and passersby that the forms of entertainment and expression found within are the source of a variety of societal ills – and that by frequenting such establishments, duly warned readers of the sign are effectively branding themselves contributors to human trafficking, sexual assault and domestic violence.

As noted in the article I linked to above, there are two different versions of the sign verbiage being considered.

One version reads: “Attention: By engaging in this type of entertainment, you may be contributing to an increase in domestic assault, rape or sexual assault, and human trafficking.”

The other version is: “Attention: By purchasing, borrowing, or using this pornographic material, you may be contributing to an increase in domestic assault, rape, or sexual assault, and human trafficking.”

Don’t worry, though; no reputable public health agency or serious anti-trafficking organization has endorsed the conclusions that would be required on these signs, should one of these bills become law. These signs would merely convey the beliefs of some elected officials – and some of those who voted for them as well, presumably.

The people pushing for the warning labels might believe the things they want to force adult businesses to say are true, or they might just be trying to impose their moral vision of how the world ought to be – and, crucially, of how the rest of us ought to view matters of sexuality.

This is far from the first time a government has sought to steer consumers away from content to which it objects. Censorship, in all its ugly forms, is a near-universal phenomenon. Even in countries with a seemingly libertine disposition, governments routinely seek to limit their citizens’ access to certain ideas, depictions and forms of expression.

In the U.S., our freedom of expression is guaranteed under the First Amendment. Unfortunately, one of the few exceptions to the First Amendment’s protections is “obscene” speech. This creates the circumstances in which we now find ourselves, where there’s tension between the expansive nature of the amendment’s text – “Congress shall make no law… abridging the freedom of speech,” after all – and the more restrictive interpretation of that text the courts have adopted over the decades.

Thankfully, U.S. courts have for the most part rejected as unconstitutional previous attempts to force people to affix warning labels to their expressive materials. Among those rulings was the federal court decision enjoining HB 1181, the Texas law that not only mandated age-verification measures for adult sites but also required them to carry warning labels. To give you an idea of how much confidence Texas had in the constitutionality of the warning-label provisions, its legal team didn’t appeal that part of the court’s ruling – so the Supreme Court’s upholding of HB 1181 does not mean adult sites must now carry those warnings.

To be fair to Tennessee, the warning labels Texas wanted adult sites to carry were even worse than the ones being considered in the Volunteer State, in that Texas wanted to force adult businesses to outright lie, as opposed to merely forcing them to parrot highly disputed anti-porn talking points.

Back in 2023, when the trial court enjoined HB 1181, U.S. District Court Judge David Ezra took one look at the labeling provisions and declared that the law “unconstitutionally compels speech.”

“There is no doubt that HB 1181 forces the adult video companies into compelled speech,” Ezra wrote. After describing the three required warning labels, Ezra concluded, simply: “This is compelled speech.”

“The government is forcing commercial sites to speak and broadcast a proposition that they disagree with,” the judge noted. “The Supreme Court has ‘held time and again that freedom of speech includes both the right to speak freely and the right to refrain from speaking at all.’… Even if, as the defendant argues, the law compels only commercial speech, it does not pass constitutional muster.”

And, as Ezra observed in another part of his order, while the warnings would have required websites to attribute the labels’ findings to Texas Health and Human Services, the “Texas Health and Human Services Commission has not made these findings or announcements.”

A government compelling a business to say a particular thing is bad enough. A government compelling a business to proclaim things that simply aren’t true at all is a whole other level of wrong. Who knows; perhaps Texas recognized this after the fact and that’s why they didn’t challenge the injunction with respect to the labels.

Look, I get it: Everyone has their hangups, the things they’d rather not see, hear, read or to which they’d otherwise prefer not to be exposed. I don’t like horror movies, for example – and I really can’t abide depictions of terrible things happening to people’s eyes.

Rather than lobby governments to pass laws prohibiting the depiction of eye trauma, I use the following novel technique: I try to avoid such depictions on my own, without government assistance. It’s crazy, I know, but it’s an approach that has served me well for over 50 years now and I stand by it.

Ultimately, the only sort of warning label I can abide is the sort employed by the late, great Frank Zappa, who crafted a warning for his Barking Pumpkin record label that he co-branded as a “guarantee.” The warning is worth reading in full, not only for the humor, but for the serious point the humor is employed to make.

“WARNING/GUARANTEE: This album contains material which a truly free society would neither fear nor suppress. In some socially retarded areas, religious fanatics and ultra-conservative political organizations violate your First Amendment Rights by attempting to censor rock & roll albums. We feel that this is un-Constitutional and un-American. As an alternative to these government-supported programs (designed to keep you docile and ignorant), Barking Pumpkin is pleased to provide stimulating digital audio entertainment for those of you who have outgrown the ordinary. The language and concepts contained herein are GUARANTEED NOT TO CAUSE ETERNAL TORMENT IN THE PLACE WHERE THE GUY WITH THE HORNS AND POINTED STICK CONDUCTS HIS BUSINESS. This guarantee is as real as the threats of the video fundamentalists who use attacks on rock music in their attempt to transform America into a nation of check-mailing nincompoops (in the name of Jesus Christ). If there is a hell, its fires wait for them, not us.”

Honestly, if Tennessee wanted adult businesses to post a warning like that, I could probably get on board.


Court Orders Lead to Restoration of Playboy Germany’s Facebook Page by Meta

DÜSSELDORF, Germany — For two months, the Facebook page for Playboy Germany sat in limbo. Then, just like that, it was back — restored after a court stepped in and told Meta to reverse course.

A regional court in Düsseldorf issued an injunction against the company, finding that the decision to block the page wasn’t lawful.

The page, followed by roughly 1.8 million people, had gone dark on Feb. 17.

In a statement, Kouneli Media — the company behind Playboy Germany — said Meta justified the takedown by pointing to alleged violations of its community standards, including nudity and sexual content. But according to the company, no specific posts were identified. Instead, the notice referenced activity that only “seemed” to break the rules.

The situation echoes a broader pattern that’s been hard to ignore. Across Meta’s platforms, moderation decisions often land without much warning — or clarity. Accounts tied to adult content, even those operating within legal bounds, can disappear overnight. Earlier this month, the Instagram account of sex tech company Bellesa was also taken down.

Kouneli Media has since filed a complaint with Germany’s Federal Network Agency, an independent regulator overseeing telecommunications and digital infrastructure.


EU Unveils Age-Verification App; Researchers Say It Can Be Hacked in Two Minutes

BRUSSELS — The European Union’s rollout of a mobile app designed to verify users’ ages online has drawn scrutiny after cybersecurity researchers identified potential privacy and security issues in the code.

European Commission President Ursula von der Leyen presented the tool in Brussels on Wednesday, stating it was “technically ready” and would soon be made available as countries introduce measures to restrict minors’ access to social media.

“It is fully open source. Everyone can check the code,” von der Leyen said.

Cybersecurity and privacy specialists reviewed the publicly available code on GitHub and reported several concerns related to the app’s design.

The development comes as policymakers, privacy advocates, technology companies and child protection groups continue to debate how best to safeguard minors online while maintaining data protection standards.

Within hours of the app’s release, security consultant Paul Moore said the app stored sensitive data on users’ devices without sufficient protection, according to a widely circulated post on X. Moore said he was able to compromise the app in under two minutes.

Baptiste Robert, a French ethical hacker, confirmed several of the findings and said it was possible to bypass the app’s biometric authentication features, allowing access without a PIN code or fingerprint verification.

Olivier Blazy, a cryptographic researcher and member of a French digital identity task force, said: “Let’s say I downloaded the app, proved that I am over 18, then my nephew can take my phone, unlock my app and use it to prove he is over 18.”

The European Commission said Friday that the app is technically ready. “Yes, it is ready. Maybe we can add, ‘and it can always be improved’,” Chief Spokesperson Paula Pinho told reporters.

Digital spokesperson Thomas Regnier said: “Now, when we say it’s a final version, it’s … still a demo version.” He added that the final product is not yet available to the public and that “the code will be constantly updated and improved … I cannot today exclude or prejudge if further updates will be required or not.”

In a statement issued Thursday, the Commission said the issues identified by researchers related to an earlier “demo version” of the app released for “testing and development purposes,” and that the vulnerability “was fixed.”

However, Moore and Blazy said their findings were based on the most recent version of the code available online.

“It’s a good thing they made the app open source for experts to try and test it. The problem is the released source code does not meet cybersecurity standards we would expect for such an important app,” Blazy said.

“We were worried that the Commission would launch its app in a hurry, no matter its security issues, and now we can see it wants to launch something that is not technically ready,” Blazy added. “Such a rushed launch could undermine trust in future digital identity wallets.”

Inti De Ceukelaire, a Belgian ethical hacker, said: “For open source code projects like this one, it would be a good move to also publish any security assessments prior to launch, so everyone can balance out the benefits versus the risks.”

Debate over the app reflects broader disagreements about how to regulate access to online platforms, including social media and adult content.

The EU and several member states are developing systems to verify users’ ages online as part of efforts to strengthen protections for minors.

French President Emmanuel Macron held a video conference with European leaders on Thursday evening to discuss the issue. Participants included von der Leyen, Italian Prime Minister Giorgia Meloni, Spanish Prime Minister Pedro Sánchez, German Chancellor Friedrich Merz and other officials.

Australia in December became the first country to introduce restrictions on minors’ use of social media, effectively barring users under 16 from platforms such as TikTok and YouTube.

The European Commission opened a €4 million tender for the app in 2024, which was awarded to Swedish digital identity company Scytáles and Deutsche Telekom.

The system allows users to verify their age using a passport, national ID or a trusted third party such as a bank. Online platforms can then confirm whether a user meets a required age threshold without accessing additional personal data, using a method known as “zero-knowledge proof.”
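The data-minimization idea behind that design can be sketched in miniature. To be clear, the toy code below is not real zero-knowledge cryptography and is not how the EU app works internally; production systems use asymmetric credentials or genuine ZKP protocols, and a shared key like the one here would be unacceptable in practice. It only illustrates the interface the paragraph describes: a trusted issuer checks the ID once and emits a token asserting a single boolean, and the platform learns nothing beyond whether the user meets the threshold.

```python
# Illustrative sketch only (all names hypothetical): a trusted issuer
# attests "over 18: yes/no" without ever passing the birthdate or any
# other personal data to the platform. Real deployments would use
# asymmetric signatures or zero-knowledge proofs, not a shared HMAC key.
import hmac
import hashlib

ISSUER_KEY = b"demo-secret"  # held by the issuer; shared here only for the demo

def issue_age_token(user_id: str, over_18: bool) -> str:
    """Issuer verifies the passport/ID once, then emits a minimal claim:
    only the over-18 boolean is attested, never the birthdate itself."""
    claim = f"{user_id}:over18={over_18}"
    tag = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return f"{claim}:{tag}"

def platform_check(token: str) -> bool:
    """Platform learns a single yes/no answer and nothing else."""
    claim, tag = token.rsplit(":", 1)
    expected = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected) and claim.endswith("over18=True")
```

Note that even this toy version exhibits the weakness Blazy describes above: the token proves something about the credential, not about who is currently holding the phone.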

Member states may also develop their own compatible applications, intended to function across the EU for age verification.

Some critics have said that current age-verification technologies do not yet provide sufficient guarantees for privacy and data protection, and that users may be able to bypass restrictions using tools such as virtual private networks (VPNs).

Blazy was among more than 400 privacy and security experts who signed an open letter in March calling on the Commission to impose a “moratorium on deployment plans until the scientific consensus settles on the benefits and harms that age-assurance technologies can bring, and on the technical feasibility of such a deployment.”

Markéta Gregorová, a member of the Czech Pirate Party and lead lawmaker on a cybersecurity bill in the European Parliament, said: “this process is being rushed under political pressure.” She added that further review is needed “to assess if all measures were taken for cybersecurity and privacy.”

Birgit Sippel, a German center-left lawmaker, described the app as a “half-baked app solution that doesn’t live up to [the EU’s] own standards,” in a comment.

Piotr Müller, a Polish lawmaker from the European Conservatives and Reformists group, said: “Brussels is once again pushing for a centralized, EU-wide technological tool. The hastily announced age verification app poses a massive risk to the privacy of citizens … We cannot agree to the step-by-step creation of a Chinese-style internet in Europe.”


UPDATED: European Commission Releases Age Verification App, Responds to Security Concerns

BRUSSELS — The European Commission’s age verification app is now technically ready and is expected to be made available to EU citizens for use in confirming their age when accessing online platforms, European Commission President Ursula von der Leyen said Tuesday.

In July 2025, the European Commission published guidelines aimed at protecting minors online under the Digital Services Act and introduced a “white label” age verification app designed to help websites and platforms meet compliance requirements under the law.

The app later moved into a pilot phase, where it was tested across several EU member states. Those countries either integrated the system into their digital identity wallets or released versions through app stores, adapting the interface to national systems while keeping core privacy features unchanged.

Aylo, the operator of Pornhub and other high-traffic platforms, has taken part in the pilot program.

In a statement, von der Leyen described the app as “a free and easy-to-use solution that can shield our children from harmful and illegal content.”

“First, it is user-friendly,” von der Leyen said. “You download the app. You set it up with your passport or ID card. You then prove your age when accessing online services. Second, it respects the highest privacy standards in the world. Users will prove their age without revealing any other personal information. Put simply, it is completely anonymous: users cannot be tracked. Third, the app works on any device — phone, tablet, computer, you name it. And, finally, it is fully open source – everyone can check the code. This means that our partner countries can also use it. This is very important, that this can be used by our global partners.

“But more importantly, online platforms can easily rely on our age verification app,” von der Leyen added. “So there are no more excuses.”

Von der Leyen identified France, Denmark, Greece, Italy, Spain, Cyprus and Ireland as early adopters of the system.

“They are planning to integrate the app into their national wallets,” she said. “I hope more Member States and private sector will follow so that every citizen can soon use the app.”

She also warned that the Commission would take enforcement action against companies that fail to comply with child protection requirements.

“This is why we are moving ahead with full speed and determination on the enforcement of our European rules,” she said. “We are holding accountable those online platforms that do not protect our kids enough.”

As an example, European Commission Executive Vice President Henna Virkkunen pointed to recent enforcement actions involving Pornhub, Stripchat, XNXX and XVideos. A Commission investigation last month preliminarily found those platforms in breach of provisions under the Digital Services Act aimed at preventing minors from accessing adult content.

“They simply do not have proper age verification tools in place to keep our children away from their adult content,” Virkkunen said. “As platforms do not have proper age verification tools in place, we came up with the solution ourselves.”

Virkkunen also said she is establishing an EU-wide coordination system for accrediting age verification solutions, “to ensure that we continue to build one solution for the EU, not 27 different ones.”

“Our blueprint is open-source, and any private company is free to use the blueprint to develop innovative solutions,” she said. “We only have two conditions: respect the privacy standard. And make sure we have the same technical solution everywhere in the EU.”

Following the announcement, reports emerged raising concerns about potential vulnerabilities in the app, along with questions about whether its open-source nature could introduce security risks. Responding to those reports, European Commission Digital Spokesperson Thomas Regnier said, “A new version has just or will soon today be updated … the code will be constantly updated and improved. It’s open source, and I cannot today exclude or prejudge if further updates will be required or not.”


U.S. State Department Urges Countries to Adopt Age Verification Laws

WASHINGTON — U.S. diplomats are being encouraged to support the adoption of age-verification measures for online adult content in countries around the world, according to statements from the State Department.

An internal communication described by a media report outlined guidance for diplomats to promote “age assurance” laws and technologies aimed at protecting minors, while also maintaining protections for broader rights. The State Department did not confirm the existence of that document but acknowledged the general policy direction.

“Protecting children from online exploitation and abuse is a top priority for the United States,” a State Department spokesperson said. “We believe these protections should be implemented in ways that are rights-respecting and do not unduly compromise privacy or freedom of expression.

“We favor age assurance that is narrowly tailored, clearly defined, and utilizes privacy-preserving technologies,” the spokesperson continued. “As countries explore frameworks to protect children online, the United States stands ready to engage and encourage that these measures are effective, balanced, and do not impose unintended consequences on fundamental freedoms or technological progress.”

The position aligns with approaches taken by other agencies within the executive branch, including the Federal Trade Commission, which earlier this year supported age-verification measures at the national level and expressed backing for related proposals in Congress, such as the Kids Online Safety Act.

Some observers have linked these policy directions to broader discussions within the current administration about online content regulation, including references to policy frameworks that have proposed stricter controls on internet pornography.

Separately, Russell Vought, director of the Office of Management and Budget, was recorded prior to the 2024 presidential election discussing age verification in a conversation with undercover reporters, during which he described it as a potential “back door” to broader restrictions on legal pornography.


Tennessee Bill Would Require Warning Labels at Adult Businesses

NASHVILLE, Tenn. — Lawmakers in Tennessee are weighing a proposal that would require adult-oriented businesses to display warning notices linking pornography and adult products to a range of harms, including claims that critics say are not supported by established evidence.

Senate Bill 2481, sponsored by Republican Sen. Janice Bowling of Tullahoma, cleared the state Senate on Wednesday.

As written, the bill would require adult businesses to post printed warnings at their entrances. The notices must be “no smaller than 8 1/2” x 11”, in 48-point type, in boldface, block letters, centered on the sign,” with black lettering on a white background. Displaying the signs would be a condition for obtaining a license from county-level adult-oriented business regulatory boards, where such boards exist.

State law defines adult-oriented businesses to include “adult bookstores, adult mini-motion and motion picture theaters, adult cabarets, escort agencies, sexual-encounter centers, massage parlors, rap parlors, saunas, and similar businesses.”

The bill outlines specific language for the warnings.

Attention: By engaging in this type of entertainment, you may be contributing to an increase in domestic assault, rape or sexual assault, and human trafficking.

An alternative version reads:

Attention: By purchasing, borrowing, or using this pornographic material, you may be contributing to an increase in domestic assault, rape, or sexual assault, and human trafficking.

Republican Rep. Monty Fritts of Kingston is sponsoring the measure in the House. The legislation is expected to receive support from the state’s Republican majority.

SB 2481 is paired with House Bill 2314. Legislative records show the Senate version has been revised to align with the House language as lawmakers move toward potential approval by Gov. Bill Lee.

If enacted, the measure would add another regulatory requirement for adult-oriented businesses in Tennessee, alongside the state’s existing age verification law, which includes felony penalties for violations and remains the subject of ongoing legal challenges.

The proposal reflects an approach used by lawmakers in other states, drawing comparisons to warning labels required for tobacco and alcohol products and to online notice requirements adopted in Texas and Alabama.

In Texas, legislation required pornography websites to display warnings stating that the state’s Health and Human Services Department had identified viewing pornography as addictive, similar to drugs and alcohol. That requirement was part of House Bill 1181, which also included an age verification mandate that later reached the U.S. Supreme Court.

In a legal challenge brought by the Free Speech Coalition and operators of major adult platforms, a federal district court found the compelled warning language in HB 1181 unconstitutional under the First Amendment, citing the Zauderer test, which allows required disclosures only if they are purely factual and not controversial.

The court determined that the statements in the Texas law were not scientifically established, noting that “pornography addiction” is not a medically recognized diagnosis.

First Amendment attorney Corey Silverstein said similar legal questions could arise in Tennessee.

“This amendment raises serious First Amendment concerns by forcing lawful businesses to disseminate government-compelled messaging that is, at best, highly contested and, at worst, misleading,” Silverstein said. “The government cannot condition a license on requiring speakers to adopt and promote a viewpoint that stigmatizes their own protected activity.

“Adult entertainment is legal expression,” he continued. “Mandating that these businesses post warnings implying a causal connection to crimes like trafficking or sexual assault is not grounded in reliable evidence and risks crossing the line from regulation into unconstitutional compelled speech. If the state’s objective is to address exploitation or violence, it should do so through evidence-based policy—not by singling out a lawful industry and forcing it to carry a message designed to undermine it.”

Research cited by advocacy groups has also addressed the relationship between legal adult content and exploitation. The Woodhull Freedom Foundation states, “Many people of all genders choose to work in the adult industry in a variety of ways that are legal, consenting, and informed, including the making of adult visual content.

“While workers in the adult industry are at risk of harm and exploitation just like any other industry, pornography is a legal and frequently accessed mechanism for women and gender expansive people to participate in the economy and earn a living,” Woodhull adds.

Some studies have also examined broader social trends, with findings indicating that increased availability of online pornography has been associated with declines in reported sexual assault rates.
