Political Attacks

The Big Chill: If You Can Get People to Self-Censor, You Don’t Need a Ban By Morley Safeword

A couple of years back, when many of the state laws requiring age verification on the part of adult websites were merely proposals floating around in state legislatures, a legislator from one of the few states that already had such a law on its books publicly noted that Pornhub had begun blocking all traffic from her state, which she said meant the law was “already working” as intended.

I remember seeing a clip online of the legislator saying this and wondering if she knew just how right she was. The law she was talking about may have been presented as a measure to deter kids from accessing online porn, but in truth, it was about making it harder for anyone to access online porn.

While you might not realize it if you were to review some of the blatantly unconstitutional stuff they dream up, most state legislators do know they can’t simply ban speech they don’t like. They know this is true whether the speech they dislike is sexually explicit material like porn, speech intended to “annoy” or “offend” people who share their sensibilities, or speech from Jimmy Kimmel.

Legislators, censorious activists and other vile creatures of darkness also know the next best thing to a ban is an effective effort to cow people into silence, whether by criminal law or civil sanction.

When attorneys and scholars who are familiar with the First Amendment and free speech issues discuss these things, you’ll often hear them reference the “chilling effect,” which in the context of legislation refers to “government unduly deterring free speech and association rights through laws, regulations or actions that appear to target activities protected by the First Amendment,” as the Free Speech Center at MTSU puts it.

When the Supreme Court recently upheld the Texas law requiring websites on which more than one-third of the content is deemed “harmful to minors” to verify the age of users before displaying any such content, the court effectively said the chilling effect of the Texas law is insubstantial, easily outweighed by the government’s “compelling interest” in deterring minors from accessing pornography.

A casual observer might think: “Good! Why shouldn’t adult sites be required to do the same thing the store on the corner has to do before selling someone porn?” But the casual observer maybe hasn’t thought this one all the way through.

The casual observer probably hasn’t considered the difference between briefly flashing your ID at a bored store clerk who likely didn’t give it much of a look anyway before waving you onward, and uploading a copy of your government-issued ID, with your home address on it, to some third-party age verification service about which you know nothing.

The casual observer also may not have thought much about the websites that aren’t even arguably “porn sites,” but could still host enough content deemed “harmful to minors” to be covered by the law.

Worse still, this chilling effect is attached to a measure that isn’t particularly effective at its stated purpose. When Louisiana’s law was first passed, it took a matter of minutes for users to find a workaround – one that didn’t even require knowing what the acronym “VPN” signifies.

As a beloved old teacher of mine used to say about things of dubious value, these laws appear to be “worth their weight in sawdust” – except when it comes to their chilling effect.

As Hannah Wohl, an associate professor of sociology at the University of California, Santa Barbara, put it in a post for The Hill, age verification laws “fail to protect minors and threaten the free speech of all Americans in ways that go far beyond pornography.”

“This chilling effect is by design,” Wohl added. “Pornography has long been the canary in the coal mine for other restrictions against free speech. Indeed, some conservatives have acknowledged that age verification laws are a back door to fully criminalizing pornography.”

No, age verification laws are not “blocks” or “bans” on pornography. But they certainly are barriers – and we’re not supposed to go around erecting barriers to protected speech in America, willy-nilly. There’s supposed to be a damn good reason (that “compelling interest” of the government’s), some indication that the law actually furthers that interest, and an assurance that, in restricting minors’ access, the law doesn’t burden too much speech that’s legal for adults to consume.

Or, as Vera Eidelman, senior staff attorney with the ACLU Speech, Privacy and Technology Project, put it when talking about the Supreme Court’s decision on Free Speech Coalition v. Paxton: “With this decision, the court has carved out an unprincipled pornography exception to the First Amendment. The Constitution should protect adults’ rights to access information about sex online, even if the government thinks it is too inappropriate for children to see.”

Unfortunately, to many of those who support and advocate for these laws, the chilling effect bemoaned by their critics is a design feature, not a bug.

Apple Raises Privacy Concerns Over Texas Age Verification Law

Apple on Thursday outlined how it plans to comply with a new Texas law, SB 2420, which introduces strict age assurance requirements for app stores and app developers.

While Apple had already begun rolling out its own age verification tools earlier this year in anticipation of regulatory changes, the company voiced significant privacy concerns regarding the Texas legislation.

In a statement to developers, Apple said, “…we are concerned that SB2420 impacts the privacy of users by requiring the collection of sensitive, personally identifiable information to download any app, even if a user simply wants to check the weather or sports scores.”

The Texas law is part of a growing wave of state-level regulations across the United States. These laws have emerged as individual states step in where federal lawmakers have failed to enact comprehensive online protections for minors. Though the goals are similar—to safeguard children online—the methods differ widely between states.

While Apple has the resources to comply with such mandates, smaller app developers may struggle. The company noted that its new tools are intended to help those developers meet legal requirements. Some smaller startups are already feeling the impact; for example, social media platform Bluesky recently blocked its service in Mississippi, explaining it lacked the resources to comply with similar state laws.

When SB 2420 takes effect on January 1, 2026, Apple will be required to confirm whether Texas-based users are at least 18 years old. Users under 18 will need to join a Family Sharing group managed by a parent or guardian. Parents will have to give consent for all App Store downloads, purchases, and transactions through Apple’s existing in-app purchase system.

To align with the new requirements, Apple said it will enable developers to determine user age “in a privacy-preserving way.” Developers can currently use the company’s Declared Age Range API, which will be updated before the law’s effective date to include new age categories for users in Texas.

Apple also announced it will introduce new APIs later this year that allow developers to request parental consent if they make major changes to an app that alter its age rating. Parents will be able to revoke consent at any time if they decide an app is no longer appropriate for their child.

The company even acknowledged the potential household dynamics such tools could create, joking that “we can imagine this being used as a new punishment technique; no Instagram for a month!”

Apple also cautioned developers that similar age assurance laws are scheduled to take effect soon in Utah and Louisiana, urging them to prepare for additional compliance requirements.

Apple Introduces New Tools to Protect Kids and Teens on Its Devices

Apple announced a series of new initiatives Thursday aimed at helping parents and developers create safer digital environments for kids and teens. Along with simplifying the setup process for child accounts, parents will now be able to share information about their children’s ages, allowing app developers to provide more age-appropriate content.

The App Store will also feature a new set of age ratings designed to give developers and users a clearer understanding of an app’s suitability for specific age groups. Product pages for third-party apps will soon include additional details to help parents make informed decisions—such as whether the app includes ads, user-generated content, or its own parental controls.

“These updates will roll out to parents and developers later this year,” Apple said.

The changes come amid ongoing national and state-level debates over how tech companies should protect children online. Nine U.S. states, including Utah and South Carolina, have recently proposed bills requiring app store operators to verify children’s ages and obtain parental consent before minors can download apps.

Apple has long advocated for app developers to handle age verification themselves, while companies like Meta have argued that app store operators should manage the process, given their direct access to user information.

Apple’s latest system represents a middle ground. The company will collect children’s age data directly from parents while requiring third-party developers to use that information to design age-appropriate experiences.

Simpler Setup for Child Accounts

Apple’s new setup flow for child accounts—required for children under 13 and optional for minors up to 18—makes the process smoother for families.

Parents can now select their child’s age range and verify their identity by confirming an existing credit card on file, rather than re-entering payment information manually.

If a parent isn’t available during setup, the child can still start using the device. Apple will automatically apply age-based web filters and allow access only to preinstalled apps like Notes, Pages, and Keynote. Neither Apple nor developers can collect the child’s data without parental consent during this stage.

When the child first visits the App Store and attempts to download an app, they’ll receive a reminder to ask their parent to complete setup.

Once setup is complete, the child can use Apple services with the content and app restrictions defined by their parent.

New Age Range API for Developers

Instead of asking children to manually enter their birthdays, developers can now use a new Declared Age Range API to access the age range information that parents provide during account setup. Parents can correct or revoke this data at any time.

Through the API, developers receive an age range—such as 9–12 or 13–15—without learning the child’s specific birthdate.

If an app requests age information, the child will see a pop-up asking permission to share it—similar to existing prompts for camera, microphone, or location access.

Apple says this approach is “more effective,” since “kids often lie about their birthday to access an app’s full experience.”
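The privacy model described above can be sketched in a few lines of Python. This is purely illustrative: the names `ChildAccount` and `declared_age_range` are hypothetical stand-ins, not Apple's actual Swift API. The point is only the data flow the article describes, in which the platform holds the exact birthdate while an app receives a coarse bracket at most, and nothing at all if consent is withheld or revoked.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical brackets mirroring the ranges mentioned in the article.
BRACKETS = [(0, 8, "under 9"), (9, 12, "9-12"), (13, 15, "13-15"),
            (16, 17, "16-17"), (18, 200, "18+")]

@dataclass
class ChildAccount:
    birthdate: date          # known only to the platform, never shared with apps
    sharing_approved: bool   # parental/child consent, revocable at any time

def declared_age_range(account: ChildAccount, today: date) -> Optional[str]:
    """Return a coarse age bracket, or None if consent is withheld."""
    if not account.sharing_approved:
        return None          # the app learns nothing without permission
    # Exact age is computed platform-side; only the bracket leaves this function.
    age = today.year - account.birthdate.year - (
        (today.month, today.day) < (account.birthdate.month, account.birthdate.day))
    for low, high, label in BRACKETS:
        if low <= age <= high:
            return label
    return None
```

In this toy model, revoking consent simply flips `sharing_approved`, after which the same call returns nothing, matching the article's note that parents can correct or revoke the data at any time.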

Developers must opt in to use the API, but future legislation could make its adoption mandatory for certain app categories.

Expanded Age Ratings on the App Store

Apple is also updating the App Store’s existing age rating system. Currently, apps are labeled 4+, 9+, 12+, or 17+. The new framework adds more detail, breaking down teen users into 13+, 16+, and 18+ categories while keeping the younger ranges intact.

Apple says an app’s rating is determined by developer responses about its content and the intensity or frequency of that material.

“This will help parents better determine if an app their child requests is age-appropriate,” the company explained. “If content restrictions are enabled, kids are prevented from downloading or updating apps that exceed their age range.”

In addition, age-restricted apps will not appear in curated sections like Today, Games, or Apps when a child is browsing.

Several of these new features for child accounts are available in the public beta of iOS 18.4. The ability to modify a child’s age after account creation, along with the Declared Age Range API and new App Store ratings, will launch later this year.

In response to Apple’s announcement, a Meta spokesperson described the update as “a positive first step,” but noted that “developers can only apply these age-appropriate protections with a teen’s approval.”

“Parents tell us they want to have the final say over the apps their teens use,” the spokesperson added, “and that’s why we support legislation that requires app stores to verify a child’s age and get a parent’s approval before their child downloads an app.”

Wisconsin Proposals Could Criminalize VPN Use Under Age Verification Laws

MADISON, Wis. — Lawmakers in the Wisconsin state legislature are gradually advancing an age verification bill that not only targets adult entertainment websites but also includes provisions restricting the use of virtual private networks (VPNs). The most recent movement on the proposal occurred during a state Senate committee meeting on Oct. 8.

The measure to limit VPN usage stems from two companion bills, Assembly Bill 105 and Senate Bill 130, both introduced exclusively by Republican lawmakers. Since the GOP controls both chambers of the legislature, the proposal stands a strong chance of passing in some form through the Assembly and Senate.

What remains uncertain, however, is how Democratic Gov. Tony Evers will respond if the bill reaches his desk—whether he would sign it into law or veto it. Taken together, AB 105 and SB 130 represent the latest effort to criminalize or restrict the use of commercially available VPNs when used to bypass age verification systems.

In neighboring Michigan, a group of far-right legislators recently proposed an even more extreme measure: a total ban on pornography that initially sought to outlaw VPNs and proxy tools.

Rep. Josh Schriver, who sponsored House Bill 4938, titled the Anticorruption of Public Morals Act, described the bill as a “public decency and public safety solution” aimed at curbing access to what he considers harmful online content. However, following public and stakeholder backlash, Schriver announced that he would amend the measure to remove any language referencing VPNs and proxy restrictions.

The Age Verification Providers Association (AVPA), a trade group for the age-verification sector, has long been criticized for sending mixed messages about VPNs. While its executive director, Iain Corby, has said the organization does not oppose VPNs, he has also stopped short of clarifying how they factor into bypassing age verification systems.

Across the Atlantic, similar discussions are taking place in the United Kingdom, where regulators are considering age verification requirements for VPN services. Dame Rachel de Souza, the Children’s Commissioner for England, told BBC’s Newsnight program in August that current age verification provisions under the Online Safety Act are “essentially useless,” given how easy it is for minors to download and use VPNs.

“Of course, we need age verification on VPNs—it’s absolutely a loophole that needs closing, and that’s one of my major recommendations,” de Souza said during the interview.

Proposed Pennsylvania Bill Would Impose Tax on Adult Content Platforms

HARRISBURG, Pa. — A Pennsylvania lawmaker is proposing legislation that would impose an additional tax on online adult content platforms.

State Senator Marty Flynn (D–Lackawanna, Luzerne) introduced the measure, arguing that while adult content services already generate substantial revenue from Pennsylvania subscribers, the money they earn contributes little beyond the state’s standard sales and use tax.

Flynn said his proposal aims to ensure these companies pay their “fair share” to the Commonwealth. The legislation would impose a 10% tax on all subscriptions and one-time purchases made through online adult platforms. This new tax would be applied in addition to Pennsylvania’s existing 6% sales and use tax.

All revenue collected under the bill would be directed to the General Fund, where it could be used to support essential state programs and priorities.

Flynn noted that his proposal mirrors efforts in other jurisdictions that have sought to modernize their tax codes and ensure online-based businesses are taxed equitably alongside traditional industries.

EU Releases Updated Age Verification App Blueprint as Social Media Access Rules Spark Debate

The European Commission has unveiled a second version of the EU’s age verification app blueprint, as debates continue across the bloc over how to prevent children from accessing social media platforms.

Originally introduced in July 2025 as a “white label” prototype, the app was designed to work seamlessly with upcoming EU Digital Identity (EUDI) Wallets. The blueprint serves as a foundation that EU member states and private sector developers can adapt to create their own local versions of an age verification system.

The latest version introduces several new features, including the use of passports and national ID cards—in addition to electronic IDs (eIDs)—as onboarding methods to generate proof of age. It also incorporates support for the Digital Credentials API, a system designed to streamline interactions among users, service providers, and credential issuers.

According to the European Commission, the blueprint’s purpose is to “support the implementation of the Digital Services Act (DSA)” and ensure stronger protections for minors online. The app allows users to prove they are over 18 when attempting to access restricted content, such as adult websites, without revealing personal details like their full birthdate or identity.

The project is being developed by the T-Scy consortium, a partnership between Scytales AB (Sweden) and T-Systems International GmbH (Germany), which also manages stakeholder engagement across the EU.

The blueprint is already undergoing trials in several countries, including Denmark, France, Greece, Italy, and Spain, which are building their own national age verification apps based on the EU model. By the end of 2025, the Commission expects the blueprint to integrate zero-knowledge proof (ZKP) technology, which would enable verification of a user’s age without exposing any private data.

However, despite this progress, widespread adoption remains uncertain, as not all EU members are on board.

Estonia and Belgium Push Back

Two member states—Estonia and Belgium—have refused to sign the Jutland Declaration, a ministerial pledge advocating for the adoption of a digital age of majority across the EU.

The declaration calls for privacy-conscious age verification mechanisms on social media and other digital platforms to “mitigate the negative impact of illegal and inappropriate content, harmful commercial practices, addictive or manipulative design, and excessive data collection, particularly affecting minors.” It also suggests introducing a formal “digital legal age.”

Initiated by Denmark, which currently holds the rotating presidency of the EU Council, the declaration has been signed by 25 of the EU’s 27 member states, according to The Brussels Times. Denmark has made child online safety one of its top priorities during its six-month term.

FSC Clarifies California’s Device-Based Age Verification Law Excludes Adult Websites

LOS ANGELES—The Free Speech Coalition (FSC) issued a statement Wednesday clarifying that California’s newly enacted age verification law does not apply to adult entertainment websites.

California is home to a large portion of the adult entertainment industry, including studios, performers, and site operators.

In a post on its official blog, the FSC explained:

“California’s recently passed age-verification bill, AB 1043, does not apply to adult websites. The bill requires device manufacturers and app stores to collect a user’s birthdate or age and provide an API for apps (but not websites) to determine whether a user is under 18 years old. Apps must use the age information to ‘manage the delivery of age-appropriate content.’”

The FSC further noted that although the bill originally included language extending the age signal API to adult websites, that provision was removed in a last-minute amendment. The organization added that it has long supported a device-based solution to help adult sites restrict access to minors using similar technology.

The new law is scheduled to take effect on January 1, 2027.

Discord Confirms Data Breach Exposed Government ID Photos of 70,000 Users

SAN FRANCISCO — Discord, the popular chat and community platform, confirmed that one of its third-party vendors experienced a major data breach that exposed the personal information of about 70,000 users, including photos of government-issued identification cards.

The affected vendor was responsible for processing age-verification submissions and appeals on behalf of Discord. The company has not yet named the vendor but indicated that the breach was the result of a cyberattack exploiting a Zendesk instance, allegedly part of an extortion attempt targeting both the vendor and Discord.

Early reports suggested that roughly 1.5 terabytes of data were stolen—around 2.2 million images tied to age-verification records. However, Discord said the actual scope was smaller than initially claimed.

“This was not a breach of our internal systems,” a Discord spokesperson told The Verge. “The attack targeted a third-party service we use to support our customer service operations. Approximately 70,000 users may have had government-ID photos exposed, which the vendor used for age-related appeal reviews.”

The company added that all affected users have been notified. “We’ve secured the affected systems, ended our relationship with the compromised vendor, and continue to cooperate with law enforcement, data protection authorities, and external security experts,” the spokesperson said. “We take our responsibility to protect user data seriously and understand the concern this may cause.”

Discord also disclosed that other personal details—including names, usernames, email addresses, IP addresses, and the last four digits of some users’ credit cards—were included in the compromised data.

While Discord remains best known for its role in gaming culture and online communities, it has also become a hub for artists, streamers, and adult creators who use the platform to interact with fans and build digital communities. The service allows users over 18 to share adult-oriented material within designated, age-restricted spaces.

What Was “Verified,” Really? By Stan Q. Brick

As a guy who crossed the magic line of his 18th birthday over 35 years ago, it has been a damn long time since I was last asked to present identification documents as part of purchasing any age-restricted product.

More accurately, I should say it had been a damn long time – until last week, when I tried to log in to the members area of a porn website of which I’ve been a member for several months now.

Rather than simply being prompted to enter my username and password, I was presented with a dialog box informing me that before I could gain access to the site in question – a site to which I’ve already prepaid for nearly 90 more days of access, by virtue of a billing rollover that took place weeks ago – I needed to verify my age.

This struck me as odd and more than a little irritating. I was aware my home state is among those that have passed an age verification law directed at porn websites, but I had assumed existing customers, particularly those whose credit cards had been successfully billed several times already by the merchant involved, might somehow be “grandfathered in,” at least with respect to members’ area access.

No such luck, though. If I wanted to continue to access this site – in other words, if I wanted to receive the full benefit of the membership I’d already paid for – I would have to do business with whatever third-party service they’ve employed to perform the act of age verification on the site’s behalf, as well.

My immediate reaction was to close the browser, so I could weigh the question of whether to continue as a member of the site, cancel my account, or cancel my account and demand a refund. Nowhere in the agreement I ‘signed’ as part of joining the site did it state I’d have to do business with a third party to maintain future access to the site. Foisting that requirement on me without notice seemed dicey.

The first decision I made was not to act at all, right then. Among other things, the unexpected access-block had pissed me off a bit, and anger is never a good frame of mind for making decisions. I joined the site because I like the content they make and because I like watching it; should requiring me to show my ID really be so off-putting as to make me cancel, let alone demand a refund?

I sat on the decision for a couple of days, straddling the fence on whether I’d jump through the age verification hoop that had been presented to me. Finally, I decided it made sense to see what the process required, how invasive it was of my privacy – and how effective or ineffective it seemed toward the stated end goal of verifying users’ ages and deterring minors from accessing the site. I could always back out before submitting anything, I reckoned.

The site in question offered only one option for an age verification service, one based in the United Kingdom. The system informed me that to verify my age, I’d need to upload a scan of one of several government-issued forms of ID: a driver’s license, a state ID card, a passport, or a military ID. It also referenced the possibility of uploading a selfie in which I’d be holding the ID – so my face could be compared to the one on the ID, presumably.

I wasn’t thrilled about doing any of this, for a variety of reasons. For starters, I don’t trust the promises from these third parties to not retain any of my “personally identifiable information.” I believe most online companies will look for every means available to monetize any piece of data they collect on their users (and seek every loophole in every law preventing them from doing so), and my assumption is that companies offering age verification services will be no different from their peers in that regard. And if such companies collect and store this data, malicious hackers will access it eventually, rest assured.

Beyond privacy concerns, I kept thinking about the lack of notice involved here. One day I’m a member of a porn site who can log in any time and check out the latest updates, then the next day, I’m forced to hand over my name, contact information and ID to some company out of the UK, just for the honor of accessing content I’d already paid to access? Even if that’s not an illegal or tortious arrangement, such a transition certainly doesn’t feel right.

Ultimately, despite my reservations, I decided to go ahead with the age verification process. As much as anything, I was now curious to see just how onerous it was and what all it would require of me.

A funny thing happened, though: after uploading a photo of my ID, I was told I’d been verified and could now continue to the members area – no selfie required, no further personal information, just the email address I’d already given them on the previous page of the form and the scan of my ID.

Maybe I should be pleased by the fact I didn’t have to upload a selfie, but instead I’m struck by the pointlessness of it all. All this service had done was verify that someone had uploaded a driver’s license belonging to a man in his 50s; in no way had it established that the man in his 50s was the one who uploaded it.

The good news, I suppose, is that now I have access to the content for which I’d already paid. The bad news is… well, the bad news is unknowable, really. But when the bad news comes, with it may come answers to several questions I now have.

How many members of this same site will opt to cancel their memberships, or demand refunds, as I considered doing, rather than go through with the age verification process?

How many minors will find out about how easy it is to circumvent the age verification process of this age verification vendor?

Is this vendor truly not storing age verification documents? If they are storing such documents, will I learn that’s the case via an extortionate email threatening to reveal my porn preferences to my employer or family members?

But the biggest question, at least as I sit here right now typing, is this one: Through this age verification process, what was “verified,” exactly?

Exactly. I don’t know, either.

California Governor Signs Device-Level Age Verification Bill Into Law

LOS ANGELES — California Gov. Gavin Newsom, a Democrat, signed into law Assembly Bill (AB) 1043 on Monday, implementing an age verification regime that requires users’ ages to be verified at the device operating system and/or app store level when setting up a new phone, tablet, or computer.

The bill, known as the Digital Age Assurance Act, covers all major operating systems, including Google’s Android and Apple’s iOS. Set to take effect on Jan. 1, 2027, it requires the companies that own and operate these systems to develop a mechanism allowing users to enter and confirm their ages by the summer of that year.

This also means that age verification must occur when users download or purchase apps and content from Google Play or the Apple App Store.

Under the legislation, violations could cost companies up to $2,500 per affected child, with intentional violations climbing to $7,500 per child. The law also “shields” companies from liability for so-called “erroneous age signals” as long as they make a good-faith effort to comply.

Erroneous signals may arise from the use of virtual private networks (VPNs) and other proxy tools designed to bypass age restrictions online.

In addition, the law introduces new safety requirements for digital platforms and services, including measures to prevent suicide and self-harm, clear warnings about social media and AI-powered chatbots, and tougher penalties for profiting from unlawful deepfakes.

“Emerging technology like chatbots and social media can inspire, educate, and connect—but without real guardrails, technology can also exploit, mislead, and endanger our kids,” Gov. Newsom said in a statement issued by his office. He continued, “We’ve seen some truly horrific and tragic examples of young people harmed by unregulated tech, and we won’t stand by while companies continue without necessary limits and accountability.

“We can continue to lead in AI and technology, but we must do it responsibly—protecting our children every step of the way. Our children’s safety is not for sale.”

Beyond device-level age verification, the legislation mandates warning labels on social media platforms to alert young users to the potential risks of excessive use.

It also strengthens penalties for deepfake pornography, allowing victims of non-consensual deepfakes to pursue civil damages of up to $250,000 against individuals who knowingly distribute such material.

“These bills establish guardrails that protect our children’s health and safety while ensuring innovation moves forward responsibly, showing that we can have both at once, always with future generations in mind,” said Jennifer Siebel Newsom, the First Partner of California.

Industry stakeholders have long touted device-level age verification as a potential compromise to existing age-gating systems, which typically occur at the website or platform level. Aylo, the parent company of Pornhub.com, has previously endorsed device-level verification as a privacy-preserving alternative to ID uploads and facial scans.
