The War on Porn

The Age Verification Era Is Here

The age verification era has already arrived. Across the United States, Europe and other regions, governments are moving from debate to enforcement. New rules are active in several U.S. states and in countries such as the U.K. and France, while regulators and payment processors now expect full compliance. For adult platforms, the message is clear: meet the requirements or risk losing access to financial partners and audiences.

In the United Kingdom, the media regulator Ofcom has taken the lead on enforcement under the Online Safety Act, where the deadline for installing “robust age checks” has already passed.

“We’re seeing more and more services comply every day,” says an Ofcom spokesperson. “We will continue to work constructively with providers who are trying to comply to protect U.K. users. But services that do not comply can expect enforcement action.”

Because every jurisdiction defines its own rules and terminology — “effective age assurance,” for example — businesses often face confusing and inconsistent expectations. Large companies can dedicate compliance teams and budgets to meet those standards, but smaller sites and independent creators are left struggling to interpret vague mandates with limited resources.

Yet compliance doesn’t necessarily mean closing a business or draining finances. With the right approach, platforms can stay lawful, protect their income, and continue operating safely within this new regulatory landscape.

When Small Operators Face Big Rules

Free Speech Coalition board member Megan Stokes stresses that many underestimate how broadly these laws are written.

“One common misconception is that these laws only apply to the ‘big players,’” she notes. “In reality, the language is so broad that even an independent creator with a small personal site could be at risk. At FSC, we work with independent performers who run small branded sites alongside their fan platforms; if those sites aren’t using compliant verification, they could personally get pulled into a lawsuit.”

Jonathan Corona, COO of MobiusPay, echoes that view from a financial compliance perspective.

“AV mandates are applied to everyone equally,” he reports. “It doesn’t matter what size your business is.”

According to Stokes, the outcome is a growing divide between those who can afford compliance infrastructure and those who cannot.

“The wave of state-level AV laws has really created a two-tier system,” she explains. “Larger companies have the budget and legal teams to adapt quickly, whether that means paying for enterprise verification vendors or building complex compliance systems. Smaller operators don’t have those resources, so even the threat of one lawsuit can be absolutely devastating.

“For example, we see both independent creators and boutique studios who operate their own sites struggling to even understand what ‘reasonable’ compliance looks like in practice,” she continues. “On the other hand, the larger platforms they compete with simply hand the problem to a vendor.

“This goes far beyond just compliance,” Stokes adds. “At the end of the day, it’s more about who can even afford to stay in the marketplace. Big companies can adapt. For smaller operators, one lawsuit could completely shut doors. Unfortunately, these laws risk turning compliance into a privilege of size.”

Industry attorney Corey Silverstein argues that this imbalance is not coincidental.

“Sadly, all of these governments knew that these age verification laws would immediately put smaller operators out of business, because that is exactly what they were aiming to accomplish,” Silverstein says. “Lawmakers can continue to claim that AV is about protecting children, but I’m not buying that. These laws were meant to control free speech and sadly, that is exactly what is happening.”

Legal Hazards and Unclear Boundaries

For many website owners, the biggest concern isn’t the concept of age verification itself — it’s the legal uncertainty surrounding it. The wording of these mandates can be vague, making it easy to misinterpret obligations or underestimate enforcement risk.

“It is a mistake to think that simple age gates or geoblocking solve the problem,” says Stokes. “They don’t. VPNs make location checks unreliable and some states set very strict standards for what counts as ‘reasonable’ verification. If operators assume the old tools are good enough, they’re leaving themselves exposed to lawsuits.”

Mike Stabile, Director of Public Policy at the Free Speech Coalition, agrees that confusion is widespread.

“What counts as compliance is often vague and contradictory, and even companies with in-house compliance teams struggle to make sense of these laws,” he says.

Attorney Larry Walters warns that ignoring or mishandling AV rules carries significant risk.

“The most significant legal risk for smaller website operators and creators is the potential for being targeted by either a civil claimant or government agency,” Walters explains.

While most lawsuits have so far targeted major platforms, some smaller sites have already been named.

“Smaller operators are, in some ways, at greater risk since they may have fewer resources to defend a claim or pay a large monetary judgment,” he observes.

With little legal precedent, Walters adds, it’s uncertain whether liability could fall on both corporate entities and individual owners — or whether such judgments could even be discharged through bankruptcy.

Silverstein agrees that every operator must take compliance seriously.

“There is no room for anyone to misinterpret or ignore AV regulations,” he warns. “Additionally, some people are making the mistake of thinking that all of the AV laws are the same — they are not. Each state and country’s AV laws have many nuances that make them distinct from one another.”

Both attorneys emphasize that complexity and cost are not legal defenses. Operators must audit traffic patterns, document compliance measures and seek legal advice.

“All operators need to be conducting substantive traffic audits and consulting with an attorney to discuss how they may specifically be at risk,” Silverstein cautions.

Stabile adds that help is available.

“If you’re panicking, the FSC is here as a resource,” he says. “We keep a running list of what laws are in effect and coming into effect in our Action Center, and we provide links to the legislation to help you understand the specifics of the law. That’s available for everyone in the industry, whether you’re an FSC member or not.”

The Coalition’s “AV Tool Kit” is designed to simplify this process.

“The tool kit is meant to be a practical guide, not just a policy overview,” explains Stokes. “Webmasters can use it to see, state by state, where their biggest risks are and what methods are recognized. Think of it as both a map and a checklist; it helps you see your risks clearly and adjust as the laws evolve. The key is to treat it as a living resource and check back regularly, since the laws are changing so quickly.”

Walters emphasizes that the laws demand full adherence:

“None of these state AV laws has a ‘good faith’ defense built in. Actual compliance is expected. Some laws contain prior notification requirements and an opportunity to cure before a claim can be asserted, but these are the exception to the rule.”

Silverstein is even more direct.

“Government agencies and private plaintiffs don’t care how hard someone tries to be in compliance with AV laws,” he says. “They will always take the position that anything short of 100% compliance is not compliance. Thus, webmasters should be documenting the agreements that they are entering into with third-party AV providers — and ensuring that they have thoroughly reviewed the AV provider and its product before signing up.”

Walters advises keeping detailed records on when AV tools were implemented, where they apply and how user data is handled.

“Any data that it is illegal to retain under these AV laws should be immediately and permanently deleted,” he warns.

Silverstein reinforces the importance of data protection.

“Securing, encrypting and limiting data access are key steps,” he says. “But far too many small sites are using inadequate practices to store sensitive data. A few key recommendations are: 1) Do not collect any data that you don’t absolutely need, 2) Do not keep data any longer than is necessary, 3) Data security is your responsibility, so you need to understand all of your data handling systems regardless of whether it’s done by a human being or some type of automated process.”

Payment Processing and Compliance

Even the most compliant website can’t function without payment processing. Credit card acquirers and banks now require proof that sites meet regional age verification rules before approving or maintaining accounts.

“Both Visa and Mastercard require compliance with jurisdictional age verification laws,” explains Cathy Beardsley, CEO of Segpay and FSC board member. “All new programs and any new URL submitted to a bank for approval will require compliance with age verification. The banks will be checking to make sure your site is compliant.”

Corona offers an example of how geography can complicate things.

“For example, if a company is based in a state that does not have an AV law on the books, but the customer is accessing the website from a state that does have one, then the company would need to either comply with the law or block access to their site from that state,” he notes.
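The routing logic Corona describes can be sketched in a few lines. This is an illustrative sketch only: the state list and lookup are hypothetical, and as noted elsewhere in this piece, IP-based location checks are unreliable against VPNs.

```python
# Hypothetical sketch: route visitors based on the AV status of the visitor's
# state, not the operator's home state. The state list is illustrative only.

AV_STATES = {"TX", "LA", "UT"}  # states assumed, for illustration, to have AV laws

def route_visitor(visitor_state: str, av_available: bool) -> str:
    """Decide how to handle a visitor from a given U.S. state.

    visitor_state -- two-letter code from a geolocation lookup (VPNs defeat this)
    av_available  -- whether the site has an age verification flow integrated
    """
    if visitor_state not in AV_STATES:
        return "allow"        # no AV mandate applies to this visitor
    if av_available:
        return "age_verify"   # comply: send the visitor through AV first
    return "block"            # no AV integration: geoblock to limit exposure
```

In this sketch, an operator with no AV integration simply blocks mandate states, matching the either/or choice Corona outlines.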

Both Beardsley and Corona recommend formal written policies for handling compliance, which can reassure banks and card brands.

“Having a clear policy and procedure for how the company addresses AV requirements is the first step,” says Corona. “Regardless of whether the company is integrating to a third-party service to maintain compliance or blocking states that require age verification, having a solid P&P to show the processing patterns, sponsor banks and card brands will go a long way in demonstrating responsible operations.”

Beardsley suggests another practical workaround.

“Offering safe-for-work tours and then adding age verification after checkout is a good way to ensure the sale and not waste funds on an AV check if the consumer fails to follow through with the checkout process or if the transaction is declined.”

She adds that businesses can include verification costs in subscription pricing or turn to open-source tools to minimize expense.

Noncompliance, however, can be catastrophic.

Sites flagged for violations may lose payment processing immediately until problems are resolved.

“The company could face a simple reprimand and be tasked with remediation, such as implementing an AV service or demonstrating that the site is blocked from the specific jurisdictions that require AV,” Corona says. “On the extreme end, however, the company can face termination.”

Both executives underscore that proactive communication is key.

“We take a hands-on approach when advising our clients and ensure that they are complying with card brand regulations before putting the application forward for concurrence,” Corona explains.

The takeaway is simple: staying compliant with AV laws isn’t just a legal requirement — it’s critical for maintaining payment relationships.

Technology Providers and Practical Solutions

A growing ecosystem of technology vendors now offers tools to make age verification possible for sites of any size. Companies like VerifyMy, Incode and Yoti are developing privacy-preserving systems tailored to international standards.

“One of the biggest challenges for webmasters is keeping up with the rapidly evolving regulatory landscape,” notes Andy Lulham, COO of VerifyMy. “New age verification laws are being introduced at pace and requirements often vary from country to country and even state to state.”

He adds that usability can make or break a compliance program.

“Age verification isn’t one-size-fits-all,” he says. “What works for one site or audience may not work for another. If the process is cumbersome or invasive, it can frustrate users, increase drop-off rates and directly impact revenue.”

Milo Flores from Incode says their framework revolves around three pillars: data minimization, encryption, and automatic data deletion.

“Systems that return only an ‘over 18’ result without exposing identity details show users that platforms aren’t collecting or keeping unnecessary personal data,” he explains.

Flores and other developers point to reusable credentials — where users store age tokens or ID wallets — as an accessible option for smaller sites.

“For users, they deliver more privacy and control,” says Flores. “Credentials can be stored in a secure wallet and shared only when the user chooses. Passkeys can be safely kept on the user’s device, making it easy to prove they’ve already completed an age check without exposing new data.”
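The minimal-disclosure result Flores describes can be illustrated with a short sketch: an issuer signs a token whose only claims are an over-18 flag and an expiry, so a site that later checks the token never sees a name or birthdate. The signing scheme, key handling and claim names here are illustrative assumptions, not any vendor's actual product.

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-signing-key"  # illustrative only; a real issuer uses managed keys

def issue_age_token(over_18: bool, ttl_seconds: int = 3600) -> str:
    """Sign a minimal claim: an over-18 flag and an expiry, nothing identifying."""
    claim = {"over_18": over_18, "exp": int(time.time()) + ttl_seconds}
    payload = base64.urlsafe_b64encode(json.dumps(claim).encode()).decode()
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def check_age_token(token: str) -> bool:
    """Verify signature and expiry; the site learns only 'over 18 or not'."""
    payload, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # tampered or forged token
    claim = json.loads(base64.urlsafe_b64decode(payload))
    return claim["over_18"] and claim["exp"] > time.time()
```

The point of the design is in what is absent: the claim carries no name, address or document image, so storing or replaying the token exposes nothing identifying.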

Yoti’s representative agrees that reusable solutions lower friction for both users and site owners.

“Users will increasingly need to prove their age for sites in a number of sectors, including adult content, gaming, social media and dating,” the rep says. “The lower friction decreases cost for smaller operators. Plus, reusable tools allow small businesses to compete with larger platforms on trust and compliance — without having to become identity experts themselves.

“In the first months of the U.K. Online Safety Act, 25% of users on adult content sites are choosing to prove their age with a reusable age assurance option, including with reusable ID wallets or by setting up an age token,” the rep adds.

Flores advocates an adaptive “waterfall” model that starts with the least intrusive method and escalates only when necessary.

“This adaptive model means webmasters don’t have to juggle multiple vendors or build complex workflows themselves,” he says.
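The waterfall approach can be sketched as an ordered list of checks, from least to most intrusive, stopping at the first decisive result. The method names, signals and confidence threshold below are illustrative assumptions, not Incode's actual pipeline.

```python
from typing import Callable, Optional

# Stub signals, illustrative only: each returns an estimated probability that
# the user is an adult, or None when it cannot give a confident answer.
def email_signal(user: dict) -> Optional[float]:
    return None  # e.g. an email-based estimate that came back inconclusive

def facial_estimate(user: dict) -> Optional[float]:
    return user.get("face_score")  # confidence from facial age estimation

def document_check(user: dict) -> Optional[float]:
    return 1.0 if user.get("id_ok") else 0.0  # ID document verification

WATERFALL: list[Callable[[dict], Optional[float]]] = [
    email_signal, facial_estimate, document_check,
]
THRESHOLD = 0.95  # assumed confidence bar; a regulator may demand more

def verify_age(user: dict) -> bool:
    """Escalate through methods, least intrusive first, until one is decisive."""
    for method in WATERFALL:
        score = method(user)
        if score is None:
            continue               # inconclusive: escalate to the next method
        return score >= THRESHOLD  # decisive: more intrusive steps are skipped
    return False                   # nothing decisive: fail closed
```

A user who passes facial estimation never reaches the document check, which is the friction and privacy win the adaptive model is after.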

Looking Ahead: Control, Privacy and Consequences

While AV vendors highlight innovation, critics warn that the broader implications extend far beyond compliance.

“The answer to where AV technology is headed is nuanced, depending upon what you see as its true intention: protecting children or punishing adults,” says digital media analyst Stephen Yagielowicz.

Leaning toward the latter view, he calls AV “not a problem to be solved, but a process to be managed.”

“Not to detract from the very real need to safeguard the innocence of youth, but I suspect more sinister motives are behind the recent regulatory maneuvering,” he contends. “The elimination of personal privacy is the ultimate goal of these initiatives. This is why partisans and regulators so oppose device-level blocking, since anyone could ‘borrow’ a device or otherwise gain access to it — including consenting adults whose identity would then be masked.

“Age verification ‘to protect the children’ is merely the sugar coating intended to make the demise of online privacy more palatable,” Yagielowicz adds.

Despite those concerns, Yagielowicz acknowledges potential technological benefits ahead.

“Improved biometrics could be a positive development in terms of accuracy and verification speed, but the data security implications are profound,” he warns. “This is where blockchain technology can play a role by decentralizing and securing identity information from casual perusal, even if that data is available to governments. Technology is not a silver bullet against the challenges of adequate age assurance, but it will broaden the availability of tools and other resources and this can ease compliance.

“The winning pattern is data-minimizing, auditable and swappable across jurisdictions — aligned to how U.K./EU regulators describe ‘effective’ age assurance and transparency,” he concludes. “It won’t be easy, but it will be necessary.”

The Takeaway

“Necessary” may be the single most accurate word to describe this new landscape. Age verification has become standard policy in multiple regions, with regulators, financial networks and courts all enforcing compliance.

But while the challenges are significant, smaller platforms and independent creators are not powerless. Through careful documentation, use of verified vendors, and reliance on evolving privacy-protective technologies, they can continue to operate safely and sustainably in a fast-changing regulatory world.

The Big Chill: If You Can Get People to Self-Censor, You Don’t Need a Ban By Morley Safeword

A couple years back, when many of the state laws requiring age verification on the part of adult websites were merely proposals floating around in state legislatures, a state legislator from one of the few states that already had such a law on its books publicly noted that Pornhub had begun blocking all traffic from the state, which meant the law was “already working” as intended.

I remember seeing a clip online of the legislator saying this and wondering if she knew just how right she was. The law she was talking about may have been presented as a measure to deter kids from accessing online porn, but in truth, it was about making it harder for anyone to access online porn.

While you might not realize it if you were to review some of the blatantly unconstitutional stuff they dream up, most state legislators do know they can’t simply ban speech they don’t like. They know this is true whether the speech they dislike is sexually explicit material like porn, speech intended to “annoy” or “offend” people who share their sensibilities or speech from Jimmy Kimmel.

Legislators, censorious activists and other vile creatures of darkness also know the next best thing to a ban is an effective effort to cow people into silence, whether by criminal law or civil sanction.

When attorneys and scholars who are familiar with the First Amendment and free speech issues discuss these things, you’ll often hear them reference the “chilling effect,” which in the context of legislation refers to “government unduly deterring free speech and association rights through laws, regulations or actions that appear to target activities protected by the First Amendment,” as the Free Speech Center at MTSU puts it.

When the Supreme Court recently upheld the Texas law requiring websites that offer over a certain amount of content deemed to be “harmful to minors” to verify the age of users before displaying any such content, the court effectively said the chilling effect of the Texas law is insubstantial, easily outweighed by the government’s “compelling interest” in deterring minors from accessing pornography.

A casual observer might think: “Good! Why shouldn’t adult sites be required to do the same thing the store on the corner has to do before selling someone porn?” But the casual observer maybe hasn’t thought this one all the way through.

The casual observer probably hasn’t considered the difference between briefly flashing your ID at a bored store clerk who likely didn’t give it much of a look anyway before waving you onward, and uploading a copy of your government-issued ID, with your home address on it, to some third-party age verification service about which you know nothing.

The casual observer also may not have thought much about the websites that aren’t even arguably “porn sites,” but could still host enough content deemed “harmful to minors” to be covered by the law.

Worse still, this chilling effect is attached to a measure that isn’t particularly effective at its stated purpose. When Louisiana’s law was first passed, it took a matter of minutes for users to find a workaround – one that didn’t even require knowing what the acronym “VPN” signifies.

As a beloved old teacher of mine used to say about things of dubious value, these laws appear to be “worth their weight in sawdust” – except when it comes to their chilling effect.

As Hannah Wohl, an associate professor of Sociology at the University of California, Santa Barbara, put it in a post for The Hill, age verification laws “fail to protect minors and threaten the free speech of all Americans in ways that go far beyond pornography.”

“This chilling effect is by design,” Wohl added. “Pornography has long been the canary in the coal mine for other restrictions against free speech. Indeed, some conservatives have acknowledged that age verification laws are a back door to fully criminalizing pornography.”

No, age verification laws are not “blocks” or “bans” on pornography. But they certainly are barriers – and we’re not supposed to go around erecting barriers to protected speech in America, willy-nilly. There’s supposed to be a damn good reason (that “compelling interest” of the government’s), some indication that the law serves its purpose in furthering that interest, and an assurance that the law doesn’t burden too much speech that’s legal for adults to consume in restricting minors’ access to that same speech.

Or, as Vera Eidelman, senior staff attorney with the ACLU Speech, Privacy and Technology Project, put it when talking about the Supreme Court’s decision on Free Speech Coalition v. Paxton: “With this decision, the court has carved out an unprincipled pornography exception to the First Amendment. The Constitution should protect adults’ rights to access information about sex online, even if the government thinks it is too inappropriate for children to see.”

Unfortunately, to many of those who support and advocate for these laws, the chilling effect bemoaned by their critics is a design feature, not a bug.

Apple Raises Privacy Concerns Over Texas Age Verification Law

Apple on Thursday outlined how it plans to comply with a new Texas law, SB 2420, which introduces strict age assurance requirements for app stores and app developers.

While Apple had already begun rolling out its own age verification tools earlier this year in anticipation of regulatory changes, the company voiced significant privacy concerns regarding the Texas legislation.

In a statement to developers, Apple said, “…we are concerned that SB2420 impacts the privacy of users by requiring the collection of sensitive, personally identifiable information to download any app, even if a user simply wants to check the weather or sports scores.”

The Texas law is part of a growing wave of state-level regulations across the United States. These laws have emerged as individual states step in where federal lawmakers have failed to enact comprehensive online protections for minors. Though the goals are similar—to safeguard children online—the methods differ widely between states.

While Apple has the resources to comply with such mandates, smaller app developers may struggle. The company noted that its new tools are intended to help those developers meet legal requirements. Some smaller startups are already feeling the impact; for example, social media platform Bluesky recently blocked its service in Mississippi, explaining it lacked the resources to comply with similar state laws.

When SB 2420 takes effect on January 1, 2026, Apple will be required to confirm whether Texas-based users are at least 18 years old. Users under 18 will need to join a Family Sharing group managed by a parent or guardian. Parents will have to give consent for all App Store downloads, purchases, and transactions through Apple’s existing in-app purchase system.

To align with the new requirements, Apple said it will enable developers to determine user age “in a privacy-preserving way.” Developers can currently use the company’s Declared Age Range API, which will be updated before the law’s effective date to include new age categories for users in Texas.

Apple also announced it will introduce new APIs later this year that allow developers to request parental consent if they make major changes to an app that alter its age rating. Parents will be able to revoke consent at any time if they decide an app is no longer appropriate for their child.

The company even acknowledged the potential household dynamics such tools could create, joking that “we can imagine this being used as a new punishment technique; no Instagram for a month!”

Apple also cautioned developers that similar age assurance laws are scheduled to take effect soon in Utah and Louisiana, urging them to prepare for additional compliance requirements.

Apple Introduces New Tools to Protect Kids and Teens on Its Devices

Apple announced a series of new initiatives Thursday aimed at helping parents and developers create safer digital environments for kids and teens. Along with simplifying the setup process for child accounts, parents will now be able to share information about their children’s ages, allowing app developers to provide more age-appropriate content.

The App Store will also feature a new set of age ratings designed to give developers and users a clearer understanding of an app’s suitability for specific age groups. Product pages for third-party apps will soon include additional details to help parents make informed decisions—such as whether the app includes ads, user-generated content, or its own parental controls.

“These updates will roll out to parents and developers later this year,” Apple said.

The changes come amid ongoing national and state-level debates over how tech companies should protect children online. Nine U.S. states, including Utah and South Carolina, have recently proposed bills requiring app store operators to verify children’s ages and obtain parental consent before minors can download apps.

Apple has long advocated for app developers to handle age verification themselves, while companies like Meta have argued that app store operators should manage the process, given their direct access to user information.

Apple’s latest system represents a middle ground. The company will collect children’s age data directly from parents while requiring third-party developers to use that information to design age-appropriate experiences.

Simpler Setup for Child Accounts

Apple’s new setup flow for child accounts—required for children under 13 and optional for minors up to 18—makes the process smoother for families.

Parents can now select their child’s age range and verify their identity by confirming an existing credit card on file, rather than re-entering payment information manually.

If a parent isn’t available during setup, the child can still start using the device. Apple will automatically apply age-based web filters and allow access only to preinstalled apps like Notes, Pages, and Keynote. Neither Apple nor developers can collect the child’s data without parental consent during this stage.

When the child first visits the App Store and attempts to download an app, they’ll receive a reminder to ask their parent to complete setup.

Once setup is complete, the child can use Apple services with the content and app restrictions defined by their parent.

New Age Range API for Developers

Instead of asking children to manually enter their birthdays, developers can now use a new Declared Age Range API to access the age range information that parents provide during account setup. Parents can correct or revoke this data at any time.

Through the API, developers receive an age range—such as 9–12 or 13–15—without learning the child’s specific birthdate.
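The data-minimization principle at work here, sharing a coarse bucket rather than an exact birthdate, can be sketched in a few lines. This illustrates the concept only and is not Apple's implementation; the 9-12 and 13-15 buckets come from the article, while the remaining edges are assumed.

```python
def age_range(age: int) -> str:
    """Map an exact age to a coarse range so the exact value is never shared.

    The 9-12 and 13-15 buckets appear in the article; the other boundaries
    are assumed for illustration.
    """
    for low, high in [(0, 8), (9, 12), (13, 15), (16, 17)]:
        if low <= age <= high:
            return f"{low}-{high}"
    return "18+"
```

The app asking the question receives only the bucket label, so even a compromised app cannot leak a birthdate it never held.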

If an app requests age information, the child will see a pop-up asking permission to share it—similar to existing prompts for camera, microphone, or location access.

Apple says this approach is “more effective,” since “kids often lie about their birthday to access an app’s full experience.”

Developers must opt in to use the API, but future legislation could make its adoption mandatory for certain app categories.

Expanded Age Ratings on the App Store

Apple is also updating the App Store’s existing age rating system. Currently, apps are labeled 4+, 9+, 12+, or 17+. The new framework adds more detail, breaking down teen users into 13+, 16+, and 18+ categories while keeping the younger ranges intact.

Apple says an app’s rating is determined by developer responses about its content and the intensity or frequency of that material.

“This will help parents better determine if an app their child requests is age-appropriate,” the company explained. “If content restrictions are enabled, kids are prevented from downloading or updating apps that exceed their age range.”

In addition, age-restricted apps will not appear in curated sections like Today, Games, or Apps when a child is browsing.

Several of these new features for child accounts are available in the public beta of iOS 18.4. The ability to modify a child’s age after account creation, along with the Declared Age Range API and new App Store ratings, will launch later this year.

In response to Apple’s announcement, a Meta spokesperson described the update as “a positive first step,” but noted that “developers can only apply these age-appropriate protections with a teen’s approval.”

“Parents tell us they want to have the final say over the apps their teens use,” the spokesperson added, “and that’s why we support legislation that requires app stores to verify a child’s age and get a parent’s approval before their child downloads an app.”

Wisconsin Proposals Could Criminalize VPN Use Under Age Verification Laws

MADISON, Wis. — Lawmakers in the Wisconsin state legislature are gradually advancing an age verification bill that not only targets adult entertainment websites but also includes provisions restricting the use of virtual private networks (VPNs). The most recent movement on the proposal occurred during a state Senate committee meeting on Oct. 8.

The measure to limit VPN usage stems from two companion bills, Assembly Bill 105 and Senate Bill 130, both introduced exclusively by Republican lawmakers. Since the GOP controls both chambers of the legislature, the proposal stands a strong chance of passing in some form through the Assembly and Senate.

What remains uncertain, however, is how Democratic Gov. Tony Evers will respond if the bill reaches his desk—whether he would sign it into law or veto it. Taken together, AB 105 and SB 130 represent the latest effort to criminalize or restrict the use of commercially available VPNs when used to bypass age verification systems.

In neighboring Michigan, a group of far-right legislators recently proposed an even more extreme measure: a total ban on pornography that initially sought to outlaw VPNs and proxy tools.

Rep. Josh Schriver, who sponsored House Bill 4938, titled the Anticorruption of Public Morals Act, described the bill as a “public decency and public safety solution” aimed at curbing access to what he considers harmful online content. However, following public and stakeholder backlash, Schriver announced that he would amend the measure to remove any language referencing VPNs and proxy restrictions.

The Age Verification Providers Association (AVPA), a trade group for the age-verification sector, has long been criticized for sending mixed messages about VPNs. While its executive director, Iain Corby, has said the organization does not oppose VPNs, he has also stopped short of clarifying how they factor into bypassing age verification systems.

Across the Atlantic, similar discussions are taking place in the United Kingdom, where regulators are considering age verification requirements for VPN services. Dame Rachel de Souza, the Children’s Commissioner for England, told BBC’s Newsnight program in August that current age verification provisions under the Online Safety Act are “essentially useless,” given how easy it is for minors to download and use VPNs.

“Of course, we need age verification on VPNs—it’s absolutely a loophole that needs closing, and that’s one of my major recommendations,” de Souza said during the interview.

Proposed Pennsylvania Bill Would Impose Tax on Adult Content Platforms

HARRISBURG, Pa. — A Pennsylvania lawmaker is proposing new legislation that would add an additional tax on online adult content platforms.

State Senator Marty Flynn (D–Lackawanna, Luzerne) introduced the measure, arguing that while adult content services already generate substantial revenue from Pennsylvania subscribers, the money they earn contributes little beyond the state’s standard sales and use tax.

Flynn said his proposal aims to ensure these companies pay their “fair share” to the Commonwealth. The legislation would impose a 10% tax on all subscriptions and one-time purchases made through online adult platforms. This new tax would be applied in addition to Pennsylvania’s existing 6% sales and use tax.
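As a rough illustration of how the proposed levy would stack on top of the existing sales tax, the arithmetic can be sketched as follows. This assumes both taxes apply to the same pre-tax price; how the two taxes actually combine would be governed by the final bill text.

```python
# Illustrative sketch of the proposed Pennsylvania tax stack on adult
# platform purchases. Assumes both taxes apply to the pre-tax price;
# the enacted bill text would govern how they actually combine.

PROPOSED_ADULT_CONTENT_TAX = 0.10  # 10% proposed levy
PA_SALES_AND_USE_TAX = 0.06        # existing 6% sales and use tax

def total_charge(pre_tax_price: float) -> float:
    """Return the total a subscriber would pay under the proposal."""
    combined_rate = PROPOSED_ADULT_CONTENT_TAX + PA_SALES_AND_USE_TAX
    return round(pre_tax_price * (1 + combined_rate), 2)

# Example: a $19.99 monthly subscription
# pre-tax price:     $19.99
# combined 16% tax:  $ 3.20 (rounded)
# total charged:     $23.19
print(total_charge(19.99))
```

For production billing code, fixed-point arithmetic (e.g. Python's `decimal` module) would be preferable to floats, but the sketch above is enough to show the combined 16% burden the bill contemplates.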

All revenue collected under the bill would be directed to the General Fund, where it could be used to support essential state programs and priorities.

Flynn noted that his proposal mirrors efforts in other jurisdictions that have sought to modernize their tax codes and ensure online-based businesses are taxed equitably alongside traditional industries.

EU Releases Updated Age Verification App Blueprint as Social Media Access Rules Spark Debate

The European Commission has unveiled a second version of the EU’s age verification app blueprint, as debates continue across the bloc over how to prevent children from accessing social media platforms.

Originally introduced in July 2025 as a “white label” prototype, the app was designed to work seamlessly with upcoming EU Digital Identity (EUDI) Wallets. The blueprint serves as a foundation that EU member states and private sector developers can adapt to create their own local versions of an age verification system.

The latest version introduces several new features, including the use of passports and national ID cards—in addition to electronic IDs (eIDs)—as onboarding methods to generate proof of age. It also incorporates support for the Digital Credentials API, a system designed to streamline interactions among users, service providers, and credential issuers.

According to the European Commission, the blueprint’s purpose is to “support the implementation of the Digital Services Act (DSA)” and ensure stronger protections for minors online. The app allows users to prove they are over 18 when attempting to access restricted content, such as adult websites, without revealing personal details like their full birthdate or identity.

The project is being developed by the T-Scy consortium, a partnership between Scytales AB (Sweden) and T-Systems International GmbH (Germany), which also manages stakeholder engagement across the EU.

The blueprint is already undergoing trials in several countries, including Denmark, France, Greece, Italy, and Spain, which are building their own national age verification apps based on the EU model. By the end of 2025, the Commission expects the blueprint to integrate zero-knowledge proof (ZKP) technology, which would enable verification of a user’s age without exposing any private data.

However, despite this progress, widespread adoption remains uncertain, as not all EU members are on board.

Estonia and Belgium Push Back

Two member states—Estonia and Belgium—have refused to sign the Jutland Declaration, a ministerial pledge advocating for the adoption of a digital age of majority across the EU.

The declaration calls for privacy-conscious age verification mechanisms on social media and other digital platforms to “mitigate the negative impact of illegal and inappropriate content, harmful commercial practices, addictive or manipulative design, and excessive data collection, particularly affecting minors.” It also suggests introducing a formal “digital legal age.”

Initiated by Denmark, which currently holds the rotating presidency of the EU Council, the declaration has been signed by 25 other member states, according to The Brussels Times. Denmark has made child online safety one of its top priorities during its six-month term.

FSC Clarifies California’s Device-Based Age Verification Law Excludes Adult Websites

LOS ANGELES—The Free Speech Coalition (FSC) issued a statement Wednesday clarifying that California’s newly enacted age verification law does not apply to adult entertainment websites.

California is home to a large portion of the adult entertainment industry, including studios, performers, and site operators.

In a post on its official blog, the FSC explained:

“California’s recently passed age-verification bill, AB 1043, does not apply to adult websites. The bill requires device manufacturers and app stores to collect a user’s birthdate or age and provide an API for apps (but not websites) to determine whether a user is under 18 years old. Apps must use the age information to ‘manage the delivery of age-appropriate content.’”

The FSC further noted that although the bill originally included language extending the age signal API to adult websites, that provision was removed in a last-minute amendment. The organization added that it has long supported a device-based solution to help adult sites restrict access to minors using similar technology.
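The device-based mechanism the FSC describes can be sketched in miniature: the device or app store holds a declared age and exposes only an over/under-18 signal that apps query before delivering content. All names below are invented for illustration; the real API would be defined by device makers, not by the bill text.

```python
# Hypothetical sketch of the device-level age signal AB 1043 describes.
# Names and shapes are invented for illustration only; the statute mandates
# the signal's existence, not any particular interface.

from dataclasses import dataclass

@dataclass
class DeviceAgeSignal:
    declared_age: int  # collected by the device maker / app store at setup

    def is_minor(self) -> bool:
        """The only bit an app receives: whether the user is under 18."""
        return self.declared_age < 18

def may_deliver(signal: DeviceAgeSignal, content_rating: str) -> bool:
    """App-side check: gate adult-rated content on the device's age signal."""
    if content_rating == "adult" and signal.is_minor():
        return False
    return True

print(may_deliver(DeviceAgeSignal(declared_age=16), "adult"))  # False
print(may_deliver(DeviceAgeSignal(declared_age=35), "adult"))  # True
```

The design point worth noting is that the app never sees the birthdate itself, only the boolean, which is why FSC has favored this device-based approach over site-by-site ID collection.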

The new law is scheduled to take effect on January 1, 2027.

Discord Confirms Data Breach Exposed Government ID Photos of 70,000 Users

SAN FRANCISCO — Discord, the popular chat and community platform, confirmed that one of its third-party vendors experienced a major data breach that exposed the personal information of about 70,000 users, including photos of government-issued identification cards.

The affected vendor was responsible for processing age-verification submissions and appeals on behalf of Discord. The company has not yet named the vendor but indicated that the breach was the result of a cyberattack exploiting a Zendesk instance, allegedly part of an extortion attempt targeting both the vendor and Discord.

Early reports suggested that roughly 1.5 terabytes of data were stolen—around 2.2 million images tied to age-verification records. However, Discord said the actual scope was smaller than initially claimed.

“This was not a breach of our internal systems,” a Discord spokesperson told The Verge. “The attack targeted a third-party service we use to support our customer service operations. Approximately 70,000 users may have had government-ID photos exposed, which the vendor used for age-related appeal reviews.”

The company added that all affected users have been notified. “We’ve secured the affected systems, ended our relationship with the compromised vendor, and continue to cooperate with law enforcement, data protection authorities, and external security experts,” the spokesperson said. “We take our responsibility to protect user data seriously and understand the concern this may cause.”

Discord also disclosed that other personal details—including names, usernames, email addresses, IP addresses, and the last four digits of some users’ credit cards—were included in the compromised data.

While Discord remains best known for its role in gaming culture and online communities, it has also become a hub for artists, streamers, and adult creators who use the platform to interact with fans and build digital communities. The service allows users over 18 to share adult-oriented material within designated, age-restricted spaces.

What Was “Verified,” Really? By Stan Q. Brick

As a guy who crossed the magic line of his 18th birthday over 35 years ago, it has been a damn long time since I was last asked to present identification documents as part of purchasing any age-restricted product.

More accurately, I should say it had been a damn long time – until last week, when I tried to log in to the members area of a porn website of which I’ve been a member for several months now.

Rather than simply being prompted to enter my username and password, I was presented with a dialog box informing me that before I could gain access to the site in question – a site to which I’ve already prepaid for nearly 90 more days of access, by virtue of a billing rollover that took place weeks ago – I needed to verify my age.

This struck me as odd and more than a little irritating. I was aware my home state is among those that have passed an age verification law directed at porn websites, but I had assumed existing customers, particularly those whose credit cards had been successfully billed several times already by the merchant involved, might somehow be “grandfathered in,” at least with respect to members’ area access.

No such luck, though. If I wanted to continue to access this site – in other words, if I wanted to receive the full benefit of the membership I’d already paid for – I would have to do business with whatever third-party service they’ve employed to perform the act of age verification on the site’s behalf, as well.

My immediate reaction was to close the browser, so I could weigh the question of whether to continue as a member of the site, cancel my account, or cancel my account and demand a refund. Nowhere in the agreement I ‘signed’ as part of joining the site did it state I’d have to do business with a third party to maintain future access to the site. Foisting that requirement on me without notice seemed dicey.

The first decision I made was not to act at all, right then. Among other things, the unexpected access-block had pissed me off a bit, and anger is never a good frame of mind for making decisions. I joined the site because I like the content they make and because I like watching it; should requiring me to show my ID really be so off-putting as to make me cancel, let alone demand a refund?

I sat on the decision for a couple of days, straddling the fence on whether I’d jump through the age verification hoop that had been presented to me. Finally, I decided it made sense to see what the process required, how invasive it was of my privacy – and how effective or ineffective it seemed toward the stated end goal of verifying a user’s age and deterring minors from accessing the site. I could always back out before submitting anything, I reckoned.

The site in question offered only one option for an age verification service, one based in the United Kingdom. The system informed me that to verify my age, I’d need to upload a scan of one of several government-issued forms of ID: a driver’s license, a state ID card, a passport, or a military ID. It also referenced the possibility of uploading a selfie, in which I’d be holding the ID – so my face could be compared to that on the ID, presumably.

I wasn’t thrilled about doing any of this, for a variety of reasons. For starters, I don’t trust the promises from these third parties to not retain any of my “personally identifiable information.” I believe most online companies will look for every means available to monetize any piece of data they collect on their users (and seek every loophole in every law preventing them from doing so), and my assumption is that companies offering age verification services will be no different from their peers in that regard. And if such companies collect and store this data, malicious hackers will access it eventually, rest assured.

Beyond privacy concerns, I kept thinking about the lack of notice involved here. One day I’m a member of a porn site who can log in any time and check out the latest updates, then the next day, I’m forced to hand over my name, contact information and ID to some company out of the UK, just for the honor of accessing content I’d already paid to access? Even if that’s not an illegal or tortious arrangement, such a transition certainly doesn’t feel right.

Ultimately, despite my reservations, I decided to go ahead with the age verification process. As much as anything, I was now curious to see just how onerous it was and what all it would require of me.

A funny thing happened, though: after uploading a photo of my ID, I was told I’d been verified and could now continue to the members area – no selfie required, no further personal information, just the email address I’d already given them on the previous page of the form and the scan of my ID.

Maybe I should be pleased by the fact I didn’t have to upload a selfie, but instead I’m struck by the pointlessness of it all. All this service had done was verify that someone had uploaded a driver’s license belonging to a man in his 50s, but in no way had they established it was the man in his 50s himself who had uploaded it.

The good news, I suppose, is that now I have access to the content for which I’d already paid. The bad news is… well, the bad news is unknowable, really. But when the bad news comes, with it may come answers to several questions I now have.

How many members of this same site will opt to cancel their memberships, or demand refunds, as I considered doing, rather than go through with the age verification process?

How many minors will find out about how easy it is to circumvent the age verification process of this age verification vendor?

Is this vendor truly not storing age verification documents? If they are storing such documents, will I learn that’s the case via an extortionate email threatening to reveal my porn preferences to my employer or family members?

But the biggest question, at least as I sit here right now typing, is this one: Through this age verification process, what was “verified,” exactly?

Exactly. I don’t know, either.
