Political Attacks

Aylo Says UK Watchdog Fumbled Age-Check Enforcement on Porn Sites


Something about this feels almost inevitable. You build walls, people find doors. You close the doors, they dig tunnels. The United Kingdom’s new Online Safety Act — the one meant to force adult sites to verify users’ ages — is supposed to keep minors out. But according to Aylo, the company behind Pornhub and a bunch of other major adult platforms, it’s working about as well as a “Do Not Enter” sign on a back alley at midnight.

They’re not just complaining for sport. Data tracked by Ofcom, the UK’s digital regulator, shows that roughly a third of all UK traffic to porn sites vanished in the months after the law kicked in. That sounds big — until you hear Aylo’s numbers. They’re claiming a 77 percent nosedive in visits from the UK alone. Think about that: nearly four out of five users gone, overnight.

“Since the Act came into effect, Pornhub and other compliant platforms have observed a significant shift in user behavior,” one Aylo document said. “This is not a surprise. This pattern is consistent with trends seen in other jurisdictions.”

Translation: they saw it coming. In places like Louisiana, which rolled out one of the first porn-specific age verification laws in the U.S., traffic didn’t disappear — it simply went elsewhere. “Similarly, since [the] implementation of the OSA, Pornhub has, again, lost nearly 80 percent of its U.K. traffic,” the document continues. “As always, people did not stop looking for porn. They just migrated to other non-compliant sites that don’t ask users to verify age, that don’t follow the law, that don’t take user safety seriously, and that often don’t even moderate content.”

It’s a harsh point, but also a fair one: when enforcement is weak, rules just push users toward the shadows. Aylo even attached a graph — a visual punchline to the story — showing the drop like a cliff dive.

[Graph: Aylo's reported drop in U.K. traffic]

And here’s the kicker. Ofcom, which is supposed to be policing this digital frontier, has reportedly sent out notices of non-compliance to only 69 sites and apps. That’s less than 0.1 percent of the hundreds of thousands of adult destinations online. So much for cleaning up the web.

Aylo’s executives, it seems, have been making the rounds with government officials and Ofcom itself, though they wouldn’t say what was discussed. “That said, we remain committed to and available for meeting with governments everywhere to share our data and discuss the most effective solution for age verification,” their spokesperson wrote in an email. It’s the kind of diplomatic line you give when you’ve already said your piece behind closed doors.

Meanwhile, Ofcom’s list of investigations reads like a rogues’ gallery — from Motherless.com to sites accused of hosting illegal or AI-generated content. But to Aylo, that’s missing the forest for the trees. They’re sitting back, watching the numbers tumble, quietly thinking: we told you so.

First Amendment attorney Lawrence Walters didn’t mince words either. “It should be no surprise that the U.K. traffic to adult sites has dropped substantially, and now there is [official] statistical data to confirm that assumption,” he said. “Adult users are naturally hesitant to sacrifice their privacy rights and share sensitive personal information as a condition of accessing legal adult content.”

Then he added what feels like the line everyone will remember: “Leaving aside the constitutional concerns with burdening access to adult speech by requiring users to disclose age and identity data, this legislative approach was short-sighted and impractical.”

Short-sighted and impractical — two words that could describe half of internet regulation history. And yet, here we are again: another country, another crackdown, another surge in traffic to places no one can really control.

The question isn’t whether age verification “works.” It’s what we’re willing to trade — privacy, access, freedom — just to pretend it does.


Canadian Privacy Commissioner Backs New National Age Verification Bill


Under a fresh Senate review, Canada’s privacy chief has thrown his weight behind a renewed push for nationwide age checks on adult platforms, while leading legal voices warn the plan could open the door to intrusive data practices and site blocking.

Philippe Dufresne, the privacy commissioner of Canada, told the Standing Senate Committee on Legal and Constitutional Affairs that Bill S-209 — “The Protecting Young Persons from Exposure to Pornography Act” — answers key concerns he raised about earlier proposals like S-210.

“In my appearance in May 2024, before the Standing Committee on Public Safety and National Security on a previous iteration of this Bill, I provided two primary recommendations,” Dufresne told the committee. “To limit the scope of application of the Bill; and to make certain enhancements to the criteria for prescribed age-verification and age-estimation methods to ensure that privacy is protected. I am very pleased to see that they have been incorporated in S-209. The added requirement to limit the collection of personal information to that which is strictly necessary for age verification or age estimation has also enhanced the Bill from a privacy perspective.

“I believe that it is possible to implement age-assurance mechanisms in a privacy-protective manner,” Dufresne added. “My Office is developing guidance on how this can be done.”

S-209 would introduce penalties of up to $500,000 for adult sites that fail to verify Canadian users’ ages. While Dufresne now views the bill’s safeguards more favorably than the previous iteration, the Canadian Bar Association (CBA) urged senators to proceed with caution.

In a letter to committee chair David M. Arnot, CBA Privacy and Access Section chair Christiane Saad argued that the statute leaves too much to future regulations and not enough in the legislative text itself. “This Bill addresses government data collection and retention only in broad, principle-based terms,” Saad wrote. “However, it lacks key specifics: no defined retention timeline, no clarity on the speed of destruction, no auditing or enforcement mechanisms, no requirements for storage location, and no remedies for users if data is mishandled. As a result, the Bill leaves many critical safeguards to future regulations, making enforcement and technical protections highly dependent on implementation rather than the statute itself.”

The CBA also highlighted the privacy risks inherent in any verification model that ties identity to content consumption. “An obvious by-product of such age-verification or age-estimation measures is the creation of a data set that links personal identifying data to data revealing that an individual accessed internet pornography as well as the specific sexual proclivities and interests of that individual,” the letter cautions.

Beyond data issues, the association flagged potential impacts on lawful speech. The letter notes that S-209 would grant Canada’s Federal Court “sweeping authority” to order internet service providers to block access to noncompliant sites. “Such measures risk over-blocking — removing lawful, non-pornographic content alongside the targeted material — and may inadvertently restrict adults’ access as well, resulting in collateral censorship, restricting freedom of expression and access to information,” the letter states.

Both S-209 and its predecessor S-210 were introduced by Sen. Julie Miville-Dechêne, who has championed national age-verification requirements in multiple previous attempts. Asked in a 2024 “Law Bytes” podcast interview about the possibility that adults could be prevented from accessing legal material, she replied, “I’m not worrying. Adults will continue to be able to watch porn.”


The Age Verification Era Is Here


The age verification era has already arrived. Across the United States, Europe and other regions, governments are moving from debate to enforcement. New rules are active in several U.S. states and throughout countries such as the U.K. and France, while regulators and payment processors now expect full compliance. For adult platforms, the message is clear: meet the requirements or risk losing access to financial partners and audiences.

In the United Kingdom, the media regulator Ofcom has taken the lead on enforcement under the Online Safety Act, where the deadline for installing “robust age checks” has already passed.

“We’re seeing more and more services comply every day,” says an Ofcom spokesperson. “We will continue to work constructively with providers who are trying to comply to protect U.K. users. But services that do not comply can expect enforcement action.”

Because every jurisdiction defines its own rules and terminology — “effective age assurance,” for example — businesses often face confusing and inconsistent expectations. Large companies can dedicate compliance teams and budgets to meet those standards, but smaller sites and independent creators are left struggling to interpret vague mandates with limited resources.

Yet compliance doesn’t necessarily mean closing a business or draining finances. With the right approach, platforms can stay lawful, protect their income, and continue operating safely within this new regulatory landscape.

When Small Operators Face Big Rules

Free Speech Coalition board member Megan Stokes stresses that many underestimate how broadly these laws are written.

“One common misconception is that these laws only apply to the ‘big players,’” she notes. “In reality, the language is so broad that even an independent creator with a small personal site could be at risk. At FSC, we work with independent performers who run small branded sites alongside their fan platforms; if those sites aren’t using compliant verification, they could personally get pulled into a lawsuit.”

Jonathan Corona, COO of MobiusPay, echoes that view from a financial compliance perspective.

“AV mandates are applied to everyone equally,” he reports. “It doesn’t matter what size your business is.”

According to Stokes, the outcome is a growing divide between those who can afford compliance infrastructure and those who cannot.

“The wave of state-level AV laws has really created a two-tier system,” she explains. “Larger companies have the budget and legal teams to adapt quickly, whether that means paying for enterprise verification vendors or building complex compliance systems. Smaller operators don’t have those resources, so even the threat of one lawsuit can be absolutely devastating.

“For example, we see both independent creators and boutique studios who operate their own sites struggling to even understand what ‘reasonable’ compliance looks like in practice,” she continues. “On the other hand, the larger platforms they compete with simply hand the problem to a vendor.

“This goes far beyond just compliance,” Stokes adds. “At the end of the day, it’s more about who can even afford to stay in the marketplace. Big companies can adapt. For smaller operators, one lawsuit could completely shut doors. Unfortunately, these laws risk turning compliance into a privilege of size.”

Industry attorney Corey Silverstein argues that this imbalance is not coincidental.

“Sadly, all of these governments knew that these age verification laws would immediately put smaller operators out of business, because that is exactly what they were aiming to accomplish,” Silverstein says. “Lawmakers can continue to claim that AV is about protecting children, but I’m not buying that. These laws were meant to control free speech and sadly, that is exactly what is happening.”

Legal Hazards and Unclear Boundaries

For many website owners, the biggest concern isn’t the concept of age verification itself — it’s the legal uncertainty surrounding it. The wording of these mandates can be vague, making it easy to misinterpret obligations or underestimate enforcement risk.

“It is a mistake to think that simple age gates or geoblocking solve the problem,” says Stokes. “They don’t. VPNs make location checks unreliable and some states set very strict standards for what counts as ‘reasonable’ verification. If operators assume the old tools are good enough, they’re leaving themselves exposed to lawsuits.”

Mike Stabile, Director of Public Policy at the Free Speech Coalition, agrees that confusion is widespread.

“What counts as compliance is often vague and contradictory, and even companies with in-house compliance teams struggle to make sense of these laws,” he says.

Attorney Larry Walters warns that ignoring or mishandling AV rules carries significant risk.

“The most significant legal risk for smaller website operators and creators is the potential for being targeted by either a civil claimant or government agency,” Walters explains.

While most lawsuits have so far targeted major platforms, some smaller sites have already been named.

“Smaller operators are, in some ways, at greater risk since they may have fewer resources to defend a claim or pay a large monetary judgment,” he observes.

With little legal precedent, Walters adds, it’s uncertain whether liability could fall on both corporate entities and individual owners — or whether such judgments could even be discharged through bankruptcy.

Silverstein agrees that every operator must take compliance seriously.

“There is no room for anyone to misinterpret or ignore AV regulations,” he warns. “Additionally, some people are making the mistake of thinking that all of the AV laws are the same — they are not. Each state and country’s AV laws have many nuances that make them distinct from one another.”

Both attorneys emphasize that complexity and cost are not legal defenses. Operators must audit traffic patterns, document compliance measures and seek legal advice.

“All operators need to be conducting substantive traffic audits and consulting with an attorney to discuss how they may specifically be at risk,” Silverstein cautions.

Stabile adds that help is available.

“If you’re panicking, the FSC is here as a resource,” he says. “We keep a running list of what laws are in effect and coming into effect in our Action Center, and we provide links to the legislation to help you understand the specifics of the law. That’s available for everyone in the industry, whether you’re an FSC member or not.”

The Coalition’s “AV Tool Kit” is designed to simplify this process.

“The tool kit is meant to be a practical guide, not just a policy overview,” explains Stokes. “Webmasters can use it to see, state by state, where their biggest risks are and what methods are recognized. Think of it as both a map and a checklist; it helps you see your risks clearly and adjust as the laws evolve. The key is to treat it as a living resource and check back regularly, since the laws are changing so quickly.”

Walters emphasizes that the laws demand full adherence:

“None of these state AV laws has a ‘good faith’ defense built in. Actual compliance is expected. Some laws contain prior notification requirements and an opportunity to cure before a claim can be asserted, but these are the exception to the rule.”

Silverstein is even more direct.

“Government agencies and private plaintiffs don’t care how hard someone tries to be in compliance with AV laws,” he says. “They will always take the position that anything short of 100% compliance is not compliance. Thus, webmasters should be documenting the agreements that they are entering into with third-party AV providers — and ensuring that they have thoroughly reviewed the AV provider and its product before signing up.”

Walters advises keeping detailed records on when AV tools were implemented, where they apply and how user data is handled.

“Any data that it is illegal to retain under these AV laws should be immediately and permanently deleted,” he warns.

Silverstein reinforces the importance of data protection.

“Securing, encrypting and limiting data access are key steps,” he says. “But far too many small sites are using inadequate practices to store sensitive data. A few key recommendations are: 1) Do not collect any data that you don’t absolutely need, 2) Do not keep data any longer than is necessary, 3) Data security is your responsibility, so you need to understand all of your data handling systems regardless of whether it’s done by a human being or some type of automated process.”
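Silverstein's three recommendations can be expressed as a minimal data-handling pattern. The sketch below is illustrative only; the record fields and the 30-day retention window are hypothetical assumptions, not drawn from any statute or provider:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical record: store only the minimum needed -- a pass/fail
# result and when it was issued. No ID scans, no birthdates (rec. 1).
@dataclass
class AgeCheckRecord:
    session_id: str        # opaque token, not tied to a real identity
    verified: bool         # the only fact the site actually needs
    checked_at: datetime

# Assumed policy window; the real limit depends on the jurisdiction.
RETENTION = timedelta(days=30)

def purge_expired(records: list[AgeCheckRecord]) -> list[AgeCheckRecord]:
    """Drop records older than the retention window (rec. 2:
    don't keep data any longer than necessary)."""
    cutoff = datetime.now(timezone.utc) - RETENTION
    return [r for r in records if r.checked_at >= cutoff]
```

A scheduled job running `purge_expired` over stored checks is one simple way to make the retention limit an automated system property rather than a manual chore, which is the spirit of Silverstein's third point about understanding your own data-handling processes.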

Payment Processing and Compliance

Even the most compliant website can’t function without payment processing. Credit card acquirers and banks now require proof that sites meet regional age verification rules before approving or maintaining accounts.

“Both Visa and Mastercard require compliance with jurisdictional age verification laws,” explains Cathy Beardsley, CEO of Segpay and FSC board member. “All new programs and any new URL submitted to a bank for approval will require compliance with age verification. The banks will be checking to make sure your site is compliant.”

Corona offers an example of how geography can complicate things.

“For example, if a company is based in a state that does not have an AV law on the books, but the customer is accessing the website from a state that does have one, then the company would need to either comply with the law or block access to their site from that state,” he notes.
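Corona's point, that obligations follow the visitor's location rather than the operator's, can be sketched as a simple gating rule. The state list and function below are hypothetical illustrations; the real set of covered states changes frequently, and geolocation itself is unreliable when users employ VPNs:

```python
# Hypothetical list of states with AV laws on the books -- check the
# FSC Action Center or legal counsel for actual, current coverage.
AV_LAW_STATES = {"LA", "TX", "UT", "MS"}

def gate_request(visitor_state: str, has_av_provider: bool) -> str:
    """Decide how to handle a visitor based on *their* state.

    The operator's home state is irrelevant: a site based in a state
    without an AV law still owes compliance to visitors from states
    that have one.
    """
    if visitor_state not in AV_LAW_STATES:
        return "serve"
    # In a covered state: either verify age or block entirely.
    return "verify" if has_av_provider else "block"
```

The two compliant outcomes, "verify" and "block", correspond to the two options Corona describes: integrate a third-party AV service, or geoblock the covered jurisdiction.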

Both Beardsley and Corona recommend formal written policies for handling compliance, which can reassure banks and card brands.

“Having a clear policy and procedure for how the company addresses AV requirements is the first step,” says Corona. “Regardless of whether the company is integrating to a third-party service to maintain compliance or blocking states that require age verification, having a solid P&P to show the processing patterns, sponsor banks and card brands will go a long way in demonstrating responsible operations.”

Beardsley suggests another practical workaround.

“Offering safe-for-work tours and then adding age verification after checkout is a good way to ensure the sale and not waste funds on an AV check if the consumer fails to follow through with the checkout process or if the transaction is declined.”

She adds that businesses can include verification costs in subscription pricing or turn to open-source tools to minimize expense.

Noncompliance, however, can be catastrophic.

Sites flagged for violations may lose payment processing immediately until problems are resolved.

“The company could face a simple reprimand and be tasked with remediation, such as implementing an AV service or demonstrating that the site is blocked from the specific jurisdictions that require AV,” Corona says. “On the extreme end, however, the company can face termination.”

Both executives underscore that proactive communication is key.

“We take a hands-on approach when advising our clients and ensure that they are complying with card brand regulations before putting the application forward for concurrence,” Corona explains.

The takeaway is simple: staying compliant with AV laws isn’t just a legal requirement — it’s critical for maintaining payment relationships.

Technology Providers and Practical Solutions

A growing ecosystem of technology vendors now offers tools to make age verification possible for sites of any size. Companies like VerifyMy, Incode and Yoti are developing privacy-preserving systems tailored to international standards.

“One of the biggest challenges for webmasters is keeping up with the rapidly evolving regulatory landscape,” notes Andy Lulham, COO of VerifyMy. “New age verification laws are being introduced at pace and requirements often vary from country to country and even state to state.”

He adds that usability can make or break a compliance program.

“Age verification isn’t one-size-fits-all,” he says. “What works for one site or audience may not work for another. If the process is cumbersome or invasive, it can frustrate users, increase drop-off rates and directly impact revenue.”

Milo Flores from Incode says their framework revolves around three pillars: data minimization, encryption, and automatic data deletion.

“Systems that return only an ‘over 18’ result without exposing identity details show users that platforms aren’t collecting or keeping unnecessary personal data,” he explains.
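The "over 18"-only pattern Flores describes can be sketched as a signed assertion that carries nothing but the result and a timestamp. This is a toy illustration of the data-minimization idea, not Incode's actual system; a production design would use asymmetric keys and standard token formats:

```python
import hashlib
import hmac
import json
import time

# Toy signing key; in practice this would be held by the AV provider.
SECRET = b"demo-secret"

def issue_age_token(over_18: bool) -> str:
    """Issue a minimal signed assertion: only the yes/no result and an
    issue time. No name, birthdate, or document data ever leaves the
    verification step."""
    payload = json.dumps({"over_18": over_18, "iat": int(time.time())})
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}.{sig}"

def verify_age_token(token: str) -> bool:
    """Check the signature, then read only the over-18 flag."""
    payload, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and json.loads(payload)["over_18"]
```

Because the site only ever sees the token, a breach of its logs exposes a bare boolean, not the sensitive identity data the CBA letter warns about.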

Flores and other developers point to reusable credentials — where users store age tokens or ID wallets — as an accessible option for smaller sites.

“For users, they deliver more privacy and control,” says Flores. “Credentials can be stored in a secure wallet and shared only when the user chooses. Passkeys can be safely kept on the user’s device, making it easy to prove they’ve already completed an age check without exposing new data.”

Yoti’s representative agrees that reusable solutions lower friction for both users and site owners.

“Users will increasingly need to prove their age for sites in a number of sectors, including adult content, gaming, social media and dating,” the rep says. “The lower friction decreases cost for smaller operators. Plus, reusable tools allow small businesses to compete with larger platforms on trust and compliance — without having to become identity experts themselves.

“In the first months of the U.K. Online Safety Act, 25% of users on adult content sites are choosing to prove their age with a reusable age assurance option, including with reusable ID wallets or by setting up an age token,” the rep adds.

Flores advocates an adaptive “waterfall” model that starts with the least intrusive method and escalates only when necessary.

“This adaptive model means webmasters don’t have to juggle multiple vendors or build complex workflows themselves,” he says.

Looking Ahead: Control, Privacy and Consequences

While AV vendors highlight innovation, critics warn that the broader implications extend far beyond compliance.

“The answer to where AV technology is headed is nuanced, depending upon what you see as its true intention: protecting children or punishing adults,” says digital media analyst Stephen Yagielowicz.

Leaning toward the latter view, he calls AV “not a problem to be solved, but a process to be managed.”

“Not to detract from the very real need to safeguard the innocence of youth, but I suspect more sinister motives are behind the recent regulatory maneuvering,” he contends. “The elimination of personal privacy is the ultimate goal of these initiatives. This is why partisans and regulators so oppose device-level blocking, since anyone could ‘borrow’ a device or otherwise gain access to it — including consenting adults whose identity would then be masked.

“Age verification ‘to protect the children’ is merely the sugar coating intended to make the demise of online privacy more palatable,” Yagielowicz adds.

Despite those concerns, Yagielowicz acknowledges potential technological benefits ahead.

“Improved biometrics could be a positive development in terms of accuracy and verification speed, but the data security implications are profound,” he warns. “This is where blockchain technology can play a role by decentralizing and securing identity information from casual perusal, even if that data is available to governments. Technology is not a silver bullet against the challenges of adequate age assurance, but it will broaden the availability of tools and other resources and this can ease compliance.

“The winning pattern is data-minimizing, auditable and swappable across jurisdictions — aligned to how U.K./EU regulators describe ‘effective’ age assurance and transparency,” he concludes. “It won’t be easy, but it will be necessary.”

The Takeaway

“Necessary” may be the single most accurate word to describe this new landscape. Age verification has become standard policy in multiple regions, with regulators, financial networks and courts all enforcing compliance.

But while the challenges are significant, smaller platforms and independent creators are not powerless. Through careful documentation, use of verified vendors, and reliance on evolving privacy-protective technologies, they can continue to operate safely and sustainably in a fast-changing regulatory world.


The Big Chill: If You Can Get People to Self-Censor, You Don’t Need a Ban By Morley Safeword


A couple of years back, when many of the state laws requiring age verification by adult websites were still just proposals floating around state legislatures, a legislator from one of the few states that already had such a law on the books publicly noted that Pornhub had begun blocking all traffic from her state, which meant the law was “already working” as intended.

I remember seeing a clip online of the legislator saying this and wondering if she knew just how right she was. The law she was talking about may have been presented as a measure to deter kids from accessing online porn, but in truth, it was about making it harder for anyone to access online porn.

While you might not realize it if you were to review some of the blatantly unconstitutional stuff they dream up, most state legislators do know they can’t simply ban speech they don’t like. They know this is true whether the speech they dislike is sexually-explicit material like porn, speech intended to “annoy” or “offend” people who share their sensibilities or speech from Jimmy Kimmel.

Legislators, censorious activists and other vile creatures of darkness also know the next best thing to a ban is an effective effort to cow people into silence, whether by criminal law or civil sanction.

When attorneys and scholars who are familiar with the First Amendment and free speech issues discuss these things, you’ll often hear them reference the “chilling effect,” which in the context of legislation refers to “government unduly deterring free speech and association rights through laws, regulations or actions that appear to target activities protected by the First Amendment,” as the Free Speech Center at MTSU puts it.

When the Supreme Court recently upheld the Texas law requiring websites that offer over a certain amount of content deemed to be “harmful to minors” to verify the age of users before displaying any such content, the court effectively said the chilling effect of the Texas law is insubstantial, easily outweighed by the government’s “compelling interest” in deterring minors from accessing pornography.

A casual observer might think: “Good! Why shouldn’t adult sites be required to do the same thing the store on the corner has to do before selling someone porn?” But the casual observer maybe hasn’t thought this one all the way through.

The casual observer probably hasn’t considered the difference between briefly flashing your ID at a bored store clerk who probably didn’t give it much of a look anyway before waving you onward, and uploading a copy of your government-issued ID, with your home address on it, to some third-party age verification service about which you know nothing.

The casual observer also may not have thought much about the websites that aren’t even arguably “porn sites,” but could still host enough content deemed “harmful to minors” to be covered by the law.

Worse still, this chilling effect is attached to a measure that isn’t particularly effective at its stated purpose. When Louisiana’s law was first passed, it took a matter of minutes for users to find a workaround – one that didn’t even require knowing what the acronym “VPN” signifies.

As a beloved old teacher of mine used to say about things of dubious value, these laws appear to be “worth their weight in sawdust” – except when it comes to their chilling effect.

As Hannah Wohl, an associate professor of Sociology at the University of California, Santa Barbara, put it in a post for The Hill, age verification laws “fail to protect minors and threaten the free speech of all Americans in ways that go far beyond pornography.”

“This chilling effect is by design,” Wohl added. “Pornography has long been the canary in the coal mine for other restrictions against free speech. Indeed, some conservatives have acknowledged that age verification laws are a back door to fully criminalizing pornography.”

No, age verification laws are not “blocks” or “bans” on pornography. But they certainly are barriers – and we’re not supposed to go around erecting barriers to protected speech in America, willy-nilly. There’s supposed to be a damn good reason (that “compelling interest” of the government’s), some indication that the law serves its purpose in furthering that interest, and an assurance that the law doesn’t burden too much speech that’s legal for adults to consume in restricting minors’ access to that same speech.

Or, as Vera Eidelman, senior staff attorney with the ACLU Speech, Privacy and Technology Project, put it when talking about the Supreme Court’s decision on Free Speech Coalition v. Paxton: “With this decision, the court has carved out an unprincipled pornography exception to the First Amendment. The Constitution should protect adults’ rights to access information about sex online, even if the government thinks it is too inappropriate for children to see.”

Unfortunately, to many of those who support and advocate for these laws, the chilling effect bemoaned by their critics is a design feature, not a bug.


Apple Raises Privacy Concerns Over Texas Age Verification Law


Apple on Thursday outlined how it plans to comply with a new Texas law, SB 2420, which introduces strict age assurance requirements for app stores and app developers.

While Apple had already begun rolling out its own age verification tools earlier this year in anticipation of regulatory changes, the company voiced significant privacy concerns regarding the Texas legislation.

In a statement to developers, Apple said, “…we are concerned that SB2420 impacts the privacy of users by requiring the collection of sensitive, personally identifiable information to download any app, even if a user simply wants to check the weather or sports scores.”

The Texas law is part of a growing wave of state-level regulations across the United States. These laws have emerged as individual states step in where federal lawmakers have failed to enact comprehensive online protections for minors. Though the goals are similar—to safeguard children online—the methods differ widely between states.

While Apple has the resources to comply with such mandates, smaller app developers may struggle. The company noted that its new tools are intended to help those developers meet legal requirements. Some smaller startups are already feeling the impact; for example, social media platform Bluesky recently blocked its service in Mississippi, explaining it lacked the resources to comply with similar state laws.

When SB 2420 takes effect on January 1, 2026, Apple will be required to confirm whether Texas-based users are at least 18 years old. Users under 18 will need to join a Family Sharing group managed by a parent or guardian. Parents will have to give consent for all App Store downloads, purchases, and transactions through Apple’s existing in-app purchase system.

To align with the new requirements, Apple said it will enable developers to determine user age “in a privacy-preserving way.” Developers can currently use the company’s Declared Age Range API, which will be updated before the law’s effective date to include new age categories for users in Texas.

Apple also announced it will introduce new APIs later this year that allow developers to request parental consent if they make major changes to an app that alter its age rating. Parents will be able to revoke consent at any time if they decide an app is no longer appropriate for their child.

The company even acknowledged the potential household dynamics such tools could create, joking that “we can imagine this being used as a new punishment technique; no Instagram for a month!”

Apple also cautioned developers that similar age assurance laws are scheduled to take effect soon in Utah and Louisiana, urging them to prepare for additional compliance requirements.


Apple Introduces New Tools to Protect Kids and Teens on Its Devices

Kid on tablet

Apple announced a series of new initiatives Thursday aimed at helping parents and developers create safer digital environments for kids and teens. Along with simplifying the setup process for child accounts, parents will now be able to share information about their children’s ages, allowing app developers to provide more age-appropriate content.

The App Store will also feature a new set of age ratings designed to give developers and users a clearer understanding of an app’s suitability for specific age groups. Product pages for third-party apps will soon include additional details to help parents make informed decisions—such as whether the app includes ads, user-generated content, or its own parental controls.

“These updates will roll out to parents and developers later this year,” Apple said.

The changes come amid ongoing national and state-level debates over how tech companies should protect children online. Nine U.S. states, including Utah and South Carolina, have recently proposed bills requiring app store operators to verify children’s ages and obtain parental consent before minors can download apps.

Apple has long advocated for app developers to handle age verification themselves, while companies like Meta have argued that app store operators should manage the process, given their direct access to user information.

Apple’s latest system represents a middle ground. The company will collect children’s age data directly from parents while requiring third-party developers to use that information to design age-appropriate experiences.

Simpler Setup for Child Accounts

Apple’s new setup flow for child accounts—required for children under 13 and optional for minors up to 18—makes the process smoother for families.

Parents can now select their child’s age range and verify their identity by confirming an existing credit card on file, rather than re-entering payment information manually.

If a parent isn’t available during setup, the child can still start using the device. Apple will automatically apply age-based web filters and allow access only to preinstalled apps like Notes, Pages, and Keynote. Neither Apple nor developers can collect the child’s data without parental consent during this stage.

When the child first visits the App Store and attempts to download an app, they’ll receive a reminder to ask their parent to complete setup.

Once setup is complete, the child can use Apple services with the content and app restrictions defined by their parent.

New Age Range API for Developers

Instead of asking children to manually enter their birthdays, developers can now use a new Declared Age Range API to access the age range information that parents provide during account setup. Parents can correct or revoke this data at any time.

Through the API, developers receive an age range—such as 9–12 or 13–15—without learning the child’s specific birthdate.

If an app requests age information, the child will see a pop-up asking permission to share it—similar to existing prompts for camera, microphone, or location access.

Apple says this approach is “more effective,” since “kids often lie about their birthday to access an app’s full experience.”

Developers must opt in to use the API, but future legislation could make its adoption mandatory for certain app categories.
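A hypothetical sketch of what consuming such a signal might look like on the app side. The bucket boundaries and function names below are illustrative assumptions based on the ranges the article mentions (9–12, 13–15), not Apple's actual Declared Age Range interface:

```python
# Hypothetical sketch of an app consuming a declared-age-range signal.
# The bucket boundaries and functions below are illustrative assumptions,
# not Apple's actual Declared Age Range API.

# Coarse ranges like those the article mentions (9-12, 13-15); exact
# buckets are assumed. None means "no upper bound".
AGE_BUCKETS = [(0, 8), (9, 12), (13, 15), (16, 17), (18, None)]

def bucket_for_age(age: int) -> tuple:
    """Map an exact age to the coarse range an app would receive,
    so the app never learns the child's actual birthdate."""
    if age < 0:
        raise ValueError("negative age")
    for low, high in AGE_BUCKETS:
        if high is None or age <= high:
            return (low, high)

def experience_for(bucket: tuple) -> str:
    """Choose an age-appropriate experience tier from the range alone."""
    low, _ = bucket
    if low >= 18:
        return "full"
    if low >= 13:
        return "teen"
    return "kids"

# A 10-year-old maps to the 9-12 bucket; the app only ever sees the range.
assert bucket_for_age(10) == (9, 12)
assert experience_for((9, 12)) == "kids"
assert experience_for((13, 15)) == "teen"
```

The point of the coarse buckets is the privacy property the article describes: the developer can tailor the experience without ever handling a birthdate.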

Expanded Age Ratings on the App Store

Apple is also updating the App Store’s existing age rating system. Currently, apps are labeled 4+, 9+, 12+, or 17+. The new framework adds more detail, breaking down teen users into 13+, 16+, and 18+ categories while keeping the younger ranges intact.

Apple says an app’s rating is determined by developer responses about its content and the intensity or frequency of that material.

“This will help parents better determine if an app their child requests is age-appropriate,” the company explained. “If content restrictions are enabled, kids are prevented from downloading or updating apps that exceed their age range.”

In addition, age-restricted apps will not appear in curated sections like Today, Games, or Apps when a child is browsing.
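The expanded tier list and the download gate described above can be sketched as follows. The tiers come from the article; the comparison logic is an assumed simplification of how such a restriction check might work:

```python
# Rating tiers described in the article: the existing 4+, 9+, and 12+
# plus the new teen breakdown of 13+, 16+, and 18+. The gating
# comparison itself is an assumed sketch, not Apple's implementation.
RATING_TIERS = [4, 9, 12, 13, 16, 18]

def can_download(child_age: int, app_rating: int,
                 restrictions_on: bool = True) -> bool:
    """With content restrictions enabled, block apps whose rating
    tier exceeds the child's age; otherwise allow the download."""
    if app_rating not in RATING_TIERS:
        raise ValueError(f"unknown rating tier: {app_rating}+")
    if not restrictions_on:
        return True
    return child_age >= app_rating

# A 14-year-old can get a 13+ app but not a 16+ one.
assert can_download(14, 13) is True
assert can_download(14, 16) is False
```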

Several of these new features for child accounts are available in the public beta of iOS 18.4. The ability to modify a child’s age after account creation, along with the Declared Age Range API and new App Store ratings, will launch later this year.

In response to Apple’s announcement, a Meta spokesperson described the update as “a positive first step,” but noted that “developers can only apply these age-appropriate protections with a teen’s approval.”

“Parents tell us they want to have the final say over the apps their teens use,” the spokesperson added, “and that’s why we support legislation that requires app stores to verify a child’s age and get a parent’s approval before their child downloads an app.”


Wisconsin Proposals Could Criminalize VPN Use Under Age Verification Laws

Wisconsin flag

MADISON, Wis. — Lawmakers in the Wisconsin state legislature are gradually advancing an age verification bill that not only targets adult entertainment websites but also includes provisions restricting the use of virtual private networks (VPNs). The most recent movement on the proposal occurred during a state Senate committee meeting on Oct. 8.

The measure to limit VPN usage stems from two companion bills, Assembly Bill 105 and Senate Bill 130, both introduced exclusively by Republican lawmakers. Since the GOP controls both chambers of the legislature, the proposal stands a strong chance of passing in some form through the Assembly and Senate.

What remains uncertain, however, is how Democratic Gov. Tony Evers will respond if the bill reaches his desk—whether he would sign it into law or veto it. Taken together, AB 105 and SB 130 represent the latest effort to criminalize or restrict the use of commercially available VPNs when used to bypass age verification systems.

In neighboring Michigan, a group of far-right legislators recently proposed an even more extreme measure: a total ban on pornography that initially sought to outlaw VPNs and proxy tools.

Rep. Josh Schriver, who sponsored House Bill 4938, titled the Anticorruption of Public Morals Act, described the bill as a “public decency and public safety solution” aimed at curbing access to what he considers harmful online content. However, following public and stakeholder backlash, Schriver announced that he would amend the measure to remove any language referencing VPNs and proxy restrictions.

The Age Verification Providers Association (AVPA), a trade group for the age-verification sector, has long been criticized for sending mixed messages about VPNs. While its executive director, Iain Corby, has said the organization does not oppose VPNs, he has also stopped short of clarifying how they factor into bypassing age verification systems.

Across the Atlantic, similar discussions are taking place in the United Kingdom, where regulators are considering age verification requirements for VPN services. Dame Rachel de Souza, the Children’s Commissioner for England, told BBC’s Newsnight program in August that current age verification provisions under the Online Safety Act are “essentially useless,” given how easy it is for minors to download and use VPNs.

“Of course, we need age verification on VPNs—it’s absolutely a loophole that needs closing, and that’s one of my major recommendations,” de Souza said during the interview.


Proposed Pennsylvania Bill Would Impose Tax on Adult Content Platforms

Pennsylvania flag

HARRISBURG, Pa. — A Pennsylvania lawmaker is proposing legislation that would impose an additional tax on online adult content platforms.

State Senator Marty Flynn (D–Lackawanna, Luzerne) introduced the measure, arguing that while adult content services already generate substantial revenue from Pennsylvania subscribers, the money they earn contributes little beyond the state’s standard sales and use tax.

Flynn said his proposal aims to ensure these companies pay their “fair share” to the Commonwealth. The legislation would impose a 10% tax on all subscriptions and one-time purchases made through online adult platforms. This new tax would be applied in addition to Pennsylvania’s existing 6% sales and use tax.
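As a worked example of how the two levies would stack, assuming both are computed on the same pre-tax price (the bill text may define the base differently):

```python
# Worked example of the proposed 10% levy stacking on Pennsylvania's
# existing 6% sales and use tax. Assumes both taxes apply to the
# pre-tax price, which is an assumption about the bill's mechanics.
ADULT_CONTENT_TAX = 0.10
SALES_AND_USE_TAX = 0.06

def total_charge(price: float) -> float:
    """Base price plus both taxes, each computed on the base price."""
    return round(price * (1 + ADULT_CONTENT_TAX + SALES_AND_USE_TAX), 2)

# A $10.00 monthly subscription would cost $11.60 in total.
assert total_charge(10.00) == 11.60
```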

All revenue collected under the bill would be directed to the General Fund, where it could be used to support essential state programs and priorities.

Flynn noted that his proposal mirrors efforts in other jurisdictions that have sought to modernize their tax codes and ensure online-based businesses are taxed equitably alongside traditional industries.


EU Releases Updated Age Verification App Blueprint as Social Media Access Rules Spark Debate

EU DSA logo

The European Commission has unveiled a second version of the EU’s age verification app blueprint, as debates continue across the bloc over how to prevent children from accessing social media platforms.

Originally introduced in July 2025 as a “white label” prototype, the app was designed to work seamlessly with upcoming EU Digital Identity (EUDI) Wallets. The blueprint serves as a foundation that EU member states and private sector developers can adapt to create their own local versions of an age verification system.

The latest version introduces several new features, including the use of passports and national ID cards—in addition to electronic IDs (eIDs)—as onboarding methods to generate proof of age. It also incorporates support for the Digital Credentials API, a system designed to streamline interactions among users, service providers, and credential issuers.

According to the European Commission, the blueprint’s purpose is to “support the implementation of the Digital Services Act (DSA)” and ensure stronger protections for minors online. The app allows users to prove they are over 18 when attempting to access restricted content, such as adult websites, without revealing personal details like their full birthdate or identity.

The project is being developed by the T-Scy consortium, a partnership between Scytales AB (Sweden) and T-Systems International GmbH (Germany), which also manages stakeholder engagement across the EU.

The blueprint is already undergoing trials in several countries, including Denmark, France, Greece, Italy, and Spain, which are building their own national age verification apps based on the EU model. By the end of 2025, the Commission expects the blueprint to integrate zero-knowledge proof (ZKP) technology, which would enable verification of a user’s age without exposing any private data.

However, despite this progress, widespread adoption remains uncertain, as not all EU members are on board.

Estonia and Belgium Push Back

Two member states—Estonia and Belgium—have refused to sign the Jutland Declaration, a ministerial pledge advocating for the adoption of a digital age of majority across the EU.

The declaration calls for privacy-conscious age verification mechanisms on social media and other digital platforms to “mitigate the negative impact of illegal and inappropriate content, harmful commercial practices, addictive or manipulative design, and excessive data collection, particularly affecting minors.” It also suggests introducing a formal “digital legal age.”

Initiated by Denmark, which currently holds the rotating presidency of the EU Council, the declaration has been signed by 25 other member states, according to The Brussels Times. Denmark has made child online safety one of its top priorities during its six-month term.


FSC Clarifies California’s Device-Based Age Verification Law Excludes Adult Websites

Free Speech Coalition logo

LOS ANGELES—The Free Speech Coalition (FSC) issued a statement Wednesday clarifying that California’s newly enacted age verification law does not apply to adult entertainment websites.

California is home to a large portion of the adult entertainment industry, including studios, performers, and site operators.

In a post on its official blog, the FSC explained:

“California’s recently passed age-verification bill, AB 1043, does not apply to adult websites. The bill requires device manufacturers and app stores to collect a user’s birthdate or age and provide an API for apps (but not websites) to determine whether a user is under 18 years old. Apps must use the age information to ‘manage the delivery of age-appropriate content.’”

The FSC further noted that although the bill originally included language extending the age signal API to adult websites, that provision was removed in a last-minute amendment. The organization added that it has long supported a device-based solution to help adult sites restrict access to minors using similar technology.

The new law is scheduled to take effect on January 1, 2027.
