Political Attacks

British Adult Creator Bonnie Blue Back in the UK Following Bali Deportation

It still feels a little surreal when you read it out loud: British adult content creator Bonnie Blue is back in the UK after being deported from Bali and handed a 10-year ban from Indonesia. A police raid on a rented studio put an abrupt end to her trip, with the 26-year-old detained alongside a group of international travelers as authorities launched what they called a crackdown on potential pornography violations.

Bonnie Blue, 26, isn’t exactly new to attention. She’s built an international following through social media and subscription platforms, leaning into controversy with the kind of provocative marketing that either hooks you instantly or makes you roll your eyes. Her so-called promotional tours—often aimed at young adult audiences—have always lived right on that thin line between spectacle and scandal. This time, the line snapped.

During the raid, Indonesian police seized cameras, vehicles, and other equipment. Badung Police alleged that Blue and the group were playing a sex game in which the “winner would sleep with Bonnie Blue.” Later, though, a local Bali outlet reported that “no pornographic activities or acts have been found in the collaborative content.” That gap between accusation and evidence has hung over the story ever since.

Authorities also accused Blue of promoting a “BangBus” tour in Bali, supposedly involving explicit content with barely legal Australian visitors. BangBus, for anyone who’s somehow missed it, is a long-running adult franchise built around mobile productions—vans, buses, the whole voyeur-on-wheels concept. It’s been copied, remixed, and rebranded so many times that it’s basically a genre now, which only added fuel to the fire.

Because Indonesia’s anti-pornography laws are notoriously strict, the case quickly went global. Headlines speculated about serious prison time. Cameras followed every move. In court, Blue didn’t exactly play the role people expected—laughing, chatting with spectators, even sucking on a lollipop. In the end, the punishment was almost absurdly small compared to the buildup: a fine she later described as $20, followed by deportation.

Back on British soil, Blue wasted no time addressing the drama. “I’m rich and have good lawyers – did you really think I’d face jail time?” she told reporters. She said she was “excited” to show people “what got me into all this trouble,” joking that she needed to “recuperate” her “huge losses” from the $20 fine. It was classic Bonnie—half bravado, half wink.

In her first Instagram video after returning home, Blue pointed the finger elsewhere. “The girl that organized this whole trip for me, she was like: ‘Oh, I’ll sort security, hotels, lawyer, flights, everything,’” Blue said. She claimed the organizer “charged me £75,000 [about $100,000] – she has taken a big chunk of the money and then has reported me to the police.” If true, it turns the whole saga into something closer to betrayal than bad planning.

Blue says she plans to tell “the whole story” to her fans soon. Whether that ends up feeling like a confession, a clapback, or just another chapter in a career built on shock value is anyone’s guess—but one thing’s certain: this isn’t the last time people will be talking about her.

UK Lords Vote to Ban Depictions of ‘Choking’ in Adult Content

LONDON — There’s a particular hush that settles over a chamber when lawmakers realize they’re about to redraw a line that can’t easily be erased. That moment arrived in the House of Lords, where members agreed to amendments to the pending Crime and Policing Bill that would make depicting “choking” in pornography illegal and classify it as a “priority offense” under the Online Safety Act.

On Dec. 9, the Lords voted to approve Amendments 294 and 295, measures that would turn the possession or publication of “pornographic images of strangulation or suffocation” into a criminal offense. The language is blunt. Intentionally so.

If the bill becomes law with those amendments intact, the consequences are no small thing. Possessing “choking” material could carry a prison sentence of up to two years, while publishing it could result in as much as five years behind bars. The stakes, suddenly, feel very real.

Parliamentary Under-Secretary of State Baroness Alison Levitt, speaking for the government, told fellow Lords that the law would hinge on whether the strangulation or suffocation depicted is “explicit and realistic.” It wouldn’t need to be real, she said—just convincing enough.

“For example, it can be acted or posed,” she explained. “Or the image may be AI-generated — provided that the people in the image look real to a reasonable person.”

Designating choking content as a priority offense, Levitt added, “will oblige platforms to take the necessary steps to stop this harmful material appearing online.” In regulatory terms, that’s a heavy label—one that triggers active duties, not polite suggestions.

At the moment, the “priority offense” category is reserved for some of the darkest corners of the internet, including CSAM and terrorism-related material. Sliding choking depictions into that same bracket signals how seriously lawmakers want platforms to take this.

The push to ban portrayals of nonfatal strangulation didn’t materialize overnight. It picked up steam after the release of a pornography review in February that recommended prohibiting adult content deemed “degrading, violent and misogynistic.” By June 19, the government made its position unmistakably clear, confirming its intention to outlaw content involving strangulation.

Baroness Gabrielle Bertin, a Conservative peer who served as the independent lead reviewer on that pornography review, welcomed the amendments with unmistakable conviction.

“This is not just another amendment,” Bertin said. “It is a light-bulb moment, a recognition that what has been normalized for too long is neither safe nor acceptable.”

Not every proposed expansion made it through. The government rejected other amendments that would have criminalized additional forms of adult content, including a proposal to

House Panel Pushes Federal Age-Verification Bill Forward

WASHINGTON — The vote landed quietly, but the implications didn’t. A U.S. House subcommittee moved Thursday to amend the SCREEN Act, pushing it one step closer to becoming federal law and putting site-based age verification for adult content squarely on a national track. With the amendment approved, the bill now heads to the full Committee on Energy and Commerce for review.

Behind the scenes, the Subcommittee on Commerce, Manufacturing, and Trade has been busy stitching together a broader package of online safety legislation, with the Shielding Children’s Retinas from Egregious Exposure on the Net (SCREEN) Act sitting near the center of it all. This isn’t a sudden burst of concern—it’s been building, piece by piece, vote by vote.

When Republican Sen. Mike Lee of Utah and Rep. Mary Miller of Illinois first introduced the bill in February, the enforcement teeth were already baked in. Violations of the SCREEN Act would be treated as violations of the Federal Trade Commission Act, meaning they’d fall under the umbrella of unfair or deceptive acts or practices. In plain terms: civil penalties of up to $10,000 per violation. That kind of number tends to get people’s attention.

On Thursday, Rep. Craig Goldman of Texas framed his amended version as familiar territory. Speaking to the subcommittee, he said it “follows the same playbook as Texas,” a nod to HB 1181—the state age-verification law that ignited the Supreme Court case Free Speech Coalition v. Paxton.

“It updates the SCREEN Act to align it with the successful Texas statute, and federalizes it across the country,” Goldman said. “The protections that the courts have already upheld for children in Texas should not stop at our border. Every child in America deserves the same consistent standard of safety as a child in Texas has. We must protect children from harmful online content, and we can accomplish this better by updating the SCREEN Act.”

The amended version tweaks the bill’s language in several places, though not in ways that clearly mirror Texas’ law line for line. That distinction matters more than it might seem at first glance.

“This is not a mirror of the Texas law,” industry attorney Lawrence Walters observed. “It would likely render the Texas law unenforceable due to federal preemption.”

Federal law already tends to trump state law, but the amendment doesn’t leave much room for interpretation. It includes an explicit declaration of supremacy, stating, “No State or political subdivision of a State may prescribe, maintain, or enforce any law, rule, regulation, requirement, standard, or other provision having the force and effect of law, if such law, rule, regulation, requirement, standard, or other provision relates to the provisions of this Act.”

Right now, roughly half of U.S. states have their own age-verification laws on the books. If the SCREEN Act becomes law, those state rules wouldn’t just coexist—they’d be overridden.

For now, the amendment cleared the subcommittee by a voice vote, sending it onward to the full Committee on Energy and Commerce. It’s not the final word, but it’s a clear signal. The question isn’t whether this debate is going national—it’s how much ground will shift once it does.

Indiana Lawsuit Claims Aylo Failed to Stop VPN Access to Adult Sites

INDIANAPOLIS—There’s something oddly familiar about this moment. Indiana Attorney General Todd Rokita, a Republican, stepped forward this week to announce that his office has sued Aylo, its affiliated companies, and ownership group Ethical Capital Partners. The accusation: that they violated the state’s age-verification laws by failing to fully block users who access sites through virtual private networks, or VPNs.

Aylo, the parent company behind Pornhub and several other free and premium adult platforms, has already shut the door on Indiana entirely. The company has blocked all Indiana IP addresses, choosing to withdraw from the state’s digital landscape rather than implement the sweeping age-verification requirements that took effect earlier this year.

“We know for a fact, from years of research, that adolescent exposure to pornography carries severe physical and psychological harms,” Rokita said in a statement released by his office.

“It makes boys more likely to perpetrate sexual violence and girls more likely to be sexually victimized. Yet, despite such realities, these defendants seem intent on peddling their pornographic perversions to Hoosier kids,” Rokita continued, explaining why his office brought the lawsuit. The framing is dramatic—but it also sidesteps the core issue quietly doing the real work here.

In the legal complaint filed in state court, Rokita advances a theory that faults Aylo not for its decision to block Indiana users, but for failing to extend that block to users who disguise their location through VPNs or proxy servers that make them appear to be outside the state.

In other words, the argument treats the existence of VPNs themselves as a kind of open tunnel—one that, in the state’s view, leads minors straight to content that would otherwise be unavailable under Indiana’s age-verification rules.

Corey Silverstein, an attorney who represents adult-industry clients, said the lawsuit is unsurprising—and deeply troubling.

“It was just a matter of time before one of these state Attorneys General tested this theory,” Silverstein said. “We are going to monitor the case very carefully.”

He added, “I see substantial roadblocks for the government’s case, but, again, I’m not surprised because the states have been emboldened by the Supreme Court decision in Free Speech Coalition v. Paxton. Going after a VPN service provider would be a stretch, and Section 230 [of the Communications Decency Act of 1996] would stop it.

“That’s a dangerous concept, though, because what’s next? Power companies? Landlords that lease data center space?”

From a legal standpoint, the case itself feels thin—almost delicate in how much weight it tries to carry.

“As of the date of this filing, defendants’ websites (Aylo) identified above restrict access by users whose devices purport

AVS Group Moves to Strengthen Age Verification After Costly Fine

It landed like a thud you could feel in your chest. An adult content network suddenly staring down a roughly $1.3 million penalty, and the quiet realization that the rules had shifted while everyone was still arguing about whether they ever would. In the wake of that fine, AVS Group has begun rolling out tougher age checks on some of its sites, responding to U.K. regulator pressure tied to the Online Safety Act.

A spokesperson for the regulator said the company has now put age-assurance tools in place on portions of its network that are “capable of being highly effective at correctly determining whether or not a user is a child.” That phrasing matters. It’s careful. And there’s a catch. The same spokesperson made it clear that further penalties are still on the table until the regulator is fully “satisfied” that changes have been made across every platform named in the investigation.

The fine itself followed an inquiry that found AVS Group had failed to implement robust age checks on 18 adult websites. According to the regulator, some of those sites had no age-assurance systems at all, while others relied on methods that simply didn’t meet the standard of being “highly effective.” It’s the kind of language that sounds dry on paper but carries real weight when money and access are on the line.

And the money isn’t the scariest part. Enforcement powers under the law stretch much further — fines that can climb to 18 million pounds or 10% of qualifying worldwide revenue, whichever hits harder. There’s also the nuclear option: court orders that could force payment providers or advertisers to walk away, or even compel internet service providers to block a site entirely within the U.K. That’s not a slap on the wrist. That’s existential.

A request for comment has been sent to AVS Group’s parent company, TubeCorporate, but no response has been issued so far. Which leaves a lingering question hanging in the air: if this is what partial compliance looks like, what does “satisfied” actually mean — and who’s next to find out the hard way?

Australia Tells Search Engines to Blur Pornographic Images

Some policies don’t land with a thud so much as a slow, unsettling echo — the kind that makes you sit up a little straighter and wonder what pushed things this far. That’s how it felt when Australia’s eSafety commissioner, Julie Inman Grant, announced that search engines across the country will soon be required to blur pornographic and violent images. The rule kicks in on December 27, and you can almost sense the mix of urgency and inevitability behind it.

“We know that a high proportion of this accidental exposure happens through search engines as the primary gateway to harmful content, and once a child sees a sexually violent video, for instance, maybe of a man aggressively choking a woman during sex, they can’t cognitively process, let alone unsee that content,” Grant said.

There’s something chilling about that last part — the idea of “can’t unsee.” It’s true for adults, too, but with kids it lands differently. Grant continued: “From 27 December, search engines have an obligation to blur image results of online pornography and extreme violence to protect children from this incidental exposure, much the same way safe search mode already operates on services like Google and Bing when enabled.”

Blurred results aren’t new. They’ve been tucked inside safe search settings for years, hidden like a fire extinguisher behind a glass panel — available, but only if someone remembers to activate it. What’s shifting now is the weight of responsibility: it’s no longer on parents to flip a switch; the platforms have to build the guardrails themselves. And all of this is happening alongside another major change hitting Australia on December 10, when the country’s social media ban for under-16s comes into force.

Earlier this month, it was estimated that around 150,000 Facebook accounts and 350,000 Instagram accounts belonging to under-16s will disappear from the digital landscape, like entire classrooms quietly going offline. Grant framed the larger moment this way: “These are important societal innovations that will provide greater protections for all Australians, not just children, who don’t wish to see ‘lawful but awful’ content.”

That phrase — “lawful but awful” — hangs in the air. It’s complicated, a little subjective, and strangely honest. And maybe that’s the real story here: a country trying to redraw the line between what the internet allows and what people can realistically live with.

Oh Good, Warning Labels Are Back Again by Stan Q. Brick

Good news, everyone: The Nanny State is back and coming to a computer screen near you!

In fact, if you live in Washington state or Missouri, the Nanny State is coming to a computer screen very near you indeed, because it will be your own computer’s screen. Or smartphone screen, or smart watch screen, or pretty much any other screen you can connect to the internet.

As you may have read here on The War on Porn or elsewhere, both states are currently considering bills that would not only impose age-verification requirements on adult websites but also require such sites to publish warning notices about their content.

The Washington bill is the murkier of the two, stipulating that the warning labels to come are “to be developed by the department of health.” The Missouri bill, on the other hand, is quite specific indeed.

The legislation being pondered in Missouri would require sites to publish warnings stating that “Pornography is potentially biologically addictive, is proven to harm human brain development, desensitizes brain reward circuits, increases conditioned responses, and weakens brain function;” that “exposure to this content is associated with low self-esteem and body image, eating disorders, impaired brain development, and other emotional and mental illnesses;” and finally that “pornography increases the demand for prostitution, child exploitation, and child pornography.”

To say that these claims are disputed would be to put it mildly. Most of the evidence for these assertions is anecdotal, in part because it’s very difficult to evaluate them without intentionally exposing a group of minors to pornography (which is illegal to do) in the context of a clinical study.

Regardless of their basis in fact (or lack thereof), these labels are what attorneys and constitutional scholars call “compelled speech,” something which is a bit of a no-no under First Amendment jurisprudence and the appropriately named “compelled speech doctrine.”

As explained by David L. Hudson Jr., writing for the Free Speech Center at Middle Tennessee State University, the compelled speech doctrine “sets out the principle that the government cannot force an individual or group to support certain expression.”

“Thus, the First Amendment not only limits the government from punishing a person for his speech, but it also prevents the government from punishing a person for refusing to articulate, advocate, or adhere to the government’s approved messages,” Hudson adds.

The compelled speech doctrine has been invoked by the Chief Justice John G. Roberts-era Supreme Court as recently as Rumsfeld v. Forum for Academic and Institutional Rights.

“Some of this Court’s leading First Amendment precedents have established the principle that freedom of speech prohibits the government from telling people what they must say,” Roberts wrote for the Court in 2006.

When some folks hear about these labels, they doubtless ask themselves something like, “How is this any different from requiring cigarette packages to carry warning labels?” And that would be a good question, if cigarettes were a form of speech that presumptively enjoys protection under the First Amendment.

Beyond that distinction, there’s another obvious difference here. Cigarettes, unlike pornography, have been subjected to extensive clinical study, research which has confirmed that nicotine is addictive, and that tobacco (along with the myriad other substances found in cigarettes) is strongly associated with the development of lung cancer and various cardiopulmonary disorders and diseases.

In short, the analogy between pornography and cigarettes is a terrible one, scientifically and legally.

There was a time when I would very confidently assert that the Supreme Court will eventually reject these warning labels as textbook compelled speech and shoot down at least the labeling requirements in the bills pending in Washington and Missouri. But after their decision in Free Speech Coalition v. Paxton, I’m not so sure.

For those who like the contours of our First Amendment just the way they are, this uncertainty should be even more alarming than the warning labels the Nanny State wants us to start seeing on porn sites.

Washington AV Bill Adds Controversial ‘Health Warning’ Requirement

OLYMPIA, Wash. — Every so often, a bill shows up that feels like it’s carrying baggage from an old argument — the kind you thought had already been settled. That’s the energy surrounding a new age-verification proposal in Washington state, which would require adult websites not only to verify users’ ages but also to post warning notices about alleged health risks — despite a previous federal court ruling that struck down similar requirements.

Sponsored by Republican Rep. Chris Corry and Democratic Rep. Mari Leavitt, HB 2112 — known as the “Keep Our Children Safe Act” — would require adult sites to verify the age of users accessing content from within the state. Structurally, it mirrors many of the AV laws already enacted or currently moving through legislatures across the country.

Where it veers into new territory is with an extra mandate: websites subject to the law would also need to display “information pertaining to the youth health risks associated with adult content” on their landing pages and in advertisements, alongside contact details for the Substance Abuse and Mental Health Services Administration helpline. It’s the kind of addition that shifts the bill from regulatory to ideological — and into legally risky waters.

Notably, the legislation doesn’t spell out what those warnings would actually say. Instead, it leaves the wording “to be developed by the department of health,” a placeholder that creates as many questions as it answers.

A similar approach surfaced just last week in Missouri, where lawmakers introduced their own bill calling for nearly identical warning requirements on adult websites.

Missouri’s proposal, HB 1831, explicitly demands that the warning language match what was originally written into Texas’ HB 1181 — the age-verification statute at the center of the landmark Supreme Court case Free Speech Coalition v. Paxton.

That ruling ultimately cleared the path for states nationwide to begin enforcing AV laws. But the court upheld a version of the Texas law that no longer included the so-called health warnings. Those provisions had already been rejected by the U.S. Court of Appeals for the 5th Circuit, which upheld a lower-court injunction finding the warnings amounted to unconstitutional compelled speech.

Given that precedent, it’s far from clear how either Missouri’s HB 1831 or Washington’s HB 2112 would survive legal scrutiny. Asked about this potential obstacle, Rep. Corry acknowledged he wasn’t aware of the 5th Circuit decision and said he planned to look into it.

“As we move through the process we can amend as needed,” Corry said. “I am sure we will hear from both proponents and opponents of the legislation. It is rare that a bill does not need amending after being introduced. I am committed to making sure all legislation follows both the federal and state constitution.”

Industry attorney Corey Silverstein remains “cautiously confident” that the bill as written won’t withstand a constitutional challenge.

“The health warnings are blatantly unconstitutional, and I can’t imagine a world where even the most conservative justices turn their back on this level of First Amendment violations,” Silverstein said. “One of the biggest drawbacks to the Paxton decision was always going to be overly aggressive legislators trying to push the envelope as far as they can. It emboldened many conservative lawmakers to push forward in their anti-porn crusade.

“Washington and Missouri won’t be the last two states to engage in these tactics,” he predicted. “Since Paxton, state lawmakers are basically trying to one-up each other as to how far they are willing to cross the line. My home state of Michigan is the best example of this madness, with the introduction of the Anticorruption of Public Morals Act, which aims for a total ban on online pornography in Michigan.

“The industry must continue to aggressively fight these laws,” he added. “Without resistance, things will only get worse.”

Proposed Warnings Contradict Public Health Experts

Earlier this year, a North Dakota House committee amended a resolution that originally sought to label pornography a “public health hazard,” dialing the language back and replacing it with a call for further research into whether that designation was even appropriate.

The original version of the resolution claimed pornography was a “critical public health issue,” arguing that adult material “perpetuate[s] the demand for sex trafficking, prostitution, child pornography, and sexual abuse images,” and asserting that it had been “linked to detrimental health effects,” including purported impacts on brain development, emotional health, intimate relationships, and the creation of sexual addiction.

At the time, state Rep. Cynthia Schreiber-Beck publicly pushed back on those claims, noting that pornography “does not fulfill the public health field’s definition of a public health crisis.”

“After doing some research… this doesn’t really fly,” Schreiber-Beck told members of the House Education Committee.

Between 2016 and 2020, several states adopted similar “public health crisis” resolutions around pornography. But health and medical authorities have consistently rejected that framing, including in a study published by the American Public Health Association.

The study’s authors concluded: “The movement to declare pornography a public health crisis is rooted in an ideology that is antithetical to many core values of public health promotion, and is a political stunt, not reflective of best available evidence.”

Sometimes laws don’t reveal their real purpose until you scan the fine print.

EU Regulators Set to Expand Age-Verification Crackdown to Smaller Adult Sites

DUBLIN — There’s a quiet shift happening behind the scenes — the kind that doesn’t make noise until suddenly it does. Ireland’s media regulator is signaling that enforcement of age-verification rules is about to widen its reach, stretching past the big, familiar adult platforms and into the long tail of smaller sites across Europe.

Digital Services Media Commissioner John Evans laid it out plainly for members of Ireland’s lower house of Parliament, warning that targeting only the giants won’t actually solve the problem. “If you come down hard on a few platforms, users, including minors, will simply move to smaller ones. So we and many other digital services coordinators are mapping below-threshold pornographic service providers and will tackle those at a national level.”

Ireland’s Online Safety Code has technically been in effect since July, requiring adult sites headquartered in the country to move beyond self-reported age confirmation and implement real age-assurance measures — a meaningful shift that has already begun reshaping compliance conversations behind the scenes.

For sites based elsewhere in the EU, oversight falls under the Digital Services Act (DSA). National digital service coordinators work together beneath that framework to enforce shared standards, including age-verification mandates intended to curb underage access.

So far, formal investigations launched by the European Commission have largely centered on the biggest names — especially those labeled “Very Large Online Platforms” (VLOPs), defined as services reaching at least 45 million monthly users in the EU. These platforms face the most stringent regulatory scrutiny, though not all have accepted that classification without legal pushback.

What’s changing now feels familiar. The strategy mirrors a move announced in September by France’s media regulator, Arcom, which declared it would begin ramping up enforcement efforts toward smaller adult platforms, rather than focusing exclusively on the market heavyweights.

That approach has found some supporters in Ireland’s parliament. During the meeting, several lawmakers reportedly pointed to France’s Law Aiming to Secure and Regulate the Digital Space (SREN) as a possible template for tougher regulation. Arcom’s aggressive cross-border enforcement has already sparked ongoing legal disputes over whether individual EU nations can compel compliance from adult sites based elsewhere inside the bloc. In September, an advocate general at the European Union’s Court of Justice offered a nonbinding opinion supporting France’s authority to enforce its AV rules on foreign-based pornographic websites. The court’s final judgment is still pending.

For Evans, though, the direction of travel feels clear — and increasingly urgent. “There is a significant amount of enforcement activity under way on age verification and pornographic services,” he told Irish lawmakers. “We are hopeful we will see changes soon.”

Sometimes the law doesn’t arrive with a bang. Sometimes it just quietly redraws the map — and only later does everyone realize the borders have moved.

Ofcom Hits AVS Group with $1.3M Penalty Over AV Violations

LONDON — The number — one million pounds — lands heavy, doesn’t it? On Wednesday, U.K. regulator Ofcom brought the hammer down on AVS Group Ltd., slapping the company with a penalty worth roughly $1.3 million after concluding it hadn’t bothered to put truly solid age checks in place across 18 adult websites. It’s one of those moments where you read the headline and feel that quiet uh-oh ripple through the industry.

Back in July, it was reported that Ofcom had widened its lens, investigating four companies operating adult platforms to see whether they were meeting the strict age-assurance demands of the U.K.’s Online Safety Act (OSA). The law isn’t coy—if you publish pornography, you’re required to use “highly effective” age checks specifically designed to keep minors out. AVS Group was on that list.

By October, Ofcom confirmed that its inquiry had uncovered what it called “reasonable grounds” to believe AVS was falling short of those obligations, triggering a provisional notice and a 20-working-day window for the company to respond. It was the regulatory equivalent of a warning shot across the bow, a chance to explain—or fix—what already looked shaky.

That chance didn’t change the outcome. In a fresh update posted Thursday, Ofcom stated, “From 25 July 2025 until at least 25 November 2025, each of the AVS Group websites either did not implement any age assurance measures or implemented measures that were not highly effective at determining whether a user was a child.”

Zooming in on the details, the agency took particular issue with the company’s photo-upload checks, noting that AVS “deployed a photo upload check on its services that does not include liveness detection and as such is vulnerable to circumvention by children (for example, by uploading a photo of an adult). Ofcom considers that this method is not capable of being highly effective within the meaning of the Act.” In other words: nice try, not even close.

The AVS-run websites scrutinized by the regulator include pornzog.com, txxx.com, txxx.tube, upornia.com, hdzog.com, hdzog.tube, thegay.com, thegay.tube, ooxxx.com, hotmovs.com, hclips.com, vjav.com, pornl.com, voyeurhit.com, manysex.com, tubepornclassic.com, shemalez.com and shemalez.tube.

On top of the main penalty, Ofcom tacked on an additional £50,000 fine for AVS’s failure to respond properly to information requests during the investigation—a sort of regulatory side-eye for not cooperating when questions were asked.

And here’s the real mic-drop: what AVS received isn’t even close to the upper ceiling. Ofcom has the authority to levy fines of up to £18 million or 10% of a company’s qualifying global revenue, whichever is higher. Beyond money, the agency can seek court orders pushing payment providers or advertisers to cut ties—or, in the most extreme cases, require U.K. internet service providers to block access to sites entirely. It’s a reminder that in the new regulatory climate, the fines are loud… but the silence that follows losing access can be even louder.
