June 30, 2025
The Scam of Age Verification
Introduction
What is AV and why it doesn't work
What will happen?
The scam
What to do then?
Are sex and porn really harmful?
The United Kingdom
The United States
France
The European Union
Conclusion
TLDR
Introduction
A lot has happened in the past month: the EU Commission (the executive branch of the EU) publicly attacked the three largest porn sites — including us — over our supposed obligation to prevent minor access, while completely ignoring far larger mainstream platforms.
AV implementation was also scheduled to begin in France in June 2025, but was later halted — though only temporarily. However, it is set to come into effect next month in the UK — July 2025.
And just days ago — on June 27 — the U.S. Supreme Court (SCOTUS) issued a devastating decision that opens the door to broad state regulation of adult content, effectively allowing AV laws with minimal constitutional constraint.
What is AV and why it doesn't work
“Age verification” (AV) is the requirement for online platforms to implement strict methods to verify the age of their users, in order to prevent minors from accessing adult content.
By “strict,” we mean methods such as ID uploads, facial age estimation, credit card checks, or mobile operator verification. The allowed methods vary depending on the country.
At face value, it may sound reasonable — even like a good idea.
However, there are countless problems with it — many of which have been pointed out by credible observers, repeatedly.
Note that there has never been any credible evidence that site-level AV works — especially when implemented selectively, as it always has been — while there have been countless warnings and demonstrations that it doesn’t.
Everywhere AV has been implemented, we’ve seen the same pattern: a handful of large porn sites are targeted (usually, us and Pornhub) — sometimes with a few token smaller sites — and that’s it.
The one and only so-called “argument” in favor of AV is that it’s been used for gambling and other restricted services, so it could be applied to porn. But that comparison is dishonest: on a gambling or merchant site, users already expect to submit personal data — credit card info, name, phone number, address. They are paying for something.
On a free site, users do not expect to hand over private data. They simply refuse — and move on to other sites. Why wouldn’t they?
AV is instantly and effortlessly circumvented: porn remains accessible through search engines, social media, messaging apps, file-sharing (direct and peer-to-peer), VPNs, proxies, and an astronomical number of adult sites — it’s conservative to estimate there are over a million. Some users might even be tempted to turn to the dark web to escape this wave of state overreach — though we certainly don’t recommend it.
Not only is porn still available from all these channels — but the largest, most obvious mainstream platforms that host or link to porn are systematically exempted or spared enforcement.
That alone should make anyone question the real motivations behind these regulations. We’re told it’s about protecting children — but the sites most known and most used by children, and which host porn, are conveniently untouched?
What will happen?
We’ll have to implement AV wherever it is legally mandated. It’s not like we have a choice. Legal challenges were our only option — but now, even the courts have been swept up in the hysteria.
The largest established adult sites, such as ours, will be immediately destroyed.
We know that only about 10% of the user base will remain after AV is implemented — and the 10% who stay (thanks, by the way) are very costly to verify.
So much so that we expect to operate at a financial loss.
This will completely distort competition, as our visitors will switch to various other sites and services that did nothing to earn them. Preserving fair competition is one of the obligations of most states — but they simply don’t give a fuck about it. Right now, there are almost 3,000 (not an exaggeration) clones of our sites — not owned by us, but designed to look like our platforms, sometimes with a different makeover — stealing our content, and soon to be massively rewarded.
Regulators have no clue where people will go — but what’s likely is that users will scatter across so many sites, apps, proxies, and channels that they’ll become untraceable, guaranteeing the failure of future regulations. And unlike today, many of those new destinations will be dangerous, unmoderated, and openly hostile to enforcement.
People will massively move to VPNs (some of which are permanently free, albeit with slower speeds). You’d think that would absorb some of the losses, but VPNs nowadays integrate ad-blocking features, effectively ripping off content creators and us. Concretely, VPNs are pushing us even further into the negative.
Adult companies are already treated as barely tolerated, second-class entities, and every form of unfair treatment and discrimination has long been prevalent — in banking, for example. Digital exclusion through regulation was already a reality for adult sites — but AV will deepen this unequal treatment, pushing them out through reputational and compliance burdens.
So it will become impossible for any large, free adult platform to exist. That very business model is being driven to extinction.
A new landscape will emerge — devoid of established, safe, and large adult sites where legitimate content creators can showcase their work and be rewarded for it. They will be penalized along with us.
We also predict that if AV is forced onto smaller merchants, it will destroy them too — they won’t even get a chance to show their content before users bail out, unwilling to pass yet another fucking age verification. And these merchants won’t have a chance to present their content on our platforms either — like they used to — because we’ll be gone.
All the years of “trust and safety” efforts from the largest adult platform — all trashed.
Favoring verified channels → no more
Payouts to content owners → gone
Verifying all uploaded content → for nothing
Cooperation with authorities → useless with no users left
Furthermore, it’s only a matter of time before user databases are hacked and leaked. Even recently, huge platforms with the best technological capabilities had massive password databases leaked.
We also expect that at some point in the future (though it may take years) — especially once parents realize they’ve been SCAMMED, that their children are no safer online, and that charlatans lied to them all along — AV regulations will be exported to devices and app stores, while remaining an obligation for platforms, effectively forcing users to identify themselves over and over again.
It seems to genuinely please anti-porn activists to force adults to endure these repeated little humiliations — like having to do a face scan before viewing content.
The scam
Let’s be clear: AV enforcement has always targeted porn companies only.
What’s going on is obvious: “protecting children” is a false pretense. AV is being used to attack porn and those who watch it. It was never about children. It was always driven by anti-porn crusaders and control-obsessed ideologues.
These same people pretend to stand on a moral high ground, while lying through their teeth about their true intentions.
Citing “child protection” is an effective tactic to silence critics. If you oppose AV — even for sound, technical reasons — you’re immediately hit with emotional blackmail and cheap outrage.
Another “argument” we hear commonly to defend AV is: “At least we’re doing something.”
Yes — something ineffective, destructive, and stupid. Passing laws that don’t work, wasting everyone’s time, wrecking legitimate businesses and killing competition — all so some politician, bureaucrat, or activist can pose as a force for good.
We are the sacrificial lamb of this story — the scapegoat in a very bad political play.
We — and our content creators — are being sacrificed for nothing. User privacy is being sacrificed for nothing. Children and adults will actually be less safe after we are gone.
Announcements about site-based AV give parents the illusion that their kids are protected — when in reality, nothing changes. Porn will still be everywhere.
It’s an absolute joke — a placebo solution pushed by imbeciles who clearly don’t give a shit about minors.
Device-level parental controls have existed for years, and can actually block a million sites. But politicians can’t take credit for them. So instead of empowering parents, lawmakers give them a headline and a false promise.
Watch these pricks go on TV or social media and pat themselves on the back for “making the internet safer” — while anyone can still run a basic Google search and instantly find billions of porn images. Bald-faced liars.
The one “good” thing about AV is that it’s a clear sign of political incompetence. If your lawmakers passed this kind of law, you can be sure they’re either corrupt, lying to you, stupid — or some combination of all three. At least it makes them easier to spot.
It might make some sense — even if we’d still disagree (because there are better solutions, see below) — to enforce AV on major mainstream platforms to prevent accidental exposure to porn. But the hypocrites in charge are systematically and deliberately sparing those platforms.
What to do then?
We agree with Meta (Facebook, Instagram), Aylo (Pornhub), and many others that the only effective solution to the problem of minors accessing porn is a systematic approach — applied at the device level, or at least at a higher layer than individual sites, such as app stores.
There are many ways to implement AV at the device level — and there’s no question that any of them would be far more effective than site-based AV. Because they apply system-wide, device-level controls scale across all apps and browsers — not just one site at a time.
However, we don’t share Aylo’s position — or others’ — that Google and Apple need to implement something new. Traditional parental controls have existed for ages, they work, and their usage can easily be expanded, as we explain further below.
It’s worth noting that neither Google nor Apple seems interested in implementing AV directly in their operating systems. Instead, they apparently prefer to offload the responsibility to each individual service, app, or website — separately and redundantly. But their position might change. Things are happening as we write: on July 1, 2025, Google announced a partnership with an AV provider.
In any case, it’s mind-boggling that platforms are punished, adults are forced to take privacy risks, and everyone is expected to endure a disastrous user experience — all because we’re supposed to accept that it’s “too difficult” for parents to spend two minutes setting up parental controls.
And if some parents struggle with tech, the answer is education and support — not mass surveillance and regulatory theater.
In reality, this is just the state outsourcing its child-protection duties to private companies, who are now expected to police everyone else’s kids.
Shifting responsibility away from parents is exactly the wrong direction. Critics say we can’t rely on all parents — and that may be true — but there is a way to ensure all of them are involved.
That solution is simple: require all parents to install a parental control app — and have teachers verify its presence in school, once a year. The app could display a small icon on the home screen, making it instantly visible without needing to unlock the device.
We believe that’s the safest, most effective, and only sane global solution. Of course, opponents will say it’s “too difficult” — as if we don’t already expect parents to do far more, like checking homework or even just cooking for them. But then again, all reason has long since left this debate. “Child” is now a magic word that vaporizes logic the moment it’s spoken.
Are sex and porn really harmful?
The idea that adult content is inherently harmful to teenagers is a fallacy. In social sciences, it’s always possible to find a study that supports almost any conclusion.
In fact, Ofcom (the UK regulator overseeing AV) acknowledges that research into pornography’s impact on children is limited and inconclusive — prompting calls for further study.
It’s a debate very similar to the old panic over video game or movie violence — where the consensus today is that these things are not inherently harmful.
This is a textbook case of moral panic. We've seen it before — over rock music, comic books, Dungeons & Dragons, rap lyrics, and video games. Each time, fear was whipped up around the idea that a new form of media would corrupt “the youth.” And each time, the panic faded once it became obvious that society had not collapsed. Today’s AV push is just the latest version of that same irrational reflex.
It says a lot about a society that will move faster to restrict the viewing of sex — a universal and natural human drive — than it will to ban depictions of torture, murder, bombings, or decapitations.
Religions and ideologies have long exploited the universality of sexual desire to instill guilt — because guilt is a powerful tool of control.
Some claim that porn “isn’t real sex” — but porn is simply a representation of the fantasies that exist in our societies. And fantasies often reveal far more about who we really are than the polite masks we wear in the so-called “real world.”
In the following sections, we discuss what has happened in specific countries and regions.
The United Kingdom
In the West, it was the UK that first attempted to implement AV, starting around 2015.
But even before that, the UK had enforced ISP-level porn blocking by default. Users must log into their ISP or mobile provider account to opt out. These filters apply at the household or device level, meaning most internet connections in the UK are already filtered — unless someone deliberately disables the block.
So why add site-level AV on top of that?
The government claims it’s needed to catch edge cases — but in reality, it's about shifting responsibility away from parents and ISPs, and dumping it onto websites like ours. This, despite the fact that a parent already had to take conscious steps to enable access to adult content for their household, and was prompted to set up parental controls on every device they purchased.
Now, even after that explicit decision, users will be forced to go through site-level AV — potentially dozens of times — just to access the content they’ve already unlocked at the ISP level. This creates an absurd and invasive double burden, for no measurable gain in protection.
Critics have rightly called this system redundant, ineffective, and a threat to privacy. But the government pushed ahead anyway — not because it works, but because it looks like action. It’s political theater, designed to produce headlines, not results.
The timeline of the restriction of access to adult content in the UK:
Late 2013: Default ISP filtering begins across major providers.
2017: AV requirements introduced via the Digital Economy Act.
2019: AV implementation scrapped amid concerns over privacy, enforcement feasibility, and data security.
2020–2022: AV advocates and the Age Verification Providers Association lobby relentlessly for revival.
2022–2023: AV re-emerges under the Online Safety Bill, now part of a broader regulatory framework.
2023: Online Safety Act passes, embedding AV into law — despite having been shelved just four years earlier for good reasons.
July 2025: AV is scheduled to take effect.
Despite being scrapped in the UK for good reasons, AV has since been blindly copied by lawmakers in France, the US, Germany, Italy, and Spain — as if the failure never happened, in a race to the bottom.
One rare redeeming aspect is that the UK allows for SMS-based verification — one of the least intrusive verification methods, and the one our users preferred, according to a large-scale survey in France.
It remains to be seen whether the UK will apply AV intelligently (as much as a dumb measure allows) and universally, or simply use it to crush a few adult sites and declare mission accomplished. But if the past is any indication, we already know the answer.
The United States
At least 20 U.S. states have now passed age verification laws.
There were strong reasons to believe these laws were unconstitutional under U.S. law. As the ACLU (American Civil Liberties Union) noted: “The Supreme Court repeatedly heard cases on this issue in the past — many brought by the ACLU — and consistently held that requiring users to verify their age to access protected content is unconstitutional when less restrictive alternatives, like filtering software, are available.”
For example: Reno v. ACLU (1997), Ashcroft v. ACLU (COPA I, 2002), Ashcroft v. Free Speech Coalition (2002), Ashcroft v. ACLU (COPA II, 2004), and Florence v. Shurtleff (2012).
So the Free Speech Coalition (an adult industry group) challenged Texas’s law, taking the case all the way to the Supreme Court — once again.
Unfortunately, on June 27, 2025, a divided SCOTUS — with a majority led by Justice Thomas — ruled 6–3 against its own precedent, claiming that “times have changed” and lowering the standard for suppressing certain speech.
This 6–3 split, with conservative justices outvoting the liberal ones, has become common in recent rulings.
In a forceful dissent, Justice Kagan argued that Texas’s AV law invades privacy by requiring ID or personal data submission to unknown entities, chills lawful speech — as adults will self-censor rather than risk exposure — and should be held unconstitutional under strict scrutiny due to its broad and intrusive design.
We previously wrote that SCOTUS was allowing AV with minimal constraints — but in reality, it’s more like no constraints. Judge for yourself:
Privacy: States are not required to offer privacy-preserving options. ID upload mandates are permitted without limits on data retention or third-party handling.
Speech burden: The Court dropped strict scrutiny in favor of intermediate scrutiny, meaning heavy burdens on adult access are tolerated.
Efficacy: States are not required to show that AV is effective, nor to prefer less restrictive alternatives like parental controls.
In short, the ruling removes nearly all federal constitutional barriers to AV laws — unless they are wildly overbroad or irrational, which is a very high bar.
We are particularly shocked that SCOTUS now allows individual states to pass laws known not to work — even when better alternatives exist — effectively protecting state-level incompetence and/or hidden agendas.
The emotional yet empty “do it for the kids” argument was successfully used to strip away your rights.
Now, minors and adults alike will be durably less safe, with less freedom and more state control.
By the way, Texas’s penalties are so extreme — many times higher than in any other state — that they cross the line into punitive, chilling, and constitutionally disproportionate enforcement.
Based on Texas’s insane $10,000-per-day penalty, and assuming one million adult sites currently lack AV, fines would accrue at roughly $10 billion per day — meaning that in under two years, the State of Texas stands to gain ~$6,000,000,000,000 — that’s 6 trillion dollars — more than what the entire industry has ever made, worldwide, and counting. As we said before: all reason is long gone.
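For the skeptics, here is the back-of-the-envelope math behind that figure. The per-day fine and the one-million-site estimate come from the discussion above; the ~600-day accrual window is our own assumption, chosen to show how quickly the quoted ~$6 trillion is reached.

```python
# Hypothetical penalty math, sketched under stated assumptions:
# the per-site daily fine and site count are from the text above;
# the 600-day window (~20 months) is an illustrative assumption.
PENALTY_PER_DAY = 10_000     # Texas fine per non-compliant site, per day ($)
SITE_COUNT = 1_000_000       # conservative estimate of adult sites lacking AV
DAYS = 600                   # assumed accrual period (~20 months)

daily_exposure = PENALTY_PER_DAY * SITE_COUNT  # total fines accruing each day
total_exposure = daily_exposure * DAYS         # cumulative exposure

print(f"${daily_exposure:,} per day")   # $10,000,000,000 per day
print(f"${total_exposure:,} total")     # $6,000,000,000,000 total
```

In other words, the theoretical liability crosses the trillion-dollar mark after just 100 days — which is exactly why we call these penalties punitive rather than corrective.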
Now consider this: Texas (and other states) act tough on adult companies in the name of protecting minors. That much is certain.
But they also voted to exempt search engines and social media — so those platforms can happily continue distributing as much porn as they want. And those are exactly the sites minors use from a young age. Isn’t that “wonderful”? Hypocrisy? Charlatanism? You be the judge.
France
French AV is the poster child for everything that’s wrong with age verification — the worst implementation imaginable.
This idiotic law was passed by so-called moderate political parties (first and foremost the party of President Macron), while so-called “extreme” parties, both left and right, opposed it with reasonable arguments.
We were initially attacked even before the law came into effect, as shown in our previous blog post (in French). Later, we were attacked again for failing to implement AV — even though the guidelines hadn’t yet been produced. Absolute hysteria. They COULD NOT WAIT to go after us. For some reason, it tends to be people from your own country who plant the knife in your back.
ARCOM — the body in charge of defining platform obligations and oversight — demands that users be verified on every single visit or session. This multiplies AV costs and creates unnecessary friction that drives users away. They do not care in the slightest.
Another incredibly dumb feature of France’s AV system is its ban on credit card use — even though credit cards are one of the best tools for merchants, enabling them to both verify age and acquire a paying customer at the same time.
Their so-called “double anonymity” standard is a lie too. Platforms are required to pay AV providers — so of course those providers know which site the user is verifying for.
Since the initial law was passed in 2020, we’ve heard policymakers and AV proponents say again and again that they “don’t care how it’s done.” They are blinded by one goal: to see us destroyed. We imagine them foaming at the mouth, eagerly anticipating our demise through “regulation.”
The French government also shows zero concern for the disruption of competition.
What’s particularly disturbing is that French journalists never ask how AV will actually protect minors — or whether it’s normal for any free site implementing it to instantly lose almost all of its users. No one questions why the government consistently protects the biggest mainstream platforms, which are flooded with porn. Instead, they repeat simplistic talking points like “it’s just like showing your ID at a club” — a comparison that is utterly absurd when applied to the internet.
All of this is reinforced by years of media-driven demonization of porn — particularly in left-wing newspapers and state-funded TV and radio. We documented the many lies in just one French TV broadcast in a previous blog post.
Just in the past few weeks, we’ve been accused in three separate French news articles of:
Being cybersquatters (Libération),
Being spies with ties to Ukraine (Intelligence Online),
Producing “extremely violent” content (Mediapart).
All of these claims are complete fabrications — but French journalists are so protected that suing them is a pointless exercise. In court, they don’t even have to prove that what they wrote is true — only that they “worked semi-seriously” on their so-called reporting. It’s a very low standard that French courts have repeatedly upheld, for the sake of protecting real journalism.
We’ve also been relentlessly harassed by the French state in recent years — clearly as revenge for protesting their idiotic laws.
Know this: the French state is riddled with extremist ideologues, including institutions like the Haut Conseil à l’Égalité entre les femmes et les hommes, which published a report suggesting that all porn is criminal. For some reason, only radical, misandrist feminists seem to be appointed to lead that agency.
The idea that porn should be restricted because it is supposedly violent or harmful is repeated in every country — but nowhere is that narrative pushed harder than in France, where the state holds a particularly regressive view of sex work. They want it completely marginalized — failing to understand that it is the opposite policies that would clean up both the streets and the websites.
When Pornhub exited France, former “Minister of the Internet” (now Foreign Minister) Jean-Noël Barrot publicly celebrated, saying “Good riddance.” The French government doesn’t care that the most regulated websites are being driven out. That tells you everything: their goal is not safety — it’s the dismantling of the adult industry, which they treat as criminal.
In short, the French state has proven itself deeply incompetent — and the French media no better. We are talking about hundreds of people in Parliament, in the press, and in regulatory bodies who have either failed to recognize the glaring flaws of AV — or have willfully closed their eyes.
The European Union
As you may have heard, the EU has passed far-reaching laws to supervise and control internet companies, known as the Digital Services Act (DSA), which imposes extra obligations on so-called Very Large Online Platforms (VLOPs).
We were designated as a VLOP — based on inaccurate traffic data (reliable figures were impossible to compute due to incognito sessions) — but chose to engage constructively, expecting fair and equal treatment. We were wrong.
This forced us to allocate significant resources adapting to the DSA — especially to meet its transparency obligations, which we accepted in good faith, recognizing the demand for more openness. As for matters like content moderation and user safety, we were already exceeding expectations.
Then, on May 27, 2025, the EU Commission launched a loud, preemptive public “probe” into adult VLOPs, including us — without any prior warning or dialogue. The issue at hand? Alleged failures to protect minors.
Instead of contacting us directly — as they reportedly did with non-adult platforms facing other regulatory concerns — they tipped off major media outlets in advance. Some journalists were better informed than we were, even before we received formal notice.
This happened even before the EU’s own consultation on AV policy had closed (deadline: June 10, 2025). In other words, we were being punished for not solving a problem the EU was still seeking advice on.
This wasn’t about solving problems anyway. It was a reputational hit campaign, engineered for maximum media exposure rather than constructive engagement. A textbook case of double standards: one for politically untouchable platforms, and another for us.
Judging by this level of hostility, it’s clear the EU is eager to harm the few European VLOPs that exist.
But what is this really about?
Under the DSA, VLOPs must assess and mitigate so-called “systemic risks” — including risks to democratic processes, public health, minors, and fundamental rights. In theory, this sounds reasonable. In practice, it’s dangerously vague.
Critics have warned that the DSA’s “systemic risk” rules force platforms to confess to vague, undefined dangers — with no clarity on how disclosures will be judged. The EU can then claim: “You admitted the risk but didn’t fix it,” and impose crushing fines.
This setup invites regulatory overreach, political pressure, and selective punishment — all without proper safeguards.
Especially in the U.S., experts have described this as granting the EU quasi-judicial powers over global platforms, which can be abused to push agendas on sensitive topics like misinformation, gender identity, or adult content.
That’s exactly what’s happening. The Commission is twisting vague systemic-risk provisions to punish us for not implementing something as risky as site-level age verification (AV) — despite AV being unproven, unscalable, and deeply problematic.
This also blatantly violates the EU’s own principle of proportionality: AV is ineffective, burdensome, and invasive. But by targeting adult platforms first, the Commission likely expects less resistance from the courts than if similar pressure were applied to mainstream giants.
The Commission also said it will “encourage” national authorities to go after smaller adult sites too. Should that happen, we expect an exodus of European adult companies out of the EU.
Meanwhile, in a shocking display of selective enforcement and discrimination, other VLOPs — many of which host vast amounts of adult content — remain untouched, even though they’ve been under DSA supervision for longer than us.
Henna Virkkunen (Executive Vice-President of the European Commission) stated: “Our rules are very fair, because they are the same for everybody.” But rules mean nothing if enforcement is politicized or selectively applied.
This bias raises serious questions: Is it ideology? Incompetence? Lobbying influence? Or something else entirely?
Even worse: the Commission is pushing AV mandates before its own official “Digital Identity Wallet” — which includes a built-in age verification mechanism — is even functional. So why attack platforms now?
Perhaps this is about forcing the adoption of the EU’s future tools, by creating artificial crises today.
Former EU Digital Commissioner Thierry Breton publicly reprimanded France twice for pursuing a national AV system, saying the EU was working on a unified approach. But if that’s the case, why undermine platforms before that system exists?
The EU Commission is upholding its reputation for ideological rigidity and lack of pragmatism. It's also increasingly viewed as adopting totalitarian measures.
In November 2024, Věra Jourová (of the Czech Republic), then outgoing EU Vice-President, cautioned the Commission against “overreaching” and “nannying” citizens. We wish she had been heard.
Conclusion
The current moral panic about porn (and social media) is showing the limits of the political and media systems of the West — when even a relatively simple issue like minors accessing adult content is handled this poorly.
We say “the system” rather than individual people, because the outcome was nearly identical in every country, regardless of political leaning.
And by “poorly,” we mean without any rational, pragmatic approach.
The debates were dominated by emotional appeals and pathetically flawed reasoning, delivered by ever-panicking actors — while the many logical arguments against AV were simply ignored.
This wasn’t policymaking — it was media-fueled hysteria, where fear, outrage, and clickbait replaced evidence, expertise, and basic logic.
Anti-porn activists played a central role. They’ve been trying to nuke porn for decades — and now they’ve partly succeeded.
Their tactics relied on misrepresenting AV’s effectiveness and attacking the morality of anyone who dared to question their “protect the children” narrative.
They pushed their agenda without the slightest accountability for blatant falsehoods — like claiming “AV works.”
The press, far and wide, didn’t challenge any of it. Journalists let grotesque claims go unchecked — abandoning the scrutiny they claim to stand for.
But of course — they were likely too busy with their usual political propaganda routines.
Some even went out of their way to demonize porn sites — fabricating storylines and spreading outright lies.
Watching all this unfold, we don’t even want to imagine how more complex problems are being handled — there’s every reason to believe it’s even worse.
Ideology has replaced common sense entirely.
It’s long been our view that the way a society treats porn is a strong indicator of how free it truly is. And on that front, we can say one thing with certainty: freedom is regressing everywhere in the West.
Censorship is rampant — but that’s a topic for another time, if anything is left by then.
TLDR
The idiocracy has won. AV is coming into force across multiple countries — many around the same time. We'll be forced to verify your age, and we already know we'll lose almost all our users in the process.
Only a few sites like ours are being targeted, so porn will remain available everywhere else. Minors won’t be safer — just redirected to social media platforms or darker, unregulated corners of the internet.
This is the result of an ongoing moral panic, carried by dishonest ideologues, opportunistic politicians, and a media class that thrives on fear and outrage.
We're witnessing censorship disguised as “protection,” incompetence dressed up as virtue, and a total collapse of rational policymaking. And everyone will pay the price.