Chatcontrol: year zero
What the European Union doesn't want you to know.
On November 26th, 2025, after three years of tug-of-war, COREPER (the Committee of Permanent Representatives of the Council of the European Union) approved the final text of the "Chatcontrol" Regulation. It will soon be presented to the European Parliament, the next step in the legislative process leading to its adoption as law.
I've been writing about Chatcontrol, or more precisely the Child Sexual Abuse Regulation, since 2022.
In short: the European Union claims to be "fighting online child sexual abuse" at any cost, by transforming Europe's entire digital communication infrastructure into a semi-automated machine for mass surveillance and suspicion profiling targeting 500 million people.
When this Regulation becomes law, the Internet will undergo an ontological mutation, becoming a ritualistic apparatus of preventive surveillance. Chatcontrol will neither save children nor stop child abuse online, but every digital gesture will become a potential clue in an endless hidden investigation.
We will all become potential suspects, permanently under surveillance. You deserve to know about it.
Scope and Applicability
Chatcontrol presents itself as a law to prevent and combat the dissemination of child sexual abuse material (CSAM) online. To do this, lawmakers want to build a permanent infrastructure of identification and mass surveillance within every system of online communication and interaction. The idea is that CSAM and grooming attempts can be intercepted quickly.
In reality, Chatcontrol designs a computational Panopticon that dissolves the line between private and public life.
Everything we use to communicate or search online will be subject to this apparatus, ready to observe and judge every whispered message, every shared image, every transmitted video:
Hosting (cloud services, social platforms, file sharing)
Messaging services, even secondary ones (e.g., in online games)
App stores
ISPs
Search engines
Detection Orders: activating the Panopticon
Providers falling under Chatcontrol can be forced by authorities to scan their services for known or new CSAM, or grooming attempts. This means they must install surveillance and data-extraction technologies, always ready for activation.
When a Detection Order is issued, providers must:
Identify involved users, including their location
Transmit all potentially relevant data to authorities: IP addresses, timestamps, device identifiers, metadata, and of course the content itself (text, images, videos, audio).
How far back these orders can reach is unclear. One can assume: very far.
Who can be forced to scan communications?
Only "high-risk" services, or those for which there is reasonable suspicion of CSAM or grooming, can be compelled to scan user communications.
However, a simple reading of the text reveals that most of what we use daily will be classified high-risk by default:
Social media
Electronic communication services
Online gaming
Adult platforms
Forums and chatrooms
Marketplaces
File storage/sharing services
Web hosting
Online search engines
Services targeting minors
Providers must also assess, for risk-scoring purposes:
how many minors use the service
users' behavioral patterns (mass profiling)
the ability to share images and videos
Risk automatically spikes if a platform cannot identify at least 60% of its users, for example where identity verification is absent or disposable accounts are allowed.
Anonymity and pseudonymity are classified as risk factors, and platforms will do everything possible to eliminate that risk. Combined with age-verification obligations, the result is catastrophic for online freedom.
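The Regulation lists these risk factors but does not define a scoring formula; that is left to providers and the authorities reviewing their risk assessments. Purely as a hypothetical illustration, the sketch below shows how the criteria above, including the 60% identification threshold, could be combined into a "high-risk" classification. Every weight and threshold in it is invented for the example.

```python
# Hypothetical illustration only: the Regulation lists risk factors but no
# scoring formula. All weights and thresholds below are invented to show how
# the criteria above could push a service into the "high-risk" bucket.

from dataclasses import dataclass

@dataclass
class ServiceProfile:
    minor_user_share: float       # estimated fraction of users who are minors
    allows_media_sharing: bool    # users can exchange images and videos
    allows_private_messaging: bool
    identified_user_share: float  # fraction of users with verified identity

def is_high_risk(p: ServiceProfile) -> bool:
    score = 0
    if p.minor_user_share > 0.05:
        score += 1
    if p.allows_media_sharing:
        score += 1
    if p.allows_private_messaging:
        score += 1
    # Anonymity/pseudonymity counted as a risk factor: the 60% identification
    # threshold mentioned in the text is used here as a heavy penalty.
    if p.identified_user_share < 0.60:
        score += 2
    return score >= 3

# Practically any mainstream platform trips the threshold:
forum_like = ServiceProfile(minor_user_share=0.10,
                            allows_media_sharing=True,
                            allows_private_messaging=True,
                            identified_user_share=0.05)
print(is_high_risk(forum_like))  # True
```

Whatever the exact weighting, open signups plus private messaging plus image sharing is enough to land in the high-risk bucket, which is why the list above covers most of what we use daily.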
How scanning will be performed
The law does not impose a specific technology. Over the years, various proposals emerged:
On-device scanning (Apple's NeuralHash; see the sketch after this list)
Hybrid systems
Automated AI scanning of content (text, audio, video)
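How would such scanning work in practice? For known material, the standard approach is perceptual hashing: a compact fingerprint of each outgoing image is compared against a database of fingerprints of previously identified CSAM. The sketch below is a generic, hypothetical illustration of that matching step, not Apple's actual NeuralHash pipeline (which pairs a neural feature extractor with cryptographic private set intersection, as publicly described by Apple); the hash values and the threshold are placeholders.

```python
# Generic illustration of "known material" detection via perceptual hashing.
# The database entry and the threshold are placeholders, not real values.

KNOWN_HASHES = {0x9F3A5C0177E2B410}  # fingerprints of previously flagged images
MATCH_THRESHOLD = 8                  # max differing bits still counted as a match

def hamming_distance(a: int, b: int) -> int:
    """Number of bits in which two 64-bit perceptual hashes differ."""
    return bin(a ^ b).count("1")

def is_flagged(media_hash: int) -> bool:
    """True if the hash is 'close enough' to any known entry."""
    return any(hamming_distance(media_hash, known) <= MATCH_THRESHOLD
               for known in KNOWN_HASHES)

# A hash differing in only 3 bits from a database entry still matches. That
# tolerance is what makes perceptual hashing robust to re-encoding and
# resizing, and also what produces matches on entirely unrelated images.
print(is_flagged(0x9F3A5C0177E2B417))  # True
```

Detecting new material or grooming attempts cannot be done by hash matching at all; that is where the automated AI classifiers come in, and it is their error rate that matters most.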
The latest reports put the accuracy of these systems at around 87-88%.
Sounds high? Think again. Let's assume 400 million active digital users in the EU: even with a false-positive rate of just 10% (lower than the reported accuracy implies), we're talking about 30-40 million innocent people wrongfully flagged every year as sexual predators, based on misinterpreted chats, selfies, and family photos.
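Here is the back-of-the-envelope arithmetic behind that range. Every input is an assumption taken from the paragraph above, not an official statistic; the spread between 30 and 40 million comes from how many users you assume actually exchange scannable content.

```python
# Back-of-the-envelope check of the 30-40 million figure. All inputs are
# assumptions taken from the text, not official statistics.

eu_users = 400_000_000       # assumed active digital users in the EU
false_positive_rate = 0.10   # assumed: 1 in 10 scanned users gets at least one
                             # false flag per year (87-88% accuracy implies more)

for scanned_share in (0.75, 1.0):   # share of users whose content gets scanned
    flagged = eu_users * scanned_share * false_positive_rate
    print(f"{flagged:,.0f} innocent people flagged per year")
# 30,000,000
# 40,000,000
```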
Permanent scanning means that our most fragile spaces, the ones where identity, affection, and desire take shape, will be combed through and judged by foreign agents.
Whether it happens on your device, in the cloud of your favorite social network, or inside your gaming chats...
Somewhere, someone will read your private messages and scroll through your intimate photos. Yours, and your children's. Kids are not excluded. This law is "for them," after all.
Who gets the privilege of exemption
There are exceptions to this mass-surveillance nightmare, but not for you:
National-security communications
Private corporate or governmental networks (intranets)
In other words: everyone is surveilled except corporations, politicians, military, and bureaucratic elites.
How the Internet will change
Let's get practical: how will the European Internet look after Chatcontrol?
All anonymous or pseudonymous spaces, the places where you can talk about fears, politics, and sexuality without handing over a name and passport, will automatically become "high-risk."
In other words: they will gradually disappear.
For platforms like Reddit or Discord this means:
Age and identity verification at signup (usually via biometric facial scanning)
Active surveillance on private messages
VPN bans, no more multi-accounts
Extensive collection of sensitive data (e.g., device geolocation)
Telegram? Even worse. Extremely high-risk (also politically), and thus a primary target for cascading Detection Orders. Expect:
ID and biometric checks
Geofencing and IP-location enforcement
Deep surveillance of every channel and group
Even WhatsApp won't escape. Meta already collects mountains of metadata. Chatcontrol simply legalizes deeper content access.
The insult added to injury
Chatcontrol is not just mass surveillance.
It is a ritual of submission to overseas digital empires.
European lawmakers, who love preaching "digital sovereignty," are handing the role of moral guardians to American and Chinese mega-platforms.
So much for GDPR's promised paradise.
We keep pretending the European Union is a haven of privacy and civil rights, while walking in formation toward our collective digital burial site.
What can ordinary people do?
Chatcontrol marks the beginning of an era where privacy becomes a technique of survival.
If Europe wants to turn the digital world into a laboratory of predictive profiling, the only reasonable response is to become more competent than the systems watching us.
This means adopting what some would label crypto-anarchism, but which is now simply the defense of human dignity:
take control of your hardware and software
migrate your communications to truly encrypted, decentralized systems
protect data, identity, and habits through correlation-minimizing tools
learn the operational craft of privacy
A toolbox already exists: sovereign operating systems, secure protocols, strategies of digital opacity. Some are gathered in the Digital Grimoire, others will soon arrive in a dedicated manual for Cyber Hermetica initiates (subscribers).
You don't need to become invisible. No one can anymore. You just need to remain unpredictable.
If this resonated with you, share it!


