Chatcontrol: global surveillance of communications
The European Commission's proposal to "protect children online" promises to turn the EU into the largest surveillance regime in the world.
Today we shall talk about how the European Union wants to create the worst mass surveillance regime ever seen in the West.
Conspiracy? Science fiction? Nah, that's the aim of a law just proposed by the European Commission. Informally it is known as Chatcontrol 2.0, but its current full title is the “Regulation laying down rules to prevent and combat child sexual abuse”.
I will try to explain in simple terms what Chatcontrol is about, how this crazy, totalitarian surveillance regime will work, and whether it is possible to protect yourself.
The story so far
Before going into the details, let's quickly summarize how we arrived at Chatcontrol 2.0.
In March 2020, the United States, Australia, the United Kingdom, New Zealand and Canada signed an international agreement, the "Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse". It set out the principles that should guide governments, companies and law enforcement agencies in the fight against child abuse and online child pornography.
In July 2020, the European Commission announced its new strategy (2020-2025) to fight child abuse and online child pornography. The strategy's objective was “Building a strong legal framework for child protection and facilitating a coordinated approach among all actors involved in child protection”.
The regulation we are talking about today was preceded in 2021 by an interim regulation, which we can call Chatcontrol 1.0.
There wasn't actually much to Chatcontrol 1.0. It was a derogation from the ePrivacy Directive that allowed communication service providers to voluntarily scan unencrypted messages (such as those on Facebook Messenger) to identify and report child pornography content (which from now on I will call CSAM [1]).
What Chatcontrol 2.0 is about
Officially, Chatcontrol 2.0 was created to protect minors from online abuse, counter the spread of child pornography and create a "safe, predictable and trusted" online environment.
But behind the 135-page wall of legalese and technicalities lurks a system of mass surveillance that would be the envy of China.
First of all, the law will oblige every hosting and communication service provider and app store to identify, analyze and evaluate the risk that their services could be used to abuse minors in any way or to disseminate child pornography content.
Risk assessment is the step that precedes the implementation of mitigation measures, such as content moderation, age verification (and therefore identification) of users, and specific surveillance mechanisms.
In some cases the assessment will not even be necessary: for certain services, such as communication services, the law itself says that a high risk must always be assumed.
The icing on the cake will be the creation of a European Center on Child Sexual Abuse, which will be the hub of the whole system. It will be based in The Hague, close to Europol, with which it will collaborate and whose human and technological resources it will draw on.
How European mass surveillance will work
As for the surveillance activities, which are of interest to us, there will be two types:
1. Surveillance and analysis of communications, to prevent the risk of online grooming.
2. Surveillance and analysis of photos and videos, to prevent the spread of child pornography (CSAM).
The first concerns the text of our private conversations. But how do you identify, among billions of conversations, those rare cases where someone is really trying to abuse a minor? It's not that easy.
This is why the legislator wants to force communication service providers to use tools to scan and analyze communications in an automated way, with machine learning algorithms:
“[…] the detection process is generally speaking the most intrusive one, since it requires automatically scanning through texts in interpersonal communications. […] such scanning is the only possible way to detect it […]. Detection technologies have also already acquired a high degree of accuracy, although human oversight and review remain necessary”.
The algorithms' job will be to work out whether we are chatting with a friend or instead making sexual advances towards a child.
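To make the idea more concrete, here is a deliberately naive sketch of what such an automated text-scanning pipeline could look like. The training examples, the model and the flagging threshold are all invented for illustration; the real tools referenced by the proposal are proprietary and far more sophisticated.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data, invented purely for illustration.
conversations = [
    "hey, are we still on for football practice tomorrow?",
    "don't tell your parents about our chats, it's our secret",
]
labels = [0, 1]  # 0 = harmless, 1 = grooming risk

# A minimal text classifier: bag-of-words features + logistic regression.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(conversations, labels)

FLAG_THRESHOLD = 0.8  # arbitrary cut-off: above this score, the message goes to human review

def scan(message: str) -> bool:
    """Return True if the automated system would flag this private message."""
    grooming_risk = model.predict_proba([message])[0][1]
    return grooming_risk >= FLAG_THRESHOLD

print(scan("see you at school tomorrow"))
```

However such a system is built, it assigns a score to every single private message and forwards the ones above some threshold, which is exactly where the errors come from.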
But algorithms are certainly not infallible. The risk of being flagged as a sexual predator by an automated algorithm is real and quite concrete: as the text of the law itself acknowledges, even the most advanced algorithms do not exceed 88% accuracy.
This is where humans come into play. Human review will be a fundamental piece of the whole puzzle: there will be people tasked with deciding, after the algorithm flags us, whether or not we are paedophiles.
About 450 million people live in the EU. With an average error rate of 10% (to be very optimistic), and assuming everyone's communications are flagged for review at least once a year, up to 45 million innocent people could be wrongly reported as paedophiles, criminals, sexual predators. Good luck with that.
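Here is the same back-of-the-envelope calculation as a minimal sketch; the population, the share of people whose communications get reviewed and the error rate are assumptions taken from the reasoning above, not figures from the proposal.

```python
# Back-of-the-envelope estimate of wrongly flagged people.
# All inputs are assumptions, not figures taken from the regulation.

eu_population = 450_000_000    # roughly the EU-27 population
share_reviewed_per_year = 1.0  # worst case: everyone has at least one flag reviewed each year
error_rate = 0.10              # optimistic error rate for automated detection

wrongly_flagged = eu_population * share_reviewed_per_year * error_rate
print(f"Innocent people wrongly flagged per year: {wrongly_flagged:,.0f}")
# -> Innocent people wrongly flagged per year: 45,000,000
```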
Monitoring and analysis of photos and videos
The second type of surveillance concerns the analysis of photos and videos. Here the discussion becomes very technical, and the law does not specify exactly how this is supposed to be done.
But I can say with certainty [2] that images and videos transmitted through communication services or stored in the Cloud will, in this case too, be analyzed by machine learning algorithms. This time, however, the algorithms will not have to understand anything: they will simply scan the contents and create "hash digests" [3], which will then be compared against an external database of hashes of already known child pornography content.
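As a purely illustrative sketch of this digest-and-compare idea: the snippet below uses an ordinary cryptographic hash (SHA-256), whereas real systems use perceptual hashes such as PhotoDNA or NeuralHash, which are designed to survive resizing and re-encoding; the database entry is a placeholder, not a real one.

```python
import hashlib

# Placeholder database of digests of already known material; in reality such hash
# lists are maintained by child-protection organisations and shared with providers.
known_hashes = {
    "b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9",  # sha256(b"hello world")
}

def digest(content: bytes) -> str:
    # A fixed-length "fingerprint" of the content (SHA-256 here, purely for illustration).
    return hashlib.sha256(content).hexdigest()

def is_known_content(content: bytes) -> bool:
    # Compare the fingerprint against the external database of known hashes.
    return digest(content) in known_hashes

# A provider would run something like this on every transmitted or uploaded file:
print(is_known_content(b"hello world"))  # True, because its digest is in the placeholder database
```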
There are many ways to do this.
Apple, for example, proposed a way last year to scan content directly on the device, then send the hashes along with the messages, as if they were attachments, thus simplifying the process of scanning and detecting CSAM.

[Figure: technical scheme of how such a system could work]
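This is not Apple's actual protocol, which involved NeuralHash, blinded hash databases and threshold secret sharing, but a minimal sketch of the general client-side scanning idea it illustrates: fingerprint the content on the device before it is encrypted, attach the fingerprint to the outgoing message, and let the provider do the matching.

```python
import hashlib

# Placeholder database of digests of known material held by the provider.
known_hashes = {"placeholder-digest"}

def fake_encrypt(data: bytes) -> bytes:
    # Stand-in for real end-to-end encryption, purely for illustration.
    return data[::-1]

def client_send(image: bytes) -> dict:
    # On-device step: hash the content BEFORE it is end-to-end encrypted,
    # and attach the digest to the outgoing message as metadata.
    return {
        "ciphertext": fake_encrypt(image),                  # unreadable to the provider
        "content_hash": hashlib.sha256(image).hexdigest(),  # readable by the provider
    }

def server_check(message: dict) -> bool:
    # Provider-side step: match the attached digest against the database
    # without ever decrypting the message itself.
    return message["content_hash"] in known_hashes

msg = client_send(b"...raw image bytes...")
print(server_check(msg))
```

The point, and the reason privacy advocates object, is that the matching happens around the encryption rather than through it: the content still travels encrypted, but a fingerprint of it is exposed to the provider anyway.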
In this case too there would be errors. According to the Swiss Federal Police, more than 87% of the reports generated by hashing mechanisms are irrelevant [4].
But the specific technical solution matters little. What matters is that Chatcontrol creates a regime of systematic surveillance of private communications, images and videos, both directly on devices and through communication services (online platforms, chats, emails, etc.) or Cloud hosting. It will be the end of any private communication. There will no longer be any safe space.
The War on Cryptography
It is not by systematically monitoring 450 million people that children will stop being victims of violence.
Why not invest resources in actually fighting the phenomenon, instead of surveilling 450 million innocent citizens as if they were criminals? Or does the Commission perhaps think we are all potential paedophiles?
At the risk of sounding like a conspiracy theorist, the answer is very simple: the goal is not to combat child abuse but to combat encryption.
It is no coincidence that the EU announced its strategy just a few months after the 5 Eyes [5] (United States, Canada, UK, New Zealand and Australia) signed an international agreement on the fight against online child abuse.
For years, the members of the 5 Eyes have been pushing politically to limit the spread of encrypted communications. Having failed, they are now trying by every means to bypass encryption through technological and legal tools, such as the one Apple proposed last year (NeuralHash).
And it is no coincidence that in November 2020, right after the announcement of the European strategy, a statement from the EU Council came out entitled "Security through encryption and security despite encryption", which described encryption as an obstacle to the prosecution of crimes.
And finally, it is no coincidence that the new European Center will be closely linked to Europol, which for some years has been making no secret of its anti-cryptography agenda. It is impossible not to see the convergence of intents.
Just as terrorism justified the US Patriot Act in 2001, one of the worst mass surveillance laws of the last 20 years, child pornography today is the Trojan horse to push the West towards a global surveillance regime.
Chatcontrol will also have a deterrent effect on end-to-end encryption. Vendors will have to choose between protecting user privacy and protecting themselves from fines. What do you think they will choose?
Can we protect ourselves?
It will not be easy to protect yourself from such a surveillance system.
End-to-end encryption of communications will only help up to a point because, as we have seen, there are already tools designed to bypass that protection by scanning content directly in the device's memory, at the operating-system level.
The only option in that sense would be to install an open-source operating system such as GrapheneOS or CalyxOS, use p2p communication services such as Session (which can hardly be forced to adopt surveillance technologies), and rely on private or decentralized Cloud storage and file transfer systems (such as Skysend or OnionShare). But the problem remains: taking refuge in technology does not solve it.
Moreover, it will be difficult to convince the billions of people who today communicate on Twitter, Instagram, Facebook, WhatsApp and Telegram to switch to different systems that also require a certain level of awareness and technical expertise.
The best option is to ensure that Chatcontrol 2.0, like many other laws, never becomes a reality, by making people understand what is going on and by discouraging politicians from pursuing the path of mass surveillance.
I hope that together we can help as many people as possible to understand in which direction we are going, to take a strong turn before it's too late.
1. CSAM = Child Sexual Abuse Material
2. The certainty comes from having read technical documents of the European Commission describing a dozen technical options for achieving this result.
3. In practice, a unique string of characters obtained from a given input (e.g. a photo).
4. https://www.patrick-breyer.de/en/posts/messaging-and-chat-control/
5. The largest spy alliance in the world.