Yaël Ossowski is the deputy director at the Consumer Choice Center, a leading advocate of privacy rights and freedom of expression, with offices in North America, Europe, and Asia.

He spoke with europeanconservative.com’s Rafael Pinto Borges about ‘Chat Control,’ the controversial electronic messaging surveillance law currently under discussion in the European Council. The Council is expected to formalize its position on the Danish version of Chat Control on September 12th in preparation for a vote on October 14th.
Described as the EU’s attempt to end private messaging, Chat Control would mandate the scanning of all electronic communications—including encrypted messages, before encryption—ostensibly as part of efforts to combat child sexual abuse material. First introduced in 2022, the proposal has faced strong opposition from several member states—and with good reason, says Ossowski.
At its core, the Chat Control law proposes scanning all private communications ‘for safety.’ Many critics see this as digital mass surveillance. How do you view the law in terms of proportionality and basic civil liberties?
The EU’s Chat Control regulation has an intended purpose, namely to compel messaging providers to detect and report child sexual abuse material (CSAM) and to flag and disable the accounts sharing it, with police agencies in the member states acting as the liaisons. It imposes a new ‘duty’ on private providers of encrypted software that effectively breaks the encryption that currently protects private messages from being read by anyone other than their intended recipient.
Once again, this takes an egregious example that any rational person would abhor, images of children being abused, and uses it as justification to grant police new powers to eavesdrop on private communications. The regulation purports to be a crime-fighting tool, but police already have the discretion to seek judicial orders to gain access to devices. Realistically, this shifts the Overton window, normalizing the idea that state institutions may take a peek into our devices, a power that will inevitably be abused to put otherwise innocent people in legal jeopardy.
Supporters of Chat Control frame it as a child protection measure. How do you respond to the idea that opposing it means putting privacy above safety—and is that framing deliberately misleading?
It’s misleading because the power being sought, forced client-side scanning and detection of certain material, effectively breaks the encryption that underpins the most widely used modern messaging apps. There is no discussion of how police authorities already have immense power under existing legal tools and processes.
The European authorities want to implement backdoors to end-to-end encryption, which means we would no longer have end-to-end encryption at all but a deliberate ‘man-in-the-middle’ attack meant to ensure no offensive content is shared between people. Again, the intended effect is to catch CSAM, but realistically, nothing would stop national authorities from also mandating searches for various violations related to speech, vaccines, misinformation, or otherwise legal but offensive content. The regulation opens the door to using this power, which means it could easily be adapted by any member state to combat illicit markets or any other type of crime, real or perceived.
The proposal essentially mandates ‘client-side scanning.’ What risks does this pose not only for ordinary citizens, but also for journalists, activists, and political dissidents who rely on secure communications?
Client-side scanning means that before content is delivered from your device to the intended recipient, it is screened against a database of offensive or illegal material. In a normal end-to-end encrypted conversation, each party to the message has a private and a public key. Forcing content to be filtered before it is encrypted defeats the purpose of that private-public key pair, which is what ensures content can only be read by its intended recipient.
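To make those mechanics concrete, the following is a minimal sketch of end-to-end encryption with per-user key pairs, using the open-source PyNaCl library; the names and the report_if_flagged hook are illustrative assumptions, not anything prescribed by the regulation or described in this interview.

```python
# Illustrative sketch only: end-to-end encryption with per-user key pairs,
# using the PyNaCl library (pip install pynacl). Names are hypothetical.
from nacl.public import PrivateKey, Box

# Each party generates a private key and shares only the public half.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts for Bob with her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
message = b"Meet at the usual place at eight."

# Under a client-side scanning mandate, the plaintext would be inspected
# here, before encryption (for example, matched against a server-supplied
# list), which is why critics say the mandate bypasses end-to-end
# encryption rather than coexisting with it:
# report_if_flagged(message)   # hypothetical scanner hook

ciphertext = sending_box.encrypt(message)

# Only Bob's private key, paired with Alice's public key, can open it.
receiving_box = Box(bob_private, alice_private.public_key)
assert receiving_box.decrypt(ciphertext) == message
```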
To take a simple example, if a despotic government wants to track down who is organizing protests or rallies using these apps, all it would need to do is adjust the filter to flag those organizing or attending such events, pull the associated device IDs, and find and arrest those individuals. The same applies to dissident journalists or activists who are using technology to further their own individual rights. This is not a far-fetched scenario; governments around the world already track messages in non-encrypted messaging software in exactly this way.
From a sovereignty perspective, this is an EU-wide mandate. How does this affect the ability of member states to craft their own balance between privacy, security, and individual freedom?
As with all EU regulations, there is some ability to adapt the actual text of the law as it applies in each member state, but only if the baseline stays the same. That means any country could easily amend the law and add categories beyond CSAM to be filtered for law enforcement: prostitution, drug trafficking, terrorism, violence, but also speech violations, and more. As free Europeans, we detest and abhor these activities, but we should also recognize that our authorities must use legal and lawful tools to build their cases and make arrests. Otherwise, we risk criminalizing ordinary people who are only sharing information.
Chat Control seems to undermine end-to-end encryption, a cornerstone of digital privacy. Do you think this initiative could set a precedent for further erosion of encryption in Europe and globally?
Absolutely. Encryption is not just some fringe implementation used in messaging. Every time we connect with an HTTPS website, log into our bank, pay our taxes online, or upload sensitive information to our doctors, we are using encrypted protocols to safeguard our information from prying eyes. The police and military use encrypted messaging on radios and written communications to keep soldiers and policemen safe while they do their jobs. Once Chat Control is implemented, it will mean that there is a legal process to undo those protections and mandate that all information be made available to authorities at all times. Not only does this defy the actual mathematical logic of encryption—which makes information unreadable to those without private keys—but it also creates a danger that this could easily be abused to the detriment of all of our rights and liberties.
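As a concrete illustration of how routine that protection is, the short sketch below uses only Python’s standard library to open an ordinary HTTPS (TLS) connection; example.org is a placeholder host, and the point is simply that the same encryption the regulation would weaken already secures everyday web traffic.

```python
# Illustrative sketch: the same cryptography protects ordinary web traffic.
# Standard library only; example.org is a placeholder host.
import socket
import ssl

context = ssl.create_default_context()  # verifies server certificates by default
with socket.create_connection(("example.org", 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname="example.org") as tls_sock:
        # The negotiated protocol and cipher suite are what keep banking,
        # tax filings, and medical uploads unreadable to eavesdroppers.
        print(tls_sock.version())  # e.g. 'TLSv1.3'
        print(tls_sock.cipher())   # (cipher name, protocol, secret bits)
```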
Critics warn of ‘function creep’—tools introduced for child protection later being used for political or commercial monitoring. How real is that danger, given Europe’s political and bureaucratic environment?
We can already see, through the EU’s Digital Services Act and Digital Markets Act, that regulations forcing compliance in the delivery of digital services are being justified for reasons that never seem to end. Whether it is protectionism or the policing of political narratives and communication, there will always be a reason for national governments to want to stop certain types of information from being discussed or widely shared. What’s more, every iteration of European regulation mandates onerous practices and reporting that inevitably make the experience worse for consumers who simply want to use technology.
The fear is not that the EU or any of its member states suddenly becomes totalitarian. The fear is that we allow our rights and liberties to be legislated away with our own assent, handing tools of oppression to anyone clever enough to use them.
We often hear that Europe is a champion of human rights and privacy protections, especially compared to other blocs and regions of the world. Does Chat Control represent a betrayal of that self-image?
Chat Control, as currently designed, betrays Europe’s self-professed image as a protector of human rights and privacy. Removing the ability for people to use math to communicate, which is essentially what encryption is, means that EU authorities want to put restraints on communication and free speech. Again, we do not need to break entire encryption protocols in order to have a society with law and order. We just need effective legal tools, which already exist, so that we can prosecute the actual people doing harm rather than ordinary citizens caught in the dragnet of mass surveillance.
Geopolitically, Europe is caught between American big tech dominance and Chinese state surveillance models. Do you see Chat Control pushing Europe closer to one of those extremes, and what alternatives should be pursued?
Forcing client-side scanning of encrypted messaging is more akin to China’s Great Firewall and state surveillance model, and that should give many Europeans pause. We can only imagine what the East German Stasi would have done with such rules had the technology existed in its time.
There is plenty of skepticism and hatred toward American Big Tech in Europe, but the fact remains that Europeans can today use encrypted messaging protocols from these same companies in order to freely communicate whatever they want. Even if it’s hatred of Big Tech! The genie of innovation and encryption cannot be put back in the bottle because it does serve to benefit hundreds of millions of people in Europe and billions more around the world. European authorities would have much more success if they better articulated how they intend to use their already legal and lawful tools to detect and deter crime. Forcing technology developers to dumb down their technology to make things easier for the police is not the sign of an innovative or forward-looking society.
Finally, if Chat Control is passed, what are the long-term consequences for Europe’s digital ecosystem? Could it drive talent, businesses, or even free thinkers out of Europe in search of more privacy-respecting jurisdictions?
The implementation of Chat Control would certainly be a death knell for many different innovators in Europe’s digital ecosystem. We have new artificial intelligence tools that actively use encryption to safeguard information and privacy for users, whether they be companies, governments, or individuals. Better messaging apps, and even cryptocurrency protocols, rely on encryption to relay information securely and safely. Introducing roadblocks, or road checks, means that any developer or service offering encryption will have to compromise at some point, making all of our interactions on the Internet much less secure and much more vulnerable to exploitation and hacking.
Europe must empower its citizens to use the fruits of technology, which have such great potential to increase our economic productivity and standard of living and to drive progress that will make people much better off. Taking a detour to force scanning that breaks encryption does none of this and, unfortunately, would leave Europe on an island of its own.
Published in European Conservative