EU to Decide on Requirement for Tech Firms to Scan Encrypted Messages

A law enforcement meeting scheduled for 12 September will determine whether technology companies, including Signal and WhatsApp, will be required to scan all encrypted messages and communications before they are transmitted. The Danish presidency of the EU Council is pushing for a vote on the proposals, dubbed "Chat Control," by 14 October.

More than 500 cryptographers and security researchers have signed an open letter warning that the proposals are technically infeasible and would "completely undermine" the security and privacy of all European citizens. The letter states that mass scanning of mobile phones and computers to identify suspected child abuse material sent over encrypted communications services is not technically feasible, and that the scanning capability it would require could be exploited by hackers and hostile nation-states.

WhatsApp has expressed concerns about the EU's draft proposals, stating that they "break end-to-end encryption and put everyone's privacy, freedom, and digital security at risk." The European Commission first proposed mandating tech companies to scan emails and messages for potential child abuse content in 2022, but the plans were put on hold due to opposition from member states.

The Danish presidency proposed a compromise in July 2025 that aims to strike a balance between maintaining the security of encrypted communications services and identifying potentially illegal content. The proposal asserts that nothing in the regulation should be "interpreted as prohibiting, weakening or circumventing" encryption, and expressly permits technology companies to continue offering end-to-end encrypted services.

However, the proposal would also require technology companies to introduce "vetted technologies" on phones and computers to scan messages for images, videos, or URLs associated with known child abuse content before they are encrypted and transmitted. Tech companies would further be required to deploy artificial intelligence (AI) and machine learning algorithms to detect previously unknown abuse images.
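
What is being described here is often called client-side scanning: content is checked on the user's device against a database of known material before end-to-end encryption is applied. The sketch below is a deliberately simplified illustration of that flow under stated assumptions, not the proposal's actual design; the hash list, function names and the use of exact SHA-256 digests are invented, and real deployments would rely on perceptual hashing and machine-learning classifiers rather than exact matching.

```python
import hashlib

# Hypothetical digest list for known illegal images that a "vetted
# technology" would ship to the device. Real deployments would use
# perceptual hashes (robust to resizing and re-encoding), not exact
# SHA-256 digests; that difference is ignored here for brevity.
KNOWN_CONTENT_DIGESTS: set[str] = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def matches_known_content(attachment: bytes) -> bool:
    """Check an outgoing attachment against the on-device hash list.

    In a client-side scanning design, a check like this runs on the
    sender's device before end-to-end encryption is applied, which is
    why critics describe it as scanning before transmission.
    """
    return hashlib.sha256(attachment).hexdigest() in KNOWN_CONTENT_DIGESTS

def prepare_for_sending(attachment: bytes) -> bytes:
    """Scan first, then hand the content to the (notional) E2EE layer."""
    if matches_known_content(attachment):
        print("flag: attachment matches the known-content database")
    # Placeholder for the real end-to-end encryption step (e.g. the
    # Signal protocol); the only point is that scanning happens first.
    return attachment[::-1]  # stand-in "ciphertext", not real encryption

if __name__ == "__main__":
    prepare_for_sending(b"example image bytes")
```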

As of 10 September, 15 member states supported the Danish proposals, while six were undecided and six opposed. The dissenting countries include Belgium, Poland, Finland, and the Czech Republic, which have raised concerns about mass surveillance of citizens' communications. Supporters include France, Italy, Spain, and Sweden.

Under the Council's qualified majority voting rules, a proposal passes only if at least 55% of member states vote in favour and together they represent at least 65% of the EU's population, which gives large countries more sway over the final decision. The debate surrounding Chat Control highlights the tension between maintaining security and protecting privacy in the digital age.
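
That double-majority arithmetic can be checked directly; in the sketch below, the population total and the example coalitions are rough, illustrative figures rather than the actual Chat Control tally.

```python
# Double-majority check used for qualified-majority voting in the EU
# Council: at least 55% of member states in favour, together covering
# at least 65% of the EU population. The figures passed in below are
# illustrative placeholders, not the actual Chat Control line-up.
EU_MEMBER_COUNT = 27
EU_POPULATION_MILLIONS = 450  # approximate total

def passes_qualified_majority(states_in_favour: int,
                              population_in_favour_millions: float) -> bool:
    return (states_in_favour / EU_MEMBER_COUNT >= 0.55
            and population_in_favour_millions / EU_POPULATION_MILLIONS >= 0.65)

# 15 supporting states is about 55.6% of members, so the outcome hinges
# on whether those states also cover 65% of the population.
print(passes_qualified_majority(15, 260))  # False: 260m is roughly 58%
print(passes_qualified_majority(15, 300))  # True: 300m is roughly 67%
```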

"The new proposals, similar to its predecessors, will create unprecedented capabilities for surveillance, control, and censorship, and have an inherent risk for function creep by less democratic regimes," said cryptographers and security researchers representing 30 countries in their open letter. They argue that the proposals would "completely undermine" the security and privacy of all European citizens.

The scientists warn that existing research shows state-of-the-art detectors yield unacceptably high false positive and false negative rates, making them unsuitable for detection campaigns at the scale of hundreds of millions of users. They also argue that using AI and machine learning to identify previously unknown abuse images is flawed, because such systems cannot accurately detect illegal content without producing large numbers of mistakes.
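
The error-rate objection is, at bottom, a base-rate argument: when the material being searched for is extremely rare, even an apparently accurate detector produces mostly false alarms. The figures in the sketch below (daily volume, prevalence, detector accuracy) are assumptions chosen for illustration and do not come from the researchers' letter.

```python
# Illustrative base-rate arithmetic; none of these figures come from
# the open letter. Even a detector that looks accurate produces mostly
# false alarms when the content it searches for is extremely rare.
daily_items_scanned = 10_000_000_000  # assumed items scanned per day
prevalence = 1e-6                     # assumed share that is actually illegal
true_positive_rate = 0.90             # assumed detector sensitivity
false_positive_rate = 0.005           # assumed 0.5% false-alarm rate

illegal_items = daily_items_scanned * prevalence
legal_items = daily_items_scanned - illegal_items

true_flags = illegal_items * true_positive_rate
false_flags = legal_items * false_positive_rate
precision = true_flags / (true_flags + false_flags)

print(f"correct flags per day: {true_flags:,.0f}")
print(f"false flags per day:   {false_flags:,.0f}")
print(f"share of flags that are genuinely illegal: {precision:.2%}")
```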

German encrypted email provider Tuta Mail has stated that if the EU's Chat Control proposals are adopted, it would take legal action against the EU rather than betray its users by introducing backdoors into its encrypted messaging service. CEO Matthias Pfau said the proposals would undermine trust in European technology.

Alexander Linton, president of the Session Technology Foundation, another encrypted messaging service, stated that it is not possible to introduce scanning without creating new security risks. He said that none of the technologies available achieve a standard where they do not introduce new unmitigable risks.

Matthew Hodgson, CEO of Element, a secure communications platform used by European governments, described the proposed Chat Control regulation as fundamentally flawed, saying it would put the privacy and data of 450 million citizens at risk. Undermining encryption by introducing a backdoor for lawful intercept, he said, is nothing other than deliberately introducing a vulnerability that always gets exploited in the end.

Signal has also expressed concerns about the proposal, stating that it would pull its messaging service out of the European Union rather than undermine its privacy guarantees. Callum Voge, director for government affairs and advocacy at the Internet Society, a non-profit organization, argued that client-side scanning creates opportunities for bad actors to reverse engineer and corrupt scanning databases on devices.

"If breaking encryption is like having the envelope ripped open while a letter goes through the Post Office, client-side scanning would be like someone reading over your shoulder as you write the letter," Voge said. He argued that even if AI scanning were 99.5% effective at identifying abuse, it would lead to billions of wrong identifications every day.

Voge emphasized that policymakers should prioritize approaches that protect children but also foster an open and trusted internet. "That means more resources spent on targeted approaches – things like court-authorised investigations, metadata analysis, cross-border cooperation, support for victims, prevention and media literacy training," he added.