An open letter signed by 270 scientists and researchers across 33 countries has raised major technical concerns about the EU’s proposed regulation mandating the scanning of messaging apps for child sexual abuse material (CSAM). The signatories argue the techniques are fundamentally flawed and will “completely undermine communications and systems security.”
“From a technical standpoint, to be effective, this new proposal will also completely undermine communications and systems security,” the letter states. “The proposal notably still fails to take into account decades of effort by researchers, industry, and policy makers to protect communications.”
Under the draft regulation, service providers would be required to scan for known CSAM, new CSAM, and grooming behaviour. While changes were made in March to make the orders more targeted and to protect encrypted data, the experts say these changes fail to address their main concerns about the scanning techniques and their impact on end-to-end encryption (E2EE).
A key issue raised is the high error rates of automated detection systems, which the experts say are “easy to circumvent by those who want to bypass detection, and they are prone to errors in classification.”
Even assuming a highly optimistic 0.1% false positive rate, the letter explains: “Given that WhatsApp users send 140 billion messages per day, even if only 1 in hundred would be a message tested by such detectors, there would be 1.4 million false positives every single day.”
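The letter’s arithmetic can be checked with a quick back-of-envelope calculation using the figures it cites (140 billion messages per day, 1 in 100 messages tested, a 0.1% false positive rate):

```python
# Back-of-envelope check of the letter's false-positive estimate,
# using only the figures cited in the letter itself.
messages_per_day = 140_000_000_000   # ~140 billion WhatsApp messages/day
scanned = messages_per_day // 100    # only 1 in 100 messages tested
false_positives = scanned // 1000    # 0.1% false positive rate

print(false_positives)  # 1400000 false positives per day
```

Note that 0.1% is already far better than published accuracy figures for this kind of classifier; a realistic error rate would push the daily false-positive count higher still.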
Attempting to reduce these errors by requiring multiple detections could make the system ineffective at catching CSAM, the experts warn: “The number of false positives due to detection errors is highly unlikely to be significantly reduced unless the number of repetitions is so large that the detection stops being effective.”
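The tradeoff the experts describe can be sketched with a simple model. This is an illustration only, not the letter’s own analysis: it assumes independent detections and invented per-message rates (the 0.1% false positive rate comes from the letter; the 80% true positive rate is purely hypothetical):

```python
# Hypothetical illustration of the repetition tradeoff: if a report is
# only filed after k independent positive detections, the false-report
# rate falls to fp_rate**k, but the chance of catching real material
# falls to tp_rate**k as well. Both rates below are illustrative.
fp_rate = 0.001  # optimistic per-message false positive rate (from the letter)
tp_rate = 0.80   # assumed per-message true positive rate (hypothetical)

for k in (1, 2, 5, 10):
    print(f"k={k:2d}  false-report rate={fp_rate**k:.1e}  "
          f"catch rate={tp_rate**k:.3f}")
```

Under these assumed numbers, requiring ten repetitions drives false reports toward zero, but the catch rate also collapses to about 11% — the detection “stops being effective,” as the letter puts it.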
The letter states that client-side scanning required for targeted orders is incompatible with E2EE, which is designed to ensure only the communicating parties can access content.
“The protection given by end-to-end encryption implies that no one other than the intended recipient of a communication should be able to learn any information about the content of such communication. Enabling detection capabilities, whether for encrypted data or for data before it is encrypted, violates the very definition of confidentiality provided by end-to-end encryption.”
The experts call for the regulation to be halted until proper technical consultation is done on what’s feasible while preserving secure communications.
“We strongly recommend that not only should this proposal not move forward, but that before such a proposal is presented in future, the proposers engage in serious conversations about what can and cannot be done within the context of guaranteeing secure communications for society.”
Instead of relying on flawed scanning technology, the letter recommends proven approaches such as education, reporting hotlines, and improved moderation, including prioritising educational content in search rankings and working through platform partnerships.
“We recommend substantial increases in investment and effort to support existing proven approaches to eradicate abuse, and with it, abusive material. Such approaches stand in contrast to the current techno-solutionist proposal, which is focused on vacuuming up abusive material from the internet at the cost of communication security.”
In July 2023, a previous open letter signed by 465 academics cautioned that the detection technologies the proposed legislation would require platforms to implement are fundamentally flawed and easily bypassed. That letter likewise warned that mandating these technologies would severely undermine the crucial security safeguards provided by E2EE communications services.