Detection Orders, Encryption and Proportionality: The CSAR Proposal and EU Fundamental Rights Law
Written by Andzhelin Feodorova
Edited by Thomas Landerretche
Andzhelin is an undergraduate student at Sciences Po Paris Le Havre Campus, in the Social Sciences track, with a Politics and Government major. Her academic interests include digital regulation, technology governance, and regulatory compliance policies.
Introduction
In May 2022, the European Commission introduced a draft regulation known as the Child Sexual Abuse Regulation (CSAR), a legislative proposal intended to enhance the detection and reporting of online child sexual abuse material (CSAM) and to prevent child harm in the digital space. Among its provisions, Articles 7 to 10 establish a framework under which competent authorities may issue detection orders, following a risk assessment, requiring certain providers (including interpersonal communications services) to deploy detection measures for specified purposes, including known and unknown CSAM and the solicitation and grooming of children.
While child protection is a legitimate and vital interest under Article 24(2) of the Charter of Fundamental Rights of the European Union (CFR), the CSAR raises significant legal questions concerning the rights to privacy and data protection enshrined in Articles 7 and 8 CFR and Article 8 of the European Convention on Human Rights (ECHR). This paper argues that the detection obligations introduced by Articles 7 to 10 of the CSAR proposal fail to meet the proportionality requirements under Article 52(1) CFR and the relevant jurisprudence of the Court of Justice of the European Union (CJEU) and the European Court of Human Rights (ECtHR). In particular, the proposal lacks the legal certainty required under EU fundamental rights law, may not be considered strictly necessary in light of less intrusive alternatives, and risks undermining the essence of confidential communications protected under Articles 7 and 8 CFR.
Legal Framework
The Charter, the ECHR, and the GDPR
Article 7 CFR guarantees the right to respect for private and family life, while Article 8 CFR protects personal data. Similar protections are reflected in Article 8 ECHR, which permits interference only when it is “in accordance with the law” and “necessary in a democratic society.”
In addition, the General Data Protection Regulation (GDPR) mandates that data processing strictly adhere to the principles of lawfulness, necessity, proportionality, and data minimization (Articles 5, 6, and 25 GDPR), requiring privacy by design and data protection impact assessments where high-risk processing is involved.
The Proportionality Test Under Article 52(1) CFR
Under Article 52(1) CFR, any limitation on fundamental rights must, first, be provided for by law; second, pursue an objective of general interest; third, be necessary, in the sense that no less intrusive alternative exists; and finally, be proportionate in the strict sense, meaning that its benefits outweigh the harm and that the essence of the right is preserved.
This framework has been clarified in key judgments of the CJEU, including Digital Rights Ireland (C-293/12), Tele2 Sverige (C-203/15), La Quadrature du Net (C-511/18), and by the ECtHR in Big Brother Watch and Others v. the United Kingdom (App no. 58170/13). These cases will provide the lens through which the CSAR’s Articles 7 to 10 are examined.
Applying the Proportionality Test to the CSAR Proposal
Legality and Legal Certainty
While CSAR is formally grounded in EU legislative authority and pursues a recognized general interest, the legal clarity of its obligations is in question. Article 52(1) CFR requires that limitations be “provided for by law,” which the CJEU has interpreted to mean clear, precise and foreseeable in their effects.
In La Quadrature du Net, the Court held:
“Legislation must define the nature, scope and conditions of data retention in a clear and precise manner, providing adequate safeguards against misuse” (para. 132).
The detection obligations in Articles 7 to 10 of the CSAR, however, leave several elements undefined: the exact scope of the “interpersonal communications” subject to scanning, the criteria for issuing detection orders, and the technologies to be employed, particularly in end-to-end encrypted environments.
In addition, the joint opinion of the European Data Protection Board (EDPB) and European Data Protection Supervisor (EDPS) furthers this concern by stating: “The Proposal lacks clarity on key elements… leading to legal uncertainty on how to balance the rights at stake in each individual case.”
Thus, although the CSAR may satisfy the requirement of legality in form, it falls short of the foreseeability and precision required under EU and ECHR jurisprudence.
Necessity
To pass the necessity prong, a measure must employ the least rights-restrictive means of achieving its objective. In Digital Rights Ireland, the CJEU stated:
“An interference with the fundamental rights… must be precisely circumscribed and strictly necessary” (para. 52).
In the Tele2 Sverige case, the Court held that general and indiscriminate retention of traffic and location data is unlawful:
“Article 15(1) of Directive 2002/58, read in the light of Articles 7, 8 and 11 of the Charter, must be interpreted as precluding national legislation which provides for general and indiscriminate retention of traffic and location data.” (para 117)
The CSAR obliges service providers to scan user communications regardless of suspicion. This scanning extends to encrypted messages and includes the detection of both known and unknown CSAM and of grooming behavior, tasks that rely on probabilistic machine learning models with non-negligible error rates.
The proposal does not explain why targeted detection based on judicial warrants, enhanced reporting obligations, or voluntary reporting frameworks would not suffice. This absence of explanation undermines its compliance with the necessity standard, particularly where less intrusive alternatives appear available.
Proportionality in the Strict Sense
Even if necessary, a measure must not impose burdens so severe that they outweigh its benefits or compromise the essence of the right. In Big Brother Watch, the ECtHR emphasized:
“A bulk interception regime must be subject to independent authorisation, have clear rules, and include measures to ensure the protection of confidential information” (para. 430).
The proposal requires detection orders to be authorised or issued by a judicial authority or an independent administrative authority, though doubts remain about whether these safeguards and limits are sufficient in practice. The tools proposed include error-prone classifiers trained to detect unknown material and behavioral patterns, raising concerns about false positives and overreach.
According to Brkan:
“An interference that compromises the very essence of a right cannot be justified, regardless of proportionality.”
This is especially significant for encrypted services, where detection mechanisms may require weakening or circumventing encryption, a practice that risks destroying the core guarantee of confidentiality in communication.
General Surveillance and Legal Precedent
The CJEU has consistently rejected generalized surveillance. In La Quadrature du Net, the Court ruled:
“General and indiscriminate retention… cannot be considered justified within a democratic society” (para. 141).
Although the CSAR’s aim differs from national security, its method of broad, untargeted scanning of private messages mirrors surveillance regimes the Court has already struck down. The structural similarity is legally significant: the regulation could authorize state-mandated surveillance without prior suspicion, undermining the principle of proportionality.
As Czarnocki argues:
“Balancing conflicting fundamental rights does not mean ranking them hierarchically but constructing governance that respects both.”
This logic implies that the CSAR, by favoring protection over privacy without structural safeguards, disrupts the equilibrium envisioned by EU law.
Conflicting Rights: Child Protection vs. Privacy
While Article 24(2) CFR requires that children’s best interests be a primary consideration, this does not justify overriding other Charter rights. As Digital Rights Ireland affirmed, even the “importance of the objective pursued… cannot in itself justify” excessive interference (para. 69).
Weinrib further observes:
“The essence of a right is violated when state action destroys its conceptual core.”
The default scanning of private communications, including encrypted ones, risks violating that core, particularly when done without suspicion or oversight.
Conclusion
The CSAR proposal addresses a real and pressing concern: protecting children from online abuse. Yet Articles 7 to 10 impose detection obligations that, when measured against the proportionality framework of Article 52(1) CFR and the jurisprudence of the CJEU and ECtHR, raise serious legal objections.
These provisions lack the legal clarity required under EU law, impose scanning obligations with insufficient justification of their necessity, operate without essential safeguards such as judicial authorisation, and risk undermining the essence of the right to privacy.
To ensure the CSAR’s compliance with EU fundamental rights law, the proposal would need to include narrowly defined scanning obligations, mandatory prior judicial review, technology-neutral provisions that respect encryption, and targeted, evidence-based detection procedures.
As the CJEU concluded in Digital Rights Ireland:
“The mere importance of the aim pursued… cannot justify such serious interference” (para. 69).
The proposal must be revised so that it complies with the principle of proportionality and safeguards the constitutional framework of privacy rights in the EU legal order.
Bibliography
Brkan, Maja. “In Search of the Concept of Essence of EU Fundamental Rights Through the Prism of Data Privacy.” SSRN Electronic Journal, 2017.
Court of Justice of the European Union. Digital Rights Ireland Ltd v Minister for Communications, C-293/12, April 8, 2014.
Court of Justice of the European Union. La Quadrature du Net v Premier ministre, C-511/18, October 6, 2020.
Court of Justice of the European Union. Tele2 Sverige AB v Post- och telestyrelsen, C-203/15, December 21, 2016.
Czarnocki, Jakub. “Between Rights, Interests, and Risk – The Role of Proportionality Balancing in the EU Digital Law.” SSRN, 2024.
European Court of Human Rights. Big Brother Watch and Others v. The United Kingdom, App. No. 58170/13, May 25, 2021.
European Data Protection Board and European Data Protection Supervisor. Joint Opinion 04/2022 on the Proposal for a Regulation Laying Down Rules to Prevent and Combat Child Sexual Abuse. Brussels, July 28, 2022.
Weinrib, Jacob. “The Essence of Rights and the Limits of Proportionality.” SSRN, 2023.
