
On 11 May 2022, the European Commission proposed the Regulation Establishing Rules for Preventing and Combating Child Sexual Abuse, both online and offline, with the aim of creating a solid legal framework for the protection of children and of facilitating a coordinated approach among the many actors involved in protecting and supporting them. The proposal, which is still under discussion in the Council of the European Union, provides for strict measures for the detection and reporting of child sexual abuse material by online platforms operating within the EU. Despite the objectives set by the Commission, the text has drawn numerous criticisms from those who believe that the proposed rules and coercive measures would undermine the fundamental right to privacy, so much so that critics have nicknamed the proposal “Chat Control”.
The technical details of the proposed Regulation
The proposal would require major online messaging platforms, such as “WhatsApp”, “Telegram”, “Signal” and “Messenger”, to implement systems that scan multimedia content on users’ devices so that any child sexual abuse material can be identified before it is encrypted. In the event of a positive match, as outlined in Articles 14 and 16 of the proposed Regulation, service providers could alert the competent authority, which may then issue a removal order. Scanning multimedia files on the device, before they are encrypted and transmitted to the servers of the online messaging platforms, is intended to leave the mechanics of “end-to-end” encryption formally intact. In practice, however, service providers would gain access to the content of all users’ messages, turning their devices into “security agents” and thereby jeopardising users’ privacy and, ultimately, the security of their data.
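As a purely illustrative sketch, the client-side scanning flow described above can be expressed in a few lines of code. The database, function names and the use of a cryptographic hash are hypothetical simplifications introduced here for clarity; real deployments would use perceptual hashing (e.g., PhotoDNA-style technologies) rather than exact cryptographic hashes:

```python
import hashlib
from typing import Callable, Optional

# Hypothetical database of hash values of known illegal material.
# (For the sketch it contains the SHA-256 of the bytes b"test".)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_before_encryption(media: bytes) -> bool:
    """Return True if the media matches the known-content database.

    The point of client-side scanning is that this check runs on the
    user's device *before* the content enters the end-to-end
    encrypted channel.
    """
    digest = hashlib.sha256(media).hexdigest()
    return digest in KNOWN_HASHES

def send_message(media: bytes, encrypt: Callable[[bytes], bytes]) -> Optional[bytes]:
    """Encrypt and send only content that does not match the database."""
    if scan_before_encryption(media):
        # Under the proposal, a positive match would instead trigger
        # reporting to the competent authority (Articles 14 and 16).
        return None
    return encrypt(media)
```

The sketch makes the privacy objection concrete: whatever matching technology is used, the scanning step necessarily inspects every piece of content in unencrypted form on the user’s device.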
Legal and ethical issues with the proposal
The worries arising from the possible implementation of these measures were highlighted by the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS), which delivered a joint opinion in July 2022 expressing significant concerns about the necessity and proportionality of the measures envisaged in the proposal. The two bodies observed that compliance with the principle of necessity entails “a fact-based assessment of the effectiveness of the envisaged measures for achieving the objective pursued and of whether it is less intrusive than other options for achieving the same goal … Another factor to be considered … is the effectiveness of existing measures over and above the proposed one”. The tools indicated in the proposal for combating online child sexual abuse can be easily circumvented by encrypting the content before posting or uploading it, including through separate mobile applications. Moreover, the detection measures, divided into three types of detection orders for different kinds of material (known child sexual abuse material, previously unknown material, and grooming), rely on technologies with relatively high error rates. In detail:
- technologies to detect known child sexual abuse material (CSAM) are usually matching technologies, in the sense that they rely on an existing database of known CSAM against which they compare images (including stills from videos). To enable the matching, both the images the provider is processing and the images in the database must be converted into a digital representation, typically hash values. This kind of hashing technology has an estimated false positive rate of no more than 1 in 50 billion (i.e., a 0.000000002% false positive rate);
- for the detection of new CSAM, a different type of technology is typically used, including classifiers and artificial intelligence (AI). Their error rates, however, are significantly higher: precision is reported at 99.9% (i.e., a 0.1% false positive rate);
- lastly, for the detection of grooming, the existing technologies have an “accuracy rate” of 88%.
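The practical weight of these error rates can be illustrated with some simple base-rate arithmetic. The daily message volume used below is a purely hypothetical assumption introduced for the example, not a figure from the proposal or the joint opinion, and the grooming figure treats the 12% remainder of the stated 88% accuracy as an approximate error rate:

```python
# Hypothetical EU-wide volume of scanned messages per day (assumption).
MESSAGES_PER_DAY = 10_000_000_000

# False-flag rates taken from the three detection types cited above.
detectors = {
    "known CSAM (hashing)":   1 / 50_000_000_000,  # 1 in 50 billion
    "new CSAM (classifiers)": 0.001,               # 0.1% false positives
    "grooming detection":     0.12,                # 100% - 88% accuracy
}

for name, rate in detectors.items():
    expected_false_flags = MESSAGES_PER_DAY * rate
    print(f"{name}: ~{expected_false_flags:,.1f} false flags per day")
```

Under these assumptions, hash matching would misflag a fraction of a message per day, while classifier-based detection would generate millions of false flags daily, which is the disproportion the EDPB and EDPS point to.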
Thus, given the over-invasiveness of the measures and the uncertainty of their results, the EDPB and EDPS “consider that the interference created in particular by the measures for the detection of solicitation of children goes beyond what is strictly necessary and proportionate. These measures should therefore be removed from the Proposal”. This context has led the EDPB and EDPS to raise doubts about the balance struck by the proposal, especially regarding data protection and privacy. Service providers would be able to monitor users’ entire conversations, with practically unlimited monitoring power, and would gain access to a huge amount of personal data, with worrying implications for privacy and confidentiality.
Conclusions
The proposal is currently under negotiation between the European Parliament and the Council of the EU, and its text has been amended several times in response to concerns from some Member States; the Council vote scheduled for last June was cancelled for fear that dissent from several Member States would prevent the qualified majority required for approval. The issues raised by the proposal’s implementation could be addressed through several alternatives, including: enhancing international collaboration between Member States’ law enforcement agencies, establishing a collaborative approach among online messaging platforms, implementing voluntary reporting and parental control systems, and using AI systems to analyse large amounts of data and identify behavioural patterns, trends and correlations.
Lawyer Rossella Bucca and Dr. Lorenzo Maione