12/7/2021

Independent experts’ report for CoE: Automated technology to detect online child sexual abuse must respect human rights

An independent experts’ report for the Council of Europe says that the technology used to detect the online sexual abuse of children must respect human rights and the rule of law. The scale of online child sexual exploitation and abuse is increasing at an alarming rate.

According to the Internet Organised Crime Threat Assessment, detection of online child sexual abuse materials (CSAM) has already been increasing year on year and saw a sharp spike during the peak of the COVID-19 crisis. Such abuse has long-lasting effects on a child’s life, especially because of the continued circulation of the images. While it is vital to find ways to identify and rescue child victims, investigate crimes and stop the circulation of CSAM, the use of automated technology may impact the confidentiality of the content and related traffic data, which service providers must ensure. It is therefore essential that the detection of online child sexual exploitation and abuse is conducted in a manner that is fully human rights-compliant and respects children’s right to privacy.

The report was prepared by a group of experts, led by former European Court of Human Rights President Linos-Alexandre Sicilianos, for the Lanzarote Committee of the Parties to the Council of Europe Convention on Protection of Children against Sexual Exploitation and Sexual Abuse. Drawing on a wide range of Council of Europe standards, it aims to help policymakers develop a comprehensive and balanced approach to the use of automated technologies to detect child sexual abuse material. It also contains a series of recommendations and calls for the establishment of a “public interest-based framework”, grounded in the Lanzarote Convention and other Council of Europe conventions, enabling service providers to automatically detect, remove and report relevant content in line with data protection and privacy safeguards.