Child protection: The duty of platform providers
In the current discussion on the EU Commission's proposed Child Sexual Abuse (CSA) regulation, the German government's Independent Commissioner for Child Sexual Abuse Issues, Kerstin Claus, has made clear that she believes two sets of legal interests - child protection and data protection - must be taken equally seriously and not played off against each other. Keeping children and young people safe online is a duty of Internet service providers, she says.
by Kerstin Claus, German version published on 31.05.2023
Digital media are an integral part of the lives of children and young people, serving as sources of information and as spaces for social, romantic and sexual orientation and experience. At the same time, they bring with them various potential hazards: these range from cybergrooming, i.e. the targeted approaching of minors on the Internet with the intention of initiating sexualised contact, and the publication of nude images without the consent or against the will of minors, to sexual violence against and exploitation of minors, images of which may then be disseminated on the Internet and monetised.
The importance and necessity of rules to protect children in the digital space are also shown by a study commissioned by the Media Authority of North Rhine-Westphalia, for which more than 2,000 children and young people between the ages of eight and 18 were surveyed. Just under a quarter of them said they had already encountered an adult online who then asked to meet them in person. These figures show clearly how high the threat potential on the Internet is for children and young people.
Providers must be obliged to protect
Against the backdrop of these figures, as the German Abuse Commissioner I fundamentally consider the EU Commission's draft regulation to combat child sexual abuse on the Internet to be right and important - despite the ongoing heated debates. The Commission is focusing on the digital environment, where children and young people have been largely unprotected up to now. It is a paradox that in the real world we specify which spaces children and young people may use and from what age, yet leave them unprotected in the digital world - without age verification, without support, and without sufficient digital literacy.
In essence, the CSA (Child Sexual Abuse) regulation seeks to improve Internet service providers' platforms by following a child protection approach and taking into account the perspective of underage users. This is an important step towards better child protection on the Internet. The CSA regulation clarifies that providers have a duty of care to ensure that their services keep underage users safe. They must therefore carry out risk assessments of their services and take appropriate protective measures. The basic principle is that children and young people cannot be left to deal with these risks on their own. We must provide safe spaces for them, which is of particular importance in digital environments. This fundamental protective concept made its way into the 2021 amendment of the German Youth Protection Act - and is now also reflected in the EU's Digital Services Act.
Don’t play child protection and data protection off against each other
Current criticism of the EU Commission's proposal focuses primarily on the question of whether, and to what extent, it is permissible to interfere with privacy or the right to confidential communication in order to protect children and young people. This debate is enormously important, because it involves the difficult weighing of fundamental rights. But the political objective must be to achieve the highest possible degree of child protection online while respecting both the right to privacy in communication and the legal right of children and young people to protection from sexualised violence on the Internet.
Some of the measures in the draft regulation are very far-reaching, and they are being reviewed and will probably be adjusted where necessary as the legislative process continues. However, this should not lead to the rejection of the draft as a whole. A detailed assessment and weighing of interests are necessary but, unfortunately, not all arguments are given the same attention. Too often, the protection of children and adolescents from violence is not treated as a fundamental right that also has to be respected - and, unfortunately, it is often not understood as one. Instead, the focus is on the fundamental right to privacy and confidential communication. Yet this right also comprises the right of children and young people not to have images of them posted on the Internet against their will. Children, too, have a right to privacy and confidential communication, and these rights must be respected as well.
Specific, legally compliant measures required
Above all, the proposed CSA regulation is an effective instrument for combating digital sexual violence against children and young people because it sets out three important core tasks: notice and take-down of already known child sexual abuse material, detection and deletion of new material, and detection of cybergrooming strategies.
These tasks require different approaches as well as the smart deployment of trained professionals and technical tools, both on the side of providers and on the side of law enforcement. All of these measures must be in accordance with other rights under EU law.
The European Commission's proposal therefore provides for a multi-stage procedure: providers will first carry out a risk assessment of their own services, identify the risks, and then employ instruments to mitigate or avoid them. This can be achieved, for example, through the use of moderators, age labels, precautionary default settings or the creation of safe spaces for children. Further measures shall and may only be taken when providers receive an explicit legal order, for example to detect known child sexual abuse material on the basis of hash values. Material found in this way must then be reported to the proposed EU Centre, which investigates the case. Finally, a judicial order or its equivalent is needed before the material is deleted, keeping in mind the rights and protections of all parties involved. The shorthand term "chat control without cause" does not do justice to this multi-stage procedure and polarises the debate in only one direction.
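To make the hash-based stage more concrete, here is a minimal sketch in Python of how matching against known material can work in principle. The hash list and file paths are hypothetical, and SHA-256 is used purely for illustration: production systems rely on perceptual hashes such as Microsoft's PhotoDNA, which also match re-encoded or slightly altered copies, whereas a cryptographic hash only matches bit-identical files.

import hashlib
from pathlib import Path

# Hypothetical list of hash values of already known abuse material,
# as it might be supplied to a provider under a detection order.
# Real systems use perceptual hashes (e.g. PhotoDNA) rather than
# cryptographic ones, so that re-encoded copies still match.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_material(path: Path) -> bool:
    """Check an uploaded file against the known-hash list."""
    return sha256_of_file(path) in KNOWN_HASHES

Notably, in the proposed multi-stage procedure a match on such a list would not lead to automatic deletion: it would trigger a report to the EU Centre for review, with deletion only following a judicial order or its equivalent.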
The objective must be to detect and prevent abuse as early as possible. If, for example, the use of artificial intelligence (AI) can help us in the future to find previously unknown abuse material and to detect the solicitation of minors by perpetrators at an early stage, we should also assess and discuss the deployment of such technologies. Progress is made continuously, especially in the field of AI. Regulations negotiated on the basis of today's state of knowledge must therefore be conceived comprehensively and in a technology-neutral way, so that we are not limited tomorrow to prosecuting global offender structures in the digital environment with yesterday's technology. Given the immense pace of progress in artificial intelligence, that would be disastrous.
What we need is a debate about which measures and ideas are best suited to achieving the highest possible level of child protection in the digital environment, while at the same time taking into account the protection of privacy in communications.
Kerstin Claus has been the Independent Commissioner for Child Sexual Abuse Issues (UBSKM) since April 2022, a position to which she was appointed by the German Government. She has been active for many years, both professionally and in a voluntary capacity, in combating sexual violence against children and young people.