FOCUS
-
Published 28.08.23
Cyber Hate Summit 2023
Torsten Krause, SDC
This annual conference of the International Network Against Cyber Hate (INACH) will take place in Malaga, Spain, on October 5. Under the motto "Cyber Hate Summit - Connecting to Build Bridges", the conference will discuss the impact of the Digital Services Act as well as developments outside the European Union. The role of artificial intelligence and algorithmic transparency in the field of digital hate speech will be a central focus of the conference, which will be held in English. In addition to exchanges with numerous international actors, information booths will offer the opportunity to network with international organizations and initiatives. The full program of the event, further information, and registration can be accessed via the INACH event page.
-
Published 23.08.23
Enforcing children's rights in digital services
Torsten Krause, SDC
The German Federal Ministry for Digital and Transport has presented a bill for the national implementation of the Digital Services Act and asked the federal states, local authority umbrella organizations, and specialist groups and associations to comment on it. The Digital Opportunities Foundation participated in this process and submitted a position statement. The position focuses on children's rights as defined in the UN Convention on the Rights of the Child and General Comment No. 25 on the rights of children in the digital environment and the resulting concerns of children and young people, and draws on existing expertise from the project "Child Protection and Children's Rights in the Digital World".
The Digital Opportunities Foundation expressly welcomes the planned establishment of an independent authority for the enforcement of children's rights in digital services at the Federal Agency for the Protection of Children and Young People in the Media in Bonn, as well as the intended establishment of an advisory board with 16 representatives from various areas of society, which is to advise and accompany the work of the Digital Services Coordinator. The Foundation points out that it would be useful to appoint at least one expert representative for the area of children's rights in the digital environment to this advisory board. In addition, it notes that the anticipated elimination, in the course of implementing the Digital Services Act, of opportunities for children and young people to participate under the Youth Protection Act will have to be compensated for by new forms and options for youth participation in the protection of minors in the media, in order to continue to meet the state's obligation under Article 12 of the UN Convention on the Rights of the Child.
You will find the detailed argumentation in German here.
-
Published 26.07.23
Eurobarometer survey shows that a majority of EU citizens support the draft regulation on preventing and combating child sexual abuse online
European Union
Introduction
On 11 May 2023, the European Commission proposed a Regulation laying down rules to prevent and combat child sexual abuse. With 85 million pictures and videos depicting child sexual abuse reported worldwide in 2021 alone, and many more going unreported, child sexual abuse is pervasive. The current system based on voluntary detection by companies has proven to be insufficient to adequately protect children and, in any case, will no longer be possible once the interim solution currently in place expires. In particular, providers falling within the scope of the ePrivacy Directive will have no EU legal basis to keep detecting child sexual abuse on a voluntary basis after August 2024.
The proposed Regulation is, first and foremost, about prevention of child sexual abuse. Providers would be required to assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards. Detection constitutes a last resort measure under the proposal, and it is only necessary where preventive measures fail. Interpersonal communication services (chat, messages), as well as others (gaming services, other hosting and online service providers), would be required to detect online child sexual abuse material (both known and new content) and activities related to the solicitation of children, known as grooming.
To find out what EU citizens think about the proposed new EU legislation, a Flash Eurobarometer survey was conducted between 28 June and 4 July 2023. On behalf of the European Commission, Directorate-General Migration and Home Affairs, Ipsos European Public Affairs interviewed a representative sample of citizens, aged 18 and over, in each of the 27 Member States of the European Union. More than 26 000 interviews were conducted online (via computer assisted web interviews). Data presented in this summary are weighted to known population proportions and the EU27 averages account for the size of the 18+ population of each EU Member State. Totals mentioned in the text are calculated from rounded percentages as shown in the charts.
Key Findings
Increasing risks for children online
Across all Member States, 92% of respondents 'strongly' or 'rather agree' that children are increasingly at risk online. At the individual country level, the level of agreement varies between 86% in Latvia and 96% in Croatia. Additionally, 73% of respondents across the EU reply that the problem of child sexual abuse in their country is 'very' or 'fairly widespread'. There is, however, a large variation across the Member States for this question (from 37% in Latvia to 86% in Greece).
Support for the proposed EU legislation to prevent and combat child sexual abuse
It was also explained to respondents that online service providers (e.g. social media platforms) can currently use several safety measures, including a combination of automated technology tools and human oversight, to detect and report sexual abuse of children, helping to rescue victims and bring perpetrators to justice. On 3 August 2024, however, the EU law that allows online service providers to voluntarily detect and report online child sexual abuse will expire. As such, new legislation is proposed that would oblige online service providers to prevent child sexual abuse from happening on their services. If prevention fails, and in case of significant risk of child sexual abuse, the service provider could be temporarily obliged to detect and report online child sexual abuse.
On average, 78% of respondents reply that they 'strongly support' or 'tend to support' the law proposed by the EU. In contrast, 13% 'tend to oppose' or 'strongly oppose' the proposed EU law. The total level of support varies between 65% in Cyprus and 83% in Czechia and Luxembourg. In Czechia, however, the level of 'strong support' is lower than in Luxembourg (45% vs 64% respectively). Differences in 'strong support' are also seen across socio-demographic groups. For example, across all age groups, a vast majority at least 'tend to support' the EU proposed law; however, the level of 'strong support' is 41% for those aged 18 to 24 and increases to 62% for those aged 55 and over.
Detecting child abuse and the right to online privacy
Across the EU, 60% of respondents reply that the statement 'the ability to detect child abuse is more important than the right to online privacy' is closest to their own view; at the individual country level, this proportion is the highest in Italy (72%) and the lowest in Hungary (35%).
The statement that the right to online privacy and the ability to detect child abuse are both equally important is selected by 36% of respondents across the EU and varies between 24% in Italy and 61% in Hungary.
In total, 96% of respondents state that the ability to detect child abuse is as important as, or more important than, the right to online privacy. Only a handful of respondents (2%) reply that the right to online privacy is more important than the ability to detect child abuse.
Tools to detect child sexual abuse online
The tools that online service providers can use to detect child sexual abuse online may interfere to varying extents with users' privacy. Even after it was explained to respondents that the tools used by online service providers may have an impact on privacy, 89% of respondents 'strongly' or 'tend to support' the use of tools that automatically detect images and videos of child sexual abuse material already known to the police, in order to identify where these images and videos are shared online again. Total support for the automatic detection of known child sexual abuse material ranges from 81% of respondents in Czechia to 95% in Portugal.
When asked about tools based on artificial intelligence (AI), 85% of respondents across the EU 'strongly support' or 'tend to support' the use of these tools, even if they may interfere with the privacy of users, for detecting new sexual abuse material images and videos shared online and 84% 'strongly support' or 'tend to support' the use of these tools for detecting grooming and/or imminent abuse.
Support for the use of AI tools to detect new sexual abuse material images and videos shared online varies between 77% in Czechia and 93% in Portugal. Similarly, support for the use of AI tools to detect grooming and/or imminent abuse varies between 78% in Slovakia and 94% in Portugal.
Taking into account that 70% of the 1.5 million reports of child sexual abuse online stemming from the EU come from online messages, email and chat, 87% of respondents across the EU 'strongly support' or 'tend to support' that service providers detect child sexual abuse material and grooming conversations in messages (e.g. e-mail, chat) in case of a significant risk of child sexual abuse on a specific platform. At the individual country level, support varies between 79% in Latvia and 93% in Greece, Portugal and Romania.
When asked about detecting child sexual abuse material and grooming in messages using end-to-end encryption, in case of a significant risk of child sexual abuse on a specific platform, 83% of respondents across the EU 'strongly support' or 'tend to support' this. Respondents in Romania (92%) and Portugal (91%) are the most likely to support detecting child sexual abuse material and grooming in messages using end-to-end encryption, while respondents in Hungary and Latvia are the least likely to do so (both 73%).
A summary of the study can be downloaded here and the full pdf with graphics can be downloaded here.
-
Published 20.07.23
Crafting of the Code of Conduct on age-appropriate design kicks off
News article, European Commission
On July 13, 2023, the Special Group on the Code of Conduct for age-appropriate design convened for its first meeting, an important step under the Better Internet for Kids strategy (BIK+).
One of the key actions for the Commission under the BIK+ strategy focuses on helping to implement legislation by drafting a comprehensive code of conduct on age-appropriate design for industry to sign up to. The Code of Conduct for age-appropriate design, envisioned as a collaborative effort, is set to play a pivotal role in shaping how industry treats its youngest users.
Comprising representatives from industry (including via trade associations), academia, and civil society, the Special Group is made up of 21 members selected following a call for expression of interest. On behalf of the Commission, DG Connect will chair the Special Group and ensure secretarial services via the Better Internet for Kids+ Platform contractor. The Group's collaborative efforts are expected to create a balanced and comprehensive framework that encourages responsible behaviour from all parties involved with children in the digital sphere.
With the ever-expanding reach and influence of online services, ensuring the well-being and safety of young users is a priority, and the timing for crafting the Code is very opportune. The Digital Services Act (DSA) implementation is starting, after the Regulation entered into force in November 2022. The first 17 very large online platforms (“VLOPs”) and 2 very large search engines (“VLOSEs”) were designated at the end of April. Following their designation, the companies have four months to comply with the full set of new obligations under the DSA. The DSA makes significant changes to the digital landscape, and among many new obligations, all online platforms accessible to minors must ensure a high level of privacy, safety and security for minors on their service. It also introduces stricter penalties for non-compliance and empowers authorities - at European and at national level - to take action against systemic risks posed by digital services.
The Code's primary focus will be to build upon and support the implementation of the DSA, specifically emphasizing provisions dedicated to safeguarding minors. It will also actively contribute to the Audio-Visual Media Services Directive (AVMSD) and be in line with the General Data Protection Regulation (GDPR).
Key Performance Indicators (KPIs) and robust monitoring will be inherent elements of the Code. However, it is important to note that while the Code builds on the provisions of the DSA, it will not affect enforcement of the DSA’s obligations.
-
Published 19.07.23
International debate on the European Commission's proposal to regulate child sexual abuse (CSA) online
Marlene Fasolt, SDC
An international group of scientists is speaking out against the European Commission's draft regulation on preventing and combating child sexual abuse on the Internet. Their joint statement, which is addressed to the Members of the European Parliament and the Member States, has been published on various portals.
In a publication, the head of the IT department of the Canadian Centre for Child Protection, Lloyd Richardson, rebuts central arguments of the researchers, calling them misleading and inaccurate. The article can be found here.
An open letter is circulating in strong support of the EU’s draft regulation on preventing and combating child sexual violence online. The letter can be publicly supported by signing it.