First day of the Internet Governance Forum 2021
Children's rights as the basis for regulation
Is legal regulation a suitable instrument for creating a safe online environment for children? This question was the focus of two workshops on Day 1 of the Internet Governance Forum 2021.
In the afternoon, the Dynamic Coalition for the Rights of Children in the Digital Environment addressed the issue of combating depictions of sexual abuse online. Patrick Burton of the Centre for Justice and Crime Prevention in South Africa warned against viewing regulation as a panacea for a safer Internet. The causes of child exploitation and abuse lie outside the net and must be tackled there as well, Burton said. Andreas Hautz from the German organisation jugendschutz.net and Michael Tunks, representing the British Internet Watch Foundation, emphasised the responsibility of platform operators. Content moderation is imperative, they argued: even if it is difficult to distinguish between illegal, harmful and grey-area content, it must be done consistently, because this is the only way to manage the volume of content and ensure a safe environment. This was also underlined by Thiago Tavares from SaferNet in Brazil, who believes that content moderation is essential to provide children with a safe Internet experience without limiting their rights to access information.
Encryption is an important tool for privacy and security, but it can be accompanied by unintended effects on child protection that need to be carefully considered. Hautz and Tunks pointed out that the extension of the ePrivacy Directive to interpersonal communications services in December 2020 led platform operators to stop using technical tools such as PhotoDNA to detect child sexual abuse material, allowing a very large amount of such content to continue to circulate unimpeded. Such "collateral damage" from an EU-wide regulation agreed upon with good intentions must be prevented in the future, including during the upcoming deliberations on the Digital Services Act.
These considerations were taken up directly in the subsequent Workshop 170, organised by the Digital Opportunities Foundation and the German Children's Fund, on the question of how the protection of children on the Internet can be regulated by law. Speakers from Egypt, Ghana (with reference to other countries on the African continent), Great Britain, Germany and the European Commission first presented the regulatory approaches pursued in their respective jurisdictions; in addition, David Miles, representing Meta / Facebook, outlined the position of a platform provider.
After a decade of Internet governance in which deregulation and self-regulation were predominant and free market forces were in the foreground, there is now an increasing tendency towards state regulation in order to strengthen the rights of users. In particular, the focus is on the question of how content moderation can be carried out in accordance with human rights and with consideration for the protection of children and young people.
Risk assessment is the key to ensuring a safe space, explained Agne Kaarlep of the European Commission. It is particularly important to address systemic risks, which are more likely to arise on very large platforms than on smaller offerings, Kaarlep said. The obligation for providers to take precautionary measures, as provided for in the German Youth Protection Act and the UK Online Safety Bill, was generally welcomed. David Miles pointed out that providers must also be given time to develop these measures in an ongoing process. In his view, a new type of regulator needs to be developed, and the regulators already operating need additional staff so that they can meet the major challenges ahead.
There was a consensus among the workshop participants that a new perspective on regulatory measures is needed in order to promote dialogue among the stakeholders. In the sense of dialogic regulation, an overall strategy must be developed to ensure a shared perception of responsibility. Agne Kaarlep from the European Commission DG CNECT made clear that regulation must be based on the principle of proportionality in order to strike a balance between a comprehensive duty of care and specific obligations. To this end, Kenneth Adu Amanfoh from the Africa Cybersecurity & Digital Rights Organisation proposed a multi-faceted, multi-disciplinary approach that includes the development of standards that are applied in practice and among technical developers. Hoda Dahroug from the Ministry of Communications and Information Technologies in Egypt suggested that the possibilities of artificial intelligence should also be exploited in creating a safe environment for all users.
Regulation must be thought of from the child's perspective, suggested Thomas Salzmann of the newly created Federal Agency for Child and Youth Media Protection in Germany, and he found an ally in Beeban Kidron of the UK's 5Rights Foundation. She called on legislative bodies at national and European level to make the UN Convention on the Rights of the Child and General Comment No. 25 on the rights of children in relation to the digital environment the reference point for legislation. In this way, a harmonised legal framework could be created that is based on internationally recognised standards and respects the rights of children.