IGF 2022 in Ethiopia
Marlene Fasolt, Stiftung Digitale Chancen
Ethiopia’s government will host the UN Internet Governance Forum 2022 at the UN-ECA Conference Center in Addis Ababa, Ethiopia, from November 28 to December 2, 2022. The Internet Governance Forum (IGF) is a global multi-stakeholder forum for dialogue on Internet governance issues, convened by the United Nations Secretary-General. It will be organized in a hybrid format, bringing together stakeholders from around the world to discuss the overarching theme "Resilient Internet for a Shared Sustainable and Common Future". Participants representing governments, intergovernmental organizations, the private sector, the technical community, and civil society (including academia) are expected to take part in IGF sessions both online and on-site.
The host country has created a website for this year’s IGF that offers helpful information regarding hotels close to the venue, attractions, transportation, Covid-19, applying for a visa and customs.
Visas are required for all foreign visitors to Ethiopia, with the exception of nationals of Kenya and Djibouti and holders of diplomatic and UN passports. Before applying for a visa, you must first register for the IGF 2022 through the UN Accreditation Process and wait for the approval confirmation to be sent to your email. The approval usually takes a few days. After receiving the confirmation, you can email it together with a scanned copy of your passport biopage to the Ministry of Innovation and Technology (MinT) at firstname.lastname@example.org. The Ministry will then send you the documents needed to apply for a visa online: an invitation letter, an application letter, and a note verbale. More information on this process can be found on the host country’s website.
In our focus article “IGF 2022: Children’s and young people’s rights” you can find information on sessions that relate to children’s and young people’s rights in the digital environment.
"Caution: May contain traces of child protection"
Jutta Croll, SDC / Torsten Krause, DKHW
A working group of the European Committee for Standardization (CEN) and the European Committee for Electrotechnical Standardization (CENELEC) is currently working on the standardization of a framework for age-appropriate digital services. Clear guidelines are being established for the design of services aimed at or used by children, based on the requirements of General Comment No. 25 on children’s rights in relation to the digital environment, adopted by the United Nations Committee on the Rights of the Child in March 2021. This Europe-wide framework comes at the right time and can spur the implementation of the precautionary measures laid down in § 24a of the German Youth Protection Act, which was likewise amended in 2021. This seems sensible and necessary - one and a half years after the new rules came into force - because the willingness of service providers to assume responsibility still leaves considerable room for improvement.
In September, Electronic Arts (EA Sports) released the game FIFA 23, approved by the USK without age restriction. It’s great when even small children can immerse themselves in the fascinating world of digital football - you can’t start early enough to get children excited about the sport. It becomes critical, however, when children in such a captivating game are tempted to improve their score by spending money on so-called loot boxes. These loot boxes, available via in-app purchase for 22.96 EUR, contain randomly assembled content - expensive lucky bags, in effect. Even attentive parents will not suspect that a game approved for ages 0 and up carries a high risk of commercial exploitation of children - and possibly the plundering of their own credit card account. So-called micro-payments - a questionable term in itself for amounts like those charged for the loot boxes in FIFA 23 - are a highly lucrative business for providers; in Germany, in-app purchases generated a total of 4.2 billion EUR in sales in 2021.
§10a of the German Youth Protection Act states that one of the objectives of the law is to protect the personal integrity of children and adolescents when using media. This objective requires a broad interpretation, because the vulnerability of children lies in very different areas and includes the danger of age-inappropriate incentives to play and buy. §10b of the Youth Protection Act regulates that circumstances of the respective use of the medium that lie outside the effect of the media content can also be taken into account in the classification as having an adverse effect on development. Risks due to purchase functions, mechanisms similar to gambling, as well as mechanisms that promote excessive media use behavior are explicitly mentioned here.
According to the USK, the approval of FIFA 23 from 0 years of age took place within the framework of a regular test procedure in July 2022 on the basis of the currently valid USK guideline criteria. The USK states that the formal implementation of the new provisions of the Youth Protection Act started in 2021 and is to be completed by spring 2023, after which the adjusted guideline criteria for the testing of computer games will come into force. Age ratings issued until then will remain unaffected by the new guideline criteria - i.e. FIFA 23 will remain without any labeling of the associated risks long after the youth protection amendment. Even though the USK argues in its statement that the age-rating decision is correct from a procedural point of view, one may ask whether this satisfies the responsibility of providers to protect the personal integrity of children and young people, to which they should also be committed through voluntary self-regulation.
The Youth Media Protection Index 2022, presented in Berlin on 13 October on the occasion of the 25th anniversary of the German Association for Voluntary Self-Regulation of Digital Media Service Providers (FSM e. V.), finds that parents are more concerned than in 2017: three quarters of the respondents mention at least one concern, with a significant increase especially regarding interaction risks. 72 percent would like corresponding labels that show whether online services are suitable for their children. The share of parents who ascribe a high responsibility for the protection of minors in the media to the voluntary self-regulation bodies has also increased significantly, from 63 to 80 percent. Of these, 59 percent believe that the self-regulation bodies do their job rather well to very well, a significant increase of 16 percentage points compared to 2017. This reflects a high level of parental trust in the structures of youth media protection, which providers and self-regulation bodies must now honour by jointly assuming responsibility.
The Federal Agency for the Protection of Children and Minors in the Media (Bundeszentrale für Kinder- und Jugendmedienschutz), which is responsible for enforcing the JuSchG, is also providing impulses here by entering into dialogue with providers and demanding measures for better protection of young people in the media. In a press release on 10 October, it asked game providers to explain which precautionary measures against habituation to gambling and addictive mechanisms in children they intend to include in their games. This is an important step to ensure that FIFA 24, 25, 26 ... contains more than just traces of youth media protection in the future. And, to make the call to action appealing to the Cologne-based provider EA Sports, in the words of the Cologne band de Höhner: If not now, when?
Draft European Regulation on combating child sexual abuse on the Internet
On 11 May, the European Commission presented a draft regulation to combat sexual abuse of children on the internet. It was open for public comment until 12 September and will now be discussed in the legislative procedure over a period of about two years.
The draft regulation can be downloaded here.
From the point of view of the Digital Opportunities Foundation, we comment on the draft as follows:
We expressly welcome the draft regulation, since it is the first time that EU legislation follows a fundamental and comprehensive child-rights approach, focusing on the primacy of the best interests of the child in accordance with Art. 24(2) of the EU Charter of Fundamental Rights.
Among the options under consideration, the EC has chosen the most far-reaching proposal (E) with regard to the protection of children, addressing detection, reporting and deletion of both known depictions of abuse and "new" material, as well as solicitation of children with sexual intent (grooming). The obligation to assess the risk of grooming within apps is a milestone in combating child sexual abuse online.
The fight against CSAM must start with a risk assessment and risk mitigation measures by the service providers, since prevention is crucial. Only if and when these preventive measures turn out to be ineffective will the process potentially leading to a detection order be initiated. The regulation transparently outlines the steps required before a detection order is issued and the safeguards to rule out violations of fundamental rights as far as possible. We appreciate that detection orders are issued by a court or national authority on the basis of a thorough validation process; any order will be limited in time and will address only a certain type of content on the respective service. Research as well as law enforcement investigations show that sexualised violence follows escalation pathways, and the earlier these paths can be stopped, the better.
We acknowledge the need to ensure the privacy of interpersonal communications, but the regulation needs to take into account private chats because this is where perpetrators initiate contact with children. The EC suggests scanning technologies look for behavioural patterns suggestive of abuse, but not to analyse the actual content of the communication at first. We trust in the structural safeguards already foreseen in the draft regulation to prevent surveillance of all personal communication without cause.
The draft recognizes end-to-end encryption as an effective means of ensuring the confidentiality of communications and explicitly does not exclude it as an instrument (para. 26). The draft leaves it up to providers to choose the appropriate technology, but makes it unequivocally clear that providers are obliged to detect CSAM and grooming in their services. We would expect service providers to invest in the development and deployment of such technology right now, so that it is operational when the parliamentary process is, hopefully, finished in mid-2024 and the new regulation comes into force.
We welcome the cooperative approach of the European Centre to prevent and counter child sexual abuse (EU Centre), which is an important component of the regulation. The tasks foreseen for the Centre must be addressed at a transnational level without calling the work of national authorities into question. However, the time horizon of eight years until it is fully operational is too long. We expect the EC to undertake preparatory measures so that the Centre is ready to start in 2024. The Centre must operate independently of law enforcement, although in close cooperation with Europol.
The interim derogation is a strong example of how to overcome the only seemingly insurmountable contradiction between privacy and child protection. Like all people, children have the inalienable right to privacy as laid down in Art. 16 of the UN-CRC. They also have the right to protection from any form of exploitation (Art. 34 - 36). The approach of the regulation is justified by General Comment 25 on children’s rights in relation to the digital environment, especially by its call for special protective measures on the part of the states in chapter 12. It also requires states, in para. 118, not to criminalise self-generated sexual content that children possess or share with their consent and solely for their own private use. We suggest that the EC review the draft to ensure it is fully compliant with this paragraph as well.
An open letter from civil society organisations on the draft regulation can be found at Sexual Abuse - Civil Society Open Letter to the EU
IGF 2022: Children’s and young people’s rights
Marlene Fasolt, Stiftung Digitale Chancen
The 17th annual Internet Governance Forum (IGF) meeting will be hosted by the Government of Ethiopia in Addis Ababa in a hybrid format from 28 November to 2 December 2022.
The theme of this year’s IGF is "Resilient Internet for a Shared Sustainable and Common Future". Representatives from business, science and civil society will meet with high-ranking government representatives from all over the world and address the following topics in various event formats:
- Connecting All People and Safeguarding Human Rights
- Avoiding Internet Fragmentation
- Governing Data and Protecting Privacy
- Enabling Safety, Security and Accountability
- Addressing Advanced Technologies, including AI
Many sessions at this year’s IGF will cover the topic of children’s and young people’s rights to protection, empowerment and participation in the digital environment. Several of these sessions have a strong focus on data privacy and protection.
We have put together a list of the sessions that will cover aspects of children and young people growing up in a digital environment. This way you can decide which parts of the program are of interest to you. All sessions are designed to actively involve participants, whether they are on-site or online. Within the IGF week we will report from the event, with a special focus on these sessions.
Registration is required to attend the IGF. It is recommended that you register in good time before the start of the event. You will receive a confirmation that entitles you to access the on-site events and the digital Zoom meetings.
IGF-Sessions on "Growing up in a digital environment":
- 12:15-13:45 CET (=10:15-11:45 UTC), Room Banquet Hall A: IGF 2022 WS #523 Youthful approach at data protection in messaging apps
- 14:35-16:05 CET (=12:35-14:05 UTC), Room CR3: IGF 2022 WS #183 Digital Wellbeing of Youth: Selfgenerated sexualised content
- 16:30-17:30 CET (=14:30-15:30 UTC), Room Banquet Hall A: IGF 2022 WS #269 Data privacy gap: the Global South youth perspective
- 11:50-13:20 CET (=10:50-12:20 UTC), Room CR1: IGF 2022 DC Main Session: Our Digital Future: How Dynamic Coalitions Support the Global Digital Compact
- 14:05-15:35 CET (=12:05-13:35 UTC), Room CR5: IGF 2022 WS #341 Global youth engagement in IG: successes and opportunities
- 15:50-17:20 CET (=13:50-15:20 UTC), Room CR4: IGF 2022 WS #318 Gen-Z in Cyberspace: Are We Safe Online?
- 8:30-10:00 CET (=6:30-8:00 UTC), Room Banquet Hall B: IGF 2022 DCCOS Translating data & laws into action for digital child rights
- 10:15-11:35 CET (=8:15-9:45 UTC), Room PBR: IGF 2022 WS #471 Addressing children’s privacy and edtech apps
- 12:45-13:45 CET (=10:45-11:45 UTC), Room LBR: IGF 2022 WS #252 Building a safe & trustworthy digital world for all children
- 14:00-15:00 CET (=12:00-13:00 UTC), Room PBR: IGF 2022 WS #352 Youth lenses on Meaningful Access and Universal Connectivity
Public Consultations on EU rules regarding combating child sexual abuse - Have your say now!
Jutta Croll, Stiftung Digitale Chancen
Through two consultations, the Commission is seeking the views of EU citizens on existing and future regulatory measures to combat child sexual abuse.
The public consultation, which is open until July 13, 2022, is part of the data collection activities following the Inception Impact Assessment published in September - October 2021. It will inform the evaluation and possible revision of the EU Child Sexual Abuse Directive and give citizens and stakeholders the opportunity to provide feedback on current and future challenges in combating child sexual abuse, sexual exploitation and child sexual abuse material, as well as on possible ways to reinforce, develop and update the existing framework.
Furthermore, the consultation on the new regulatory proposal "Combating Child Sexual Abuse: Detection, Removal and Reporting of Illegal Content Online" is open until September 2022.
Child sexual abuse (CSA) can take multiple forms which can occur both online (e.g. forcing a child to engage in sexual activities via live streaming or exchanging child sexual abuse material (CSAM) online) and offline (e.g. engaging in sexual activities with a child or causing a child to participate in child prostitution). When the abuse is also recorded and shared online, the harm is perpetuated. The EU Directive on combating child sexual abuse, sexual exploitation and child pornography (2011/93) is the main EU legal instrument to combat these crimes. The Directive sets out a comprehensive response, in particular:
- It approximates definitions of criminal offences, sets minimum levels for criminal penalties and facilitates reporting, investigation and prosecution of such crimes.
- It sets out prevention measures, including awareness raising, and intervention programmes for offenders and persons who fear they might offend.
- It reinforces the provision of support to victims, including prevention of additional trauma caused by participating in criminal proceedings.
On 24 July 2020, the Commission adopted the EU Strategy on a more effective fight against child sexual abuse, which proposes concrete actions to set up a comprehensive response to these crimes. One of the actions is to evaluate the EU Directive which has been in place since 2011 to identify best practices and any remaining legislative gaps. If necessary, new priority actions will be proposed to ensure that this legislation continues to reach the goals that it sets out to achieve.
The Commission presented a new regulatory proposal on 11 May, "Combating Child Sexual Abuse: Detecting, Removing and Reporting Illegal Content Online." This is now open for public consultation, initially until 5 September (midnight CEST). This deadline keeps being extended until the regulatory proposal has been published in all EU languages.
Some providers are already voluntarily using technology to detect, report, and eliminate child sexual abuse online in their services. However, the actions taken by providers vary widely and a significant number are not yet taking any action at all. According to a recent impact assessment, voluntary measures to combat child sexual abuse online are insufficient, so the Commission is planning to introduce mandatory measures.
The proposed Regulation consists of two main building blocks:
- It imposes obligations on providers concerning the detection, reporting, removal and blocking of known and new child sexual abuse material, as well as solicitation of children, regardless of the technology used in the online exchanges
- It establishes the EU Centre on Child Sexual Abuse as a decentralised agency to enable the implementation of the new Regulation.
Taking part in the consultation is possible via the EU consultation system. You can find the consultation processes here:
- "Combating child sexual abuse - review of EU rules" until July 13th, 2022
- "Fighting child sexual abuse: detection, removal and reporting of illegal content online" until Sept. 6th, 2022