Time for children's rights
Jutta Croll & Torsten Krause, SDC
On the third day of the Internet Governance Forum in Kyoto, Japan, many events were focused on children's rights. Topics included how to develop technologies that promote the well-being of young people, how young people can get involved and participate in decision-making, and how their data can be protected and handled responsibly.
Together with leading experts, UNICEF and LEGO have launched the RITEC project (Responsible Innovation in Technology for Children) to identify what is necessary to develop technologies and services that not only protect children and young people but also promote their well-being and thus make a positive contribution to their development. The participants are aware that such a development cannot be realized without the participation of young people. For this reason, the Young & Resilient Research Centre at Western Sydney University in Australia conducted a comprehensive participatory survey with more than 34,000 children from over 30 countries. The results will feed into a framework that gives developers and programmers guidance on how to address young people's competencies, emotional self-direction, empowerment, and creativity without neglecting their social connectedness, safety and security, and diversity and inclusion. Once the guidelines are finalized, a pilot phase is planned to test their implementation.
At the event, which was organized by young people, they presented their plans for media literacy education and for increasing the involvement of young people in shaping and regulating the digital environment, and reported on their experiences as young ambassadors or members of youth committees, for example of the IGF and the International Telecommunication Union (ITU). They pointed out that self-organization and working on topics that match their interests contribute significantly to getting more young people involved and participating. Newcomers can benefit from the support of more experienced young people, who help them become familiar with the specifics and procedures of adult organizations and contribute successfully to their conferences and events. Although this earns them considerable attention, they sometimes find that their participation and presence are welcomed and supported in principle, but that their influence on shaping the development of the digital space is not yet effective in all cases.
At the event, representatives of the German Children's Fund, ChildFund Japan, the Asia Pacific Youth IGF, Microsoft and the Center for Law and Crime Prevention discussed with the audience how to guarantee the safety of young people using digital applications in the face of a multitude of opportunities and risks. The conversation pointed out that the United Nations Committee on the Rights of the Child has produced a landmark document, General Comment No. 25, which comprehensively sets out how children's rights should be interpreted and applied in the digital world. However, practical experience and current studies also made clear that major challenges remain in enabling young people to participate safely in digital applications. One reason for this is that children and young people are not always familiar with all the protection options in the applications and often receive little support and guidance from their parents or other adults. Nor do all services fulfill their responsibility to provide appropriate preventive and protective measures for their users. For this reason, there are various efforts worldwide to protect children and young people in digital environments, for example by prohibiting companies from using their data for targeted advertising. The challenge is to strike the right balance between protection and participation and to find a good solution in the best interests of the individual child and of all children. More participation of young people and more research on how they use digital applications will be necessary to achieve this goal.
Artificial intelligence is THE buzzword at IGF 2023, so the workshop's title underlines the importance of bringing children's right to protection together with the potential of AI.
Ghimire Gopal Krishna from civil society in Nepal, Sarim Aziz, representing Meta, and Michael Ilishebo from the Zambian government discussed with Jutta Croll how AI technologies can be effectively leveraged to detect and combat emerging forms of child exploitation in the digital age, considering the evolving nature of online risks. Sarim Aziz outlined what Meta is doing to fight CSAM on its services, and that it does so very successfully, for example with software like PhotoDNA and now also with a technology that can detect abusive video material. But although platform providers have been working on the problem, a huge amount of CSAM is still available, even on the open internet. Jutta Croll referred to the draft CSAM regulation currently in the parliamentary process, which provides for regulatory measures addressing three issues: already known CSAM, not-yet-identified CSAM, and grooming processes. Given the high complexity of monitoring content and communication, it is obviously necessary to improve the efficiency of monitoring technologies and to make them as rights-respecting as possible. Privacy is one of the most ambivalent rights, Jutta Croll explained: children of course have a right to privacy not only vis-à-vis service providers but also towards their parents. Yet, as Michael Ilishebo rightly pointed out, parents want to protect their children and, with that intention, sometimes break their children's privacy. Gopal Krishna spoke about the Nepalese Child Protection Act and encouraged adherence to human rights and the rule of law to uphold democracy. He then referred to the concept of the "age of consent", which is implemented differently in national jurisdictions, differences that are amplified in the digital environment, where services are offered across borders. In response, Jutta Croll referred to para. 118 of General Comment No. 25, which explicitly urges states not to criminalize the consensual sharing of sexually explicit content among young people under the age of 18. This led to the key take-away of the session: AI is not a silver bullet; it may be able to recognize certain types of illegal content, but understanding the consensual or non-consensual nature of interaction among human beings will always need human assessment.
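The detection of already known material discussed in the session typically relies on perceptual hash matching: known images are stored as compact fingerprints, and uploads are hashed and compared so that near-duplicates still match after re-encoding or small edits. The sketch below uses the simple public "average hash" technique as an illustrative stand-in; PhotoDNA's actual algorithm is proprietary and far more robust, and the tiny 8x8 toy image is invented purely for demonstration.

```python
def average_hash(pixels):
    """Hash an 8x8 grayscale image (a list of 64 ints, 0-255) to a 64-bit int.

    Each pixel contributes one bit: 1 if it is at or above the mean
    brightness, 0 otherwise. Similar images yield similar bit patterns.
    """
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming_distance(h1, h2):
    """Count the bits in which two hashes differ; small distance = likely match."""
    return bin(h1 ^ h2).count("1")

# A toy gradient "image" and a slightly brightened copy (a typical small edit).
original = [i * 4 % 256 for i in range(64)]
brightened = [min(p + 10, 255) for p in original]

h_orig = average_hash(original)
h_bright = average_hash(brightened)

# The edit barely changes the fingerprint, so the images still match.
print(hamming_distance(h_orig, h_bright))  # prints 0
```

Because the hash encodes only relative brightness, uniform edits leave it nearly unchanged, while an unrelated image produces a large Hamming distance. Real systems compare against databases of hashes of verified illegal content, which is also why, as noted above, such matching cannot judge the consensual or non-consensual nature of previously unseen material.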
Are we on the right paths to address children's issues in Internet Governance?
Jutta Croll & Torsten Krause, SDC
Debates on Day 2 at the Internet Governance Forum addressed issues of children's rights to protection, provision and participation in various ways. A main session in the morning was dedicated to a multistakeholder perspective on the Global Digital Compact (GDC), addressing both the process and the content of the Compact as developed so far. The GDC's objective is to ensure that digital technologies are used responsibly and for the benefit of all. In this session the youth representative in the process, Omar, explained that, as the only child at the table during the deliberations, he took on the responsibility of bringing in the perspective of his generation. He strongly emphasized the need for substantive progress on how to hold the private sector accountable for its role in the digital world and to ensure it protects the rights and interests of children and young people.
Later that day, a series of sessions dealing with artificial intelligence and its impact on children started with WS #469 AI & Child Rights: Implementing UNICEF Policy Guidance.
First, Steven Vosloo set the scene, outlining how the UNICEF Policy Guidance was developed. He eventually stated: "It is easy to define guidance on AI in line with children's rights, but it is difficult to apply or implement it." The debate then circled around embedding children's rights in industry strategies for AI, including children in the whole cycle of product design, and governments supporting transparent interaction between the stakeholders involved to strengthen their accountability.
A key take-away from the session is to acknowledge that AI already has a huge impact on children's lives and well-being, so the journey to getting things right needs to start immediately. Afterwards, it seemed as if this message had already been heard by the speakers in the following main session on Artificial Intelligence.
Thobekile Matimbe, Senior Manager, Partnerships and Engagements, Paradigm Initiative, emphasized the importance of all stakeholders retaining agency over fundamental rights and freedoms, so as to ensure children's rights are promoted in the use of AI.
The day ended with more concrete examples of how industry engagement and regulation can work together to improve child online safety in Open Forum #58. While a new Japanese law is built on raising parents' and children's awareness of online risks and on obligations for service providers to block child sexual abuse material, Albert Antwi-Boasiako, Director-General of the Cyber Security Authority of the Republic of Ghana, described his country's approach of collaborative regulation. There is consensus that self-regulation alone cannot keep children safe, and there are models of industry engagement for the protection of children. Ghana brought its Cybersecurity Act into force, which was developed in dialogue with industry, and has since been acquiring allies for the implementation of the Act, an approach broadly in line with the 2021 amendment to the German Youth Protection Act.
Eventually the session concluded with Dunstan Allison-Hope, Vice President, Human Rights, BSR (Business for Social Responsibility), calling for companies to voluntarily embed child rights impact assessments in the broader framework of human rights due diligence.
In event #403 Safe Digital Futures for Children: Aligning Global Agendas, the vast majority of participants were of the opinion that self-regulation by services is no longer sufficient to protect young people online. To work towards global legislation, leaders from Australia, Ireland, South Africa, South Korea, the UK and Fiji have joined together in a global network of online safety regulators. Together, they are striving to make rules that apply offline also effective online. A youth representative of the Digital Youth Council demanded that this not be implemented without the perspectives of young people.
Trust is essential
Jutta Croll, Marlene Fasolt & Torsten Krause, SDC
Artificial intelligence plays a major role in many events of the Internet Governance Forum 2023, but also in the corridors of the Congress Centre in Kyoto and in discussions during breaks. Common to these conversations are, on the one hand, the great expectations and hopes associated with the technologies; on the other hand, concerns and fears about their effects are also present. At the High Level Panel on Artificial Intelligence it was emphasised several times that knowledge about the data used and its diversity, as well as transparency about the effects of the technologies, are crucial for accepting the results and consequences of artificial intelligence and dealing with them. Openness is the key to creating the necessary trust, which in turn provides the basis for using the opportunities that are opening up for all people and in keeping with their rights. How companies and providers can be encouraged to ensure this openness is also a matter of debate for many states and organisations around the world. In the event The Role of Parliamentarians in Shaping a Trusted Internet Empowering All People, parliamentarians were also called upon to create meaningful solutions and regulations that bring order to complex issues. Furthermore, there was a plea for the United Nations to serve as a catalyst for these diverse processes, so that countries can learn from each other and, as a result, set frameworks that make it possible to realise the rights of all people by means of artificial intelligence.
A discussion with three speakers and the audience explored what policies should be in place to ensure the responsible and ethical use of generative AI technologies in educational settings. There was also an exchange on how policymakers can collaborate with relevant stakeholders to ensure that teaching and learning processes are enhanced while sustaining creativity, critical thinking and problem solving, and on how to ensure that the use of generative AI by youth in education is inclusive, age-appropriate and aligned with their developmental needs and abilities.
The speakers agreed that generative AI has immense potential to reform education and can lead to more inclusive, personalized and accessible learning. As everyone has a different learning style, generative AI can personalize learning tools, for example by translating lectures into different languages or adding audio description. The speakers also noted risks regarding data privacy and overreliance on the technology, as well as discrimination and biases stemming from training data sets centred on white Westerners that often ignore minorities. Because of this, more robust data protection rules were recommended, as was the development of generative AI in different countries and regions so that it recognizes local languages and cultures. It is important to keep a balanced view that considers the benefits as well as the risks of this technology.
When it comes to regulating generative AI, the importance of cooperation and collaboration was emphasized, especially the need to include children's and young people's voices in the discourse. Equally important is an ongoing discussion between policymakers, technology companies, students, teachers, and parents that aims to establish clear guidelines on generative AI in education. Policymakers should take a human-centric approach, embrace new technologies, and not hinder innovation, while educators should learn to work with generative AI instead of banning it, as students will use it regardless. If these points are taken into account and complemented by digital literacy programmes, AI can become a bridge rather than a barrier to education.
Children's rights were also addressed in the session of the Dynamic Coalition on Data Driven Health Technologies (DC-DDHT), which focused on robotics and the medical Internet of Things (MIoT). Jutta Croll referred to Art. 24 of the UN Convention on the Rights of the Child, which obliges states to recognize the right of the child to the enjoyment of the highest attainable standard of health and to facilities for the treatment of illness and rehabilitation of health. More than 30 years after the UN-CRC was adopted, General Comment No. 25 provides guidance on how states' obligations should be interpreted in the digital environment children are now growing up in. The Internet of Things may play an important role in children's health, provided privacy and the protection of children's data are ensured. A child's health, Jutta Croll emphasized, begins at birth and has to be addressed continuously; without identification, registration and acknowledgment of the child, access to healthcare may be limited, delayed or denied, as described in General Comment No. 25. With regard to the medical Internet of Things, she referred to Teddy the Guardian, a health monitoring device for children, and pointed out that there are issues of privacy, transparency and ethics in regard to children's sensitive data.
Take young people’s opinions into account in developing the internetJutta Croll & Torsten Krause, SDC
With the events of Day 0, the 18th Internet Governance Forum began in Kyoto, Japan, on 8 October 2023. More than 8,200 people will come together in the coming days to discuss all relevant issues of internet policy and regulation. We will report on our homepage especially on the events important for children's rights.
On Sunday morning, various representatives of the INSAFE network presented their national activities, each emphasising the involvement of young people both in the activities themselves and in developing materials. Whether the reports came from the Safer Internet Centres in Belgium and Poland or from activities in the UK, it became clear that the existing guidance and support services are increasingly enriched by peer-to-peer services. For this purpose, young people are empowered to provide help and advice to people of their own age group. For example, the Safer Internet Centre UK runs the "Digital Leaders Programme" together with Childnet. And in Poland, young people have created a joint webinar series, "talking about internet", to reach more young people through digital services and inform them about issues that are important to them. But even though young people are increasingly taking action themselves to help others with concerns and challenges, this does not replace the role of adults in these services. In Belgium, the Max programme is currently being implemented with the aim of ensuring that every young person has a trusted adult to turn to after unpleasant experiences in digital environments. Central to this is that young people can choose their "Max" themselves, because research shows that young people often do not turn to their parents or educational staff to share their concerns or ask for help.
In the afternoon, the Global Youth Summit addressed digital topics of concern to young people all over the world. The young representatives from Hong Kong, India and Italy called for processes and policies to make the internet and digital applications safe for children and young people. For this to happen in the interest and spirit of young people, it should be ensured that they are involved in the development of such policies and that their opinions are taken into account. A representative of the European Commission pointed out in this context that the European Union is committed to an approach that puts people at the centre of its policies and considers their interests, rather than those of states or companies, to be decisive. Adult participants also emphasised that the aim was to balance the various fundamental rights to freedom, privacy and protection. Vint Cerf addressed these aspects at the beginning of the event. In contrast to what was heard from adults later on, he signalled his support for the concerns of the young generation early on by clearly stating that the freedom of each individual ends where it interferes with that of another. With regard to the protection against attacks and violations in the digital environment, which young people called for several times in the debate, he critically noted that these unintended effects of the internet should not undermine its goal of overcoming barriers and enabling networking and exchange.
The session addressed the engagement of civil society organizations in the Global Digital Compact (GDC) which was first mentioned in the UN Secretary General’s Roadmap for Digital Cooperation in June 2020. The Global Digital Compact’s objective is to ensure that digital technologies are used responsibly and for the benefit of all, while addressing the digital divide and fostering a safe and inclusive digital environment.
Among the issues addressed by the GDC are
- Upholding human rights
- Avoiding fragmentation
- Digital connectivity
- Promoting trust
As the debate showed, although children's rights may be subsumed under human rights, they do not yet feature as prominently in the GDC as one could have expected from the Roadmap.
With regard to the process of developing the GDC, civil society organizations recommended transparency, an accurate reflection of the scope, and coherence.
The substance of the GDC was discussed along the lines of the paper issued by the co-facilitators of the UN SG's Global Digital Compact on Sept. 9th, 2023, based on the deep-dive discussions on the GDC. Participants came to the conclusion that the paper should be the basis for ongoing discussions. The IGF 2023 is a key moment for civil society to express a common position towards the Global Digital Compact.
New CWA 18016 provides a practical framework for children's protection and well-being online
CEN CENELEC
CEN and CENELEC, together with IEEE (the Institute of Electrical and Electronics Engineers), have just published a new Workshop Agreement, CWA 18016 "Age appropriate digital services framework". This document describes a set of processes to help design and develop online products and services with the rights and well-being of children in mind. Read more in the article "New CWA 18016 provides a practical framework for children's protection and well-being online".