
FOCUS


  • Published 18.07.24

    Eeny, meeny, miny, moe, how old are you?

    Jutta Croll, SDC

    Imagine you have three children aged between 5 and 12. You are worried about which kindergarten and school they will attend when the family moves to the countryside soon. Where and how will Paula, Emma and Laurin be looked after, in which environment will they feel comfortable, and how will they get there safely? Which educational path will they pursue, and how can the best conditions be created for it? Their mother would like to seek external advice and support with these difficult decisions: anonymously, without revealing her children's names, gender or age, and under no circumstances by showing identification documents. It is good that there is an awareness of data protection and privacy. This does not make the counselling task any easier, but the counselling service is resourceful and knows its way around the internet. How old is the mother? What can be deduced about the age of the children? There are no photos of the children to be found, but didn't the mother recently search xyz.com for decoration ideas for a 10th birthday party, offer a girl's satchel for second graders in a swap shop, and post that she wants to move out of town? From this, a family profile is easily assembled.

    This is exactly how platform operators are currently proceeding if they want to comply with Art. 28 (2) of the Digital Services Act and not display advertising to minors. They evaluate profile data and posted content and analyse user behaviour to find out how old their customers are and who is aged under the threshold set by the terms of use. In doing so, they intrude deeply into the privacy of their users and - albeit with good intentions - potentially violate other personal rights.

    At a multi-stakeholder dialogue on age assurance held on 11 July 2024 in Brussels, around 50 experts discussed how things could be done differently and where age verification systems can usefully provide users with an age-appropriate environment that guarantees freedom and protection in equal measure. The Centre for Information Policy Leadership (CIPL) and the WeProtect Global Alliance organised this dialogue format, which had already convened once in London in March. Participants included several very large online platform providers (VLOPs), business associations, regulatory authorities and government representatives at national and European level, as well as representatives from civil society and academia.

    There was widespread agreement among the participants that age verification should not be carried out generally but should be prioritised for high-risk services. Not all platforms are equally risky, and the understanding of the potential risk depends on the respective perspective, for example on data protection or the protection of minors. There was also no question that a standardised legal framework is desirable but would be difficult to achieve at both European and international level.

    Various approaches to age assessment were discussed. There was consensus that the current situation is inadequate: self-declaration of age can only produce reasonably reliable results with accompanying measures. Device-based procedures are less problematic in terms of privacy protection, but they require setting up an unverified user profile and thus carry the risk of an incorrect age setting when one device is used by several family members or shared with others. If the profile is not updated to match the respective user's age, children could gain access to age-inappropriate adult content, and devices with a child profile could also be used by adults for unauthorised contact, e.g. cybergrooming.

    Application-based methods must meet strict requirements for data minimisation, privacy protection and user anonymity, which can be achieved through so-called double blindness. This refers to procedures in which neither the platform requesting the age information nor the verifying organisation learns anything about the other: the verifier does not know which service is asking, and the service receives only the age claim, not the user's identity.
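A minimal sketch of the double-blind idea, under stated assumptions: all names are illustrative, and a real system would use asymmetric signatures from an accredited verifier rather than the shared HMAC key used here for brevity. The point is the narrow interface, since the only information crossing the boundary is a yes/no age claim and its signature.

```python
import hashlib
import hmac
import json
import secrets

# Key held by the verifying organisation. In a real double-blind setup the
# platform would check an asymmetric signature instead of sharing this key.
VERIFIER_KEY = secrets.token_bytes(32)

def issue_age_token(over_threshold: bool) -> dict:
    """Verifier signs only an over/under-threshold claim.
    It never learns which platform will consume the token."""
    claim = {"over_threshold": over_threshold, "nonce": secrets.token_hex(8)}
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def platform_grants_access(token: dict) -> bool:
    """Platform verifies the signature on the claim.
    It never sees a name, birth date or identity document."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"]) and token["claim"]["over_threshold"]
```

A tampered claim fails the signature check, so a child profile cannot simply flip the flag to gain access.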

    The discussion revealed a high level of willingness among all those involved to assume responsibility. This was evident not only in the well-founded, high-level discussion of different technical approaches, but also in the commitment to jointly creating age-appropriate digital spaces by means of valid age verification. In four working groups meeting online, the participants will address questions of regulation and risk assessment over the coming weeks, with a further in-person meeting planned for the autumn.

    To summarise: the train towards age verification has been put on the rails and has started rolling. Anyone who takes part in this open process also has the power to set the course and the signals for a direction that creates age-appropriate digital freedom. Done well, this safeguards privacy, anonymity and data protection for all users while harmonising children's rights to protection in digital environments with their rights to freedom.


  • Published 01.07.24

    Countering sexual violence against children and young people

    Jutta Croll, SDC

    Berlin, 27 and 28 June: the summer conference of the National Council. It is hot outside and pleasantly cool in the new conference centre, and one might feel a certain lightness were it not for the heavy topics on the agenda. For two days, the focus is on sexual violence against children, on those affected who are now adults, and on how institutions come to terms with their past.

    On the first day, Christine Streichert-Clivot, Minister for Education and Culture of the Saarland and President of the Standing Conference of the Ministers of Education and Cultural Affairs of the Länder, Josefine Paul, Minister for Children, Youth, Family, Equality, Refugees and Integration of the Land of North Rhine-Westphalia, Prof. Dr Winfried Speitkamp, State Commissioner for Child Protection in the Free State of Thuringia, Angela Marquard and Renate Bühn, both members of the Council of Victims of Sexual Violence at UBSKM, Michael Groß, Chairman of the Federal Association of Independent Welfare Organisations, and Katja Adler, Member of the German Bundestag and deputy member of the Commission for the Representation of the Interests of Children (Children's Commission), discussed the topic "Much has already been achieved - much still needs to be done to protect children and young people from sexual violence".

    A great deal has indeed happened at federal and state level since Dr Christine Bergmann took on the honorary role of the first Independent Commissioner for Child Sexual Abuse Issues in 2010 and personally dealt with well over 10,000 enquiries from victims during her term of office, which lasted until 2011. The cabinet decision of 19 June 2024 on the so-called UBSKM Act is considered a milestone: it will give a legal basis to the structures of the Independent Commissioner for Child Sexual Abuse Issues (UBSKM) and the Council of Victims, and a centre for research into sexual violence against children and young people is to be established. Despite the joy about this important step, much remains to be done, as the two representatives of the Council of Victims emphasised during the discussion. Representatives of support services also pointed to the lack of financial and human resources and the huge need for specialised staff.

    After this introduction, the topics were explored in greater depth in specialised forums. These dealt with the inclusion of children and young people with disabilities, the role of youth welfare offices in combating sexual abuse, the development of protection concepts in schools, state councils for victims in Germany, child-friendly justice and the dissemination of scientific findings.

    When Dr Christine Bergmann was bid farewell at the end of the day, there were tears, but also the hoped-for lightness, not least because she stood up once more as a strong voice for children's rights and again called for children's rights finally to be enshrined in the constitution. According to Bergmann, this would not be a mere symbolic act, but a signal to society to give immediate effect to the primary consideration of the best interests of the child.

    Nationaler Rat-Kinder- und Jugendbetreuung

    The second day opened with two presentations on the topic of "Protecting children and young people from sexual violence online - current usage experiences of young people and technical potential". The two speakers, Ayla Askin and Dr Dorothea Czarnecki, provided convincing and highly competent information on various approaches to protection. Ayla Askin introduced the peer-to-peer counselling service JUUUPORT e. V., where she volunteers as a team member, and described the problems that young people face today and for which they seek advice and help from JUUUPORT. These range from cyberbullying and cybergrooming to sexual violence and sextortion, i.e. blackmail using intimate photos.

    Dr Dorothea Czarnecki, who heads the Child Protection and Human Trafficking department at FORENSIK.IT GmbH, took up this thread and explained which forensic methods can already be used to uncover criminal activity on the internet. In so-called 'financial sextortion', users are sent explicit sexualised images - often generated using artificial intelligence - and put under great time pressure to make a payment. FORENSIK.IT's task is to support the investigative work of law enforcement authorities. Data analysis, for example, can help convict offenders: a high number of simultaneous chat histories, and messages sent to hundreds of contacts at once using copy and paste, are evidence of conspicuous and relevant usage behaviour.
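An illustrative sketch of that kind of pattern analysis (not FORENSIK.IT's actual method; the function name, thresholds and data layout are assumptions): flag an account whose chat logs show the two signals mentioned, unusually many parallel conversations and identical messages copy-pasted to many contacts.

```python
from collections import Counter

def flag_suspicious(chats: dict, max_parallel: int = 50, max_copies: int = 20) -> bool:
    """chats maps a contact id to the list of messages sent to that contact."""
    # Signal 1: an unusually high number of simultaneous chat histories.
    if len(chats) > max_parallel:
        return True
    # Signal 2: the same message copy-pasted to many different contacts.
    copies = Counter(msg for messages in chats.values() for msg in messages)
    return any(count > max_copies for count in copies.values())
```

Real forensic tooling would of course combine many more signals (timing, account age, payment demands) and keep human investigators in the loop; the sketch only shows why bulk copy-paste behaviour stands out statistically.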

    In four further specialist forums, the attendees of the summer conference then discussed sexual violence among peers, how protection concepts can be designed inclusively, how psychotherapy can affect the credibility of those affected in criminal proceedings, and which support services for trafficked children and young people can be considered good practice.

    The event ended with a summary and an outlook on the future work of the National Council. The conclusion that much has already been achieved in protecting children and young people is justified. At the same time, the realisation that the challenges will not diminish, not least in the face of advancing digitalisation, weighed as heavily as the expectations resting on the implementation of the UBSKM Act. This is a lasting basis for the work of the National Council and an incentive for further networking and good professional cooperation. The unanimous view was that it is important to continue along the chosen path and to engage with children and young people as equals.


  • Published 26.06.24

    Considering children's rights in the Global Digital Compact

    Torsten Krause, SDC

    The United Nations is working on a Global Digital Compact. This is to be adopted in September 2024 at the Summit of the Future as part of Our Common Agenda. The Global Digital Compact is intended to establish jointly agreed basic principles for shaping the digital environment, aligning further development with shared ideas and principles, for example with regard to connectivity, data use and respect for human rights.

    In the run-up to the resolution, the United Nations invited the public to take part in a consultation process for the second time; the Stiftung Digitale Chancen (Digital Opportunities Foundation) participated in both rounds. During the first round on 12 and 13 February 2024, we emphasised to the United Nations that, in addition to the fundamental consideration of human rights, special attention must be paid to the rights of children. Alongside protection, it is also important to realise provision and participation for young people. General Comment No. 25 on children's rights in relation to the digital environment should serve as a guideline for action. The position presented orally can be read here.

    In our statement of 21 June 2024, we were able to build on the inclusion of corresponding reference points in the revised version of the Global Digital Compact and welcomed the fact that all three pillars of children's rights (protection, provision, participation) are now anchored in the document as guiding principles. The document thus acknowledges that children should not be seen merely as a vulnerable group of users in need of protection. To realise and expand the benefits and opportunities of the digital environment for children and young people, we expressly emphasised that the perspectives and experiences of different stakeholder groups should be taken into account, and that appropriate processes and fora must continue to be provided for this purpose. The written statement can be viewed here.


  • Published 24.06.24

    Shaping and regulating the metaverse in accordance with children's rights

    Torsten Krause, SDC

    The European Commission is continuing to prepare the market environment for virtual worlds and artificial intelligence. In this context, it ran a call for contributions until 11 March 2024, asking regulators, academia, industry and consumer protection bodies for their assessments of possible trends. The Digital Opportunities Foundation submitted a contribution focusing in particular on the rights and needs of children, who, under Article 5 of the European Unfair Commercial Practices Directive, can be defined as a special group of consumers with regard to their age.

    Using Livingstone and Stoilova's 4C model, the contribution outlines foreseeable and possible opportunities and risks in the categories of content, contact, conduct and contract. Given the immersion expected in virtual worlds and the associated pull effect, we point out that children's rights must be considered from the very beginning when shaping and regulating an emerging metaverse, so that the protection, provision and participation of young people can be realised. Virtual worlds must be structured in such a way that all children can participate equally, without discrimination of any kind, and that the child's well-being and best interests are taken into account. In this context, we recommend that the concept of children's personal integrity, newly established as a protection objective in the German Youth Protection Act of 2021, be extended to all users of the metaverse and used as a benchmark for regulation.

    The statement dated 11 March 2024 can be viewed here.


  • Published 19.06.24

    Regulation could strengthen trust in technology

    Torsten Krause, SDC

    After three days of intensive discussion and exchange, this year's edition of the European Dialogue on Internet Governance (EuroDIG) concluded in Vilnius, Lithuania, on 19 June. At the closing event, the messages of the conference were discussed with regard to political projects within the European Union, the digitalisation of public administration and the use of citizens' data, artificial intelligence, and the ongoing discussions on the Global Digital Compact. As soon as the messages are published, they will also be available here.

    Prior to this, on the last day of the conference, participants from politics, industry, civil society, science and research focussed on issues relating to artificial intelligence (AI). In their presentations, Tomas Lamanauskas, Deputy Secretary-General of the International Telecommunication Union (ITU), and Marija Pejčinović Burić, Secretary General of the Council of Europe, addressed current international developments in the regulation of AI. Lamanauskas outlined the cornerstones on which the global community needs to reach an understanding: an agreed framework for artificial intelligence that meets human rights requirements, ensures interoperability through international technical standards, and helps to reduce the digital divide. Around 2.6 billion people are still not part of the digital society for various reasons, he reminded the audience.

    Marija Pejčinović Burić then presented the Council of Europe's Framework Convention on Artificial Intelligence and Human Rights, Democracy and the Rule of Law. The treaty, the first legally binding international agreement in this area, opens for signature on 5 September this year. The Council of Europe thus pursues the goal of safeguarding human dignity and individual autonomy in the age of AI, counteracting discrimination and protecting privacy and data. The Secretary General emphasised that artificial intelligence has the power to change societies and that the Council of Europe is committed to harnessing this power to further advance the realisation of human rights. In the subsequent discussion, it was pointed out that trust in technologies is higher in regions of the world where societies perceive regulation as efficient and safe. To achieve this for artificial intelligence as well, it seems necessary to include as many perspectives and areas of expertise as possible in the regulatory process, in order to find solutions that do justice to the diverse uses and areas of application of AI.

    The youth representatives at the conference showed no fear of such developments. Technology is not bad per se; rather, how it is used is always in the hands of the user. Earlier technologies were also sometimes viewed with concern, and with good regulation, safety can be ensured in the future too.


