FOCUS
-
Published 14.11.23
Don't push away your responsibility
Torsten Krause, SDC
After the "Don't push your thoughts away" campaign launched in November 2022 reached 21 million people online and 18 million via TV, the Federal Ministry for Family Affairs, Senior Citizens, Women and Youth (BMFSFJ) and the Independent Commissioner for Child Sexual Abuse Issues (UBSKM) presented its sequel in a Berlin cinema on 13 November. Under the title "Don't push away your responsibility", the focus is now on encouraging everyone to look and listen in order to be able to act in the event of sexual abuse against a child. At the event, Kerstin Claus (UBSKM) emphasised that not all affected children remain silent. They do send out signals, but unfortunately these are often not recognised or understood by adults. The campaign therefore aims to sensitise people to be attentive, to ask questions when in doubt and to take responsibility.
In addition to posters, adverts and events, the current campaign offers a wealth of information and guidance on how adults can react if they have the impression that a child is suffering sexual abuse. Thus informed, they can take action, protect children and fulfil their responsibility. Materials on the sexual abuse of children and how to take action against it are also available in English, French, Polish, Turkish and Ukrainian.
-
Published 09.11.23
Children's rights organisations call for stronger measures to protect children from sexual abuse
Torsten Krause, SDC
Negotiations on the European Commission's proposal to prevent and combat sexual abuse of children on the internet are continuing in both the European Council and the European Parliament. While final agreement on a compromise proposal has been postponed several times in the European Council, where various concepts for dealing with possible detection orders are still being discussed, the European Parliament's LIBE Committee was able to announce publicly by the end of October that all parliamentary groups had agreed on a common position. Nevertheless, the committee has yet to take a formal decision, which is scheduled for its meeting on 14 November 2023.
Based on the joint statement by the rapporteur and shadow rapporteurs in the LIBE Committee, detection orders for depictions of sexual abuse of children online should be permissible only if there are reasonable grounds to suspect individual users, or a specific group of users of unencrypted services (either as such or as subscribers to a specific channel of communication), of a link, even an indirect one, to online child sexual abuse material, and a court authorises the detection. Detection orders should not be able to cover the solicitation of children by adults for the purpose of sexual abuse (grooming). It will also no longer be possible to voluntarily monitor depictions of sexual abuse of children online: when the proposed regulation comes into force, the current interim derogation from the ePrivacy Directive will expire, and detection will only be allowed on the basis of the new regulation.
Against this backdrop, more than 80 children's rights organisations are concerned that the number of reported depictions could decrease significantly and that the prosecution and combating of sexual abuse of children on the internet would be severely affected. In an open letter, they argue that the new regulation should also allow service providers to voluntarily search for known and new depictions of abuse. A compromise proposal submitted in the meantime by the rapporteur, Javier Zarzalejos, envisaged that service providers themselves could apply to a court for a detection order. This would make sense, for example, if the mandatory risk assessment were to examine, for a limited period of time, whether and how much incriminated material is being distributed on a service in order to establish suitable measures to minimise the risk. With their letter, the children's rights organisations want to convince those involved in the legislative process to include the possibility of voluntary detection in the regulation.
In parallel, a new campaign called "Every image counts" was launched to raise awareness of the importance of combating sexual abuse of children on the internet. It uses current data to illustrate the extent of such offences against children and calls for action to protect them. Meanwhile, organisations and alliances that oppose the European Commission's proposal criticise the Commission for taking one-sided advice and not sufficiently acknowledging opposing positions.
-
Published 27.10.23
Children's Rights between Door and Hinge
Jutta Croll & Torsten Krause, SDC
From 23 to 26 October the ICANN community met in Hamburg for its Annual General Meeting, also celebrating the 25th anniversary of the organisation. ICANN is an acronym for Internet Corporation for Assigned Names and Numbers. The non-profit organisation has the objective of ensuring the secure, stable and consistent operation of the Internet and could therefore be called the steward of the Internet. All the necessary regulations and principles are developed jointly and consensually between the various stakeholders, such as governments, business, the technical community, civil society and users. In this way, the foundation of the Internet grows in a bottom-up manner with the aim of enabling access to it worldwide and for all people. In doing so, ICANN focuses on the structures and functionalities that make the Internet feasible and are essential for its preservation.
Looking back at the journey ICANN has travelled since 1998, one can see how its organisational structure and policy development procedures have evolved over time, and it also becomes somewhat understandable why ICANN refrains from taking responsibility for any content made available via the infrastructure it governs.
Not all abuse is abuse
The Domain Name System (DNS) is the backbone of ICANN's remit and work, and several committees, units, working groups and sub-groups, such as the Security and Stability Advisory Committee, are dealing with keeping the DNS intact. This was mirrored in the meeting's programme, where several sessions addressed so-called DNS abuse. Per ICANN's definition, DNS abuse comprises phishing, pharming, malware, botnets and spam. These areas are researched by the DNS Abuse Institute, the DNS Research Federation and the DAAR project (Domain Abuse Activity Reporting) and addressed by the Clean DNS Initiative. Although the dissemination of child sexual abuse material (CSAM) is not understood as DNS abuse, it is at least considered an issue to be addressed, despite ICANN's well-intended policy of not taking responsibility for content. The differentiation between maliciously registered and compromised domains could also be applied to an approach to combat CSAM on the Internet. In 2023, hotlines in Europe received a high percentage of reports referring to domains whose names are cryptic strings of letters and numbers, under which invitations are spread to join platforms for the exchange of CSAM. The Dutch registry for the country code top-level domain (ccTLD) .nl has implemented an AI-based strategy against DNS abuse that identifies "suspicious" domain registration applications and initiates further action to verify the registrant's data. Such an approach might also be used to address domains abused for the dissemination of CSAM.
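How such screening might work can be sketched in a few lines. The .nl registry's actual model is not public, so the following Python heuristic is only an illustration of the underlying idea, not its implementation: domain labels that look like random strings of letters and numbers, of the kind reported to hotlines, tend to have high character entropy and an unusually high share of digits. The function names, thresholds and example domains are all hypothetical.

```python
import math
from collections import Counter

def shannon_entropy(label: str) -> float:
    """Shannon entropy of the characters in a domain label (bits per character)."""
    counts = Counter(label)
    total = len(label)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def looks_suspicious(domain: str,
                     entropy_threshold: float = 3.5,
                     digit_ratio_threshold: float = 0.3) -> bool:
    """Flag domain labels that resemble machine-generated character strings.

    A heuristic, not a verdict: legitimate domains can score high too, so a
    flag should trigger verification of the registrant's data, not blocking.
    """
    label = domain.lower().split(".")[0]   # look at the label, not the TLD
    if len(label) < 8:
        return False                       # short labels give unreliable scores
    digit_ratio = sum(ch.isdigit() for ch in label) / len(label)
    return (shannon_entropy(label) > entropy_threshold
            or digit_ratio > digit_ratio_threshold)

print(looks_suspicious("x7kq9v2rz41.nl"))      # True  -> route to manual review
print(looks_suspicious("stadsbibliotheek.nl"))  # False -> ordinary registration
```

A production system would of course combine many more signals (registrant history, bulk registration patterns, payment data) and a trained model rather than fixed thresholds.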
Is blocking at DNS level too sharp a sword?
Blocking is the remedy of choice for domains rightly identified as abused in the sense of ICANN's definition. But as long as CSAM is mainly spread via file-sharing servers, it is obvious that such a server should not be shut down when 99.9 per cent of the content hosted there is legal. For the 0.1 per cent of illegal CSAM content, it is equally true that blocking alone would not solve the problem. Perpetrators do not all act out of a paedophile inclination; many of those who produce and spread CSAM have a commercial interest that is not very different from that of those who distribute malware. Although children's rights are not in ICANN's focus and are only discussed between door and hinge, this swamp must be drained, and ICANN too should play its part in that.
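Why DNS-level blocking is such a blunt instrument becomes clear when one considers what a resolver actually sees. The following minimal Python sketch (an illustration under simplified assumptions, not any resolver's real implementation; the blocklist entries and addresses are made up) simulates blocklist filtering: the decision is taken on the domain name alone, since the resolver never sees which file or page is requested, so blocking removes the whole domain, legal content included.

```python
# Minimal simulation of resolver-side blocklist filtering (illustrative only).
BLOCKLIST = {"filehost.example"}   # hypothetical server hosting 0.1% illegal content

def resolve(name: str) -> str:
    """Answer a DNS query, returning NXDOMAIN for blocklisted domains.

    The resolver only ever sees the domain name, never the URL path, so
    blocking is all-or-nothing at domain granularity.
    """
    domain = name.lower().rstrip(".")
    if domain in BLOCKLIST or any(domain.endswith("." + b) for b in BLOCKLIST):
        return "NXDOMAIN"          # the whole domain disappears, subdomains included
    return "192.0.2.10"            # placeholder address for permitted domains

print(resolve("filehost.example"))      # NXDOMAIN -- the 99.9% legal content is gone too
print(resolve("cdn.filehost.example"))  # NXDOMAIN -- subdomains are swept up as well
print(resolve("library.example"))       # resolves normally
```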
-
Published 20.10.23
How do platform designs influence young people's media behavior?
Torsten Krause, SDC
Summary of the study "Dark Patterns and Digital Nudging in Social Media - How Do Platforms Hinder Self-Determined Media Action?"
For the Bavarian Regulatory Authority for Commercial Broadcasting (Bayerische Landeszentrale für neue Medien), Kammerl et al. investigated how manipulative mechanisms (digital nudging and dark patterns) work through the design features, structures and interfaces of the platforms, whether young people recognize these mechanisms, and what dangers can arise from the excessive media use these mechanisms may cause. The authors based their research on a literature review, an assessment of exemplary social media services, and interviews with young users of social media. A total of around 70 scientific articles were taken into account, five services (WhatsApp, TikTok, Instagram, Snapchat and YouTube) were examined, and eleven young people were interviewed.
For the evaluation of the literature review, the authors focused on excessive and addiction-like use of social media. Of particular interest was whether increasingly excessive use should be regarded as self-determined action or whether a dependence on social media could be observed, which would have to be understood as an inability on the part of the young users to regulate their social media use in such a way that it does not lead to negative personal consequences. In general, it became clear that media-centered assumptions, which posit a strong negative influence of social media use on psychological well-being, are not supported by the available empirical findings. It was also found that the findings on children and adolescents are currently even more limited, but that there are reports of more pronounced negative aspects than among adults. The findings suggest that characteristics of the social media, the individual and the social environment are each relevant for explaining problematic use of social media (PUSM, see Moretta et al. 2022).
Kammerl et al. define digital nudging as the deliberate design of user interface elements aimed at influencing the behavior of users. Such designs can thus be considered manipulative and in the (business) interest of the services: they are used to steer users of social media away from their original intentions regarding the time and money they spend and the content they consume. These methods can therefore be classified as ethically questionable overall, especially when they affect children and young people in their freedom of choice.
The term dark patterns describes design patterns that exploit weaknesses in the ability of users to reflect and process information. We speak of dark patterns in particular when misinformation is used, information is withheld, or decision options are more or less concealed. It is assumed that the interface was intentionally designed in such a way that using the service also leads to consequences that do not correspond to the intention of the service users, even if it cannot necessarily be assumed that the services intend to cause harm.
Overall, research on the influence of digital nudging and dark patterns on excessive use is poorly established. It is relatively clear that many users are not aware of the use of these manipulative techniques (dark pattern blindness, see Di Geronimo et al. 2020). But even when they are recognized, their effectiveness is hard for users to assess, and they are accepted as a given framework condition. The studies found do not yet provide clear evidence that digital nudging and dark patterns can be used to purposefully elicit excessive use.
Instagram, TikTok, Snapchat, YouTube, and WhatsApp were selected for the analysis as high-reach and popular apps. It revealed that these services cannot be used neutrally; rather, in many cases dark patterns and digital nudges are woven into usage practices and attempt to steer them. Kammerl et al. first show this from a process perspective for the various usage phases: the corresponding mechanisms already take effect when a smartphone with its integrated operating system is purchased, and thus before any apps are actively installed. Different manipulative patterns become relevant from the initial default settings, through first and longer-term use, to logging out or deleting the account.
The analysis showed that many dark patterns can be found in all of the examined apps, although with differences: in some apps, dark patterns were more developed than in others.
The interviews revealed that all of the young people are aware of the problem of excessive use. However, they often attribute its causes to individual responsibility: they primarily see a lack of self-control, unlimited access to interesting content and the associated loss of their own sense of time as the decisive factors behind spending too much time on social media. Responsibility is also attributed to the usage habits of their peers or to the content on the platforms. The manipulation mechanisms are viewed ambivalently: the perception of the services and their content as entertaining and popular pastimes, and the high level of comfort the apps offer, are at odds with the young people's feeling that they do not want to use these offerings excessively. Their strategies to counteract the platform operators' nudging mechanisms include self-control and technical methods, but also controlling measures carried out by parents. The young people do not consider themselves adequately protected or supported by the platform operators in appropriately controlling their usage time, which is reflected in the importance they attach to (self-)regulating measures. Restricting the duration of use or uninstalling apps are used particularly frequently, whereas strategies provided within the apps themselves are rarely applied. A special role and legitimacy in regulating social media use is also attributed to parents, who can limit their children's usage times, for example through parental control apps and settings.
In summary, the empirical evaluations clearly show that different aspects seem to be relevant for the emergence of problematic use of social media, so multicausal explanations are very likely. In particular, certain aspects of media education (e.g., less pronounced restrictive mediation) and more pronounced hyperactivity/inattention seem to contribute to the development of problematic use. As a recommendation for action, it can be derived from these findings that clear rules and, if necessary, consistent restrictions on the use of social media could help prevent future problematic use. For certain predisposed adolescents (for example, those with attention deficit hyperactivity disorder), the use of social media should be accompanied somewhat more closely by parents and, if necessary, also regulated, especially if the self-regulation skills of this subgroup are less pronounced than those of other adolescents.
In addition, Kammerl et al. formulated the following recommendations for action:
- Media providers should (self-)commit to responsibility and ethical design,
- media providers should be given orientation through ethics by design,
- technical possibilities for containing the potential risks of dark patterns and digital nudging should be used,
- voluntary self-regulatory bodies should take such patterns and designs into account in their age ratings and approvals,
- media supervision should be strengthened in order to (better) enforce child and youth media protection,
- milieu-sensitive media literacy promotion for children and young people should be expanded, along with parental work and further training for pedagogical specialists, and
- further research is needed.
The study "Dark Patterns and Digital Nudging in Social Media - How Do Platforms Impede Self-Determined Media Action?" was produced as an expert opinion for the Bavarian Regulatory Authority for Commercial Broadcasting (Bayerische Landeszentrale für neue Medien) by Prof. Dr. Rudolf Kammerl, J.-Prof. Dr. Michaela Kramer, Dr. Jane Müller, Katrin Potzel, Moritz Tischer and Prof. Dr. Lutz Wartberg and published as part of the BLM series of publications, volume 110 at NOMOS and can be downloaded here.
References:
Di Geronimo, L., Braz, L., Fregnan, E., Palomba, F., & Bacchelli, A. (2020). UI dark patterns and where to find them: A study on mobile applications and user perception. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. https://sback.it/publications/chi2020.pdf (last accessed 4 December 2022).
Moretta, T., Buodo, G., Demetrovics, Z., & Potenza, M. (2022). Tracing 20 years of research on problematic use of the internet and social media: Theoretical models, assessment tools, and an agenda for future work. Comprehensive Psychiatry, 112, 152286.
-
Published 17.10.23
Majorities for child protection
Torsten Krause, SDC
On 13 October, ECPAT and the NSPCC published the results of a new survey on protecting children from sexual abuse online. Between 30 August and 29 September, more than 25,000 adults in 15 member states of the European Union and the United Kingdom were surveyed on the topic. According to the survey, 95 per cent of respondents believe it is important to have legal regulations on the prevention and protection of children from sexual violence online. Almost as many respondents (91 per cent) agreed that internet service providers should be obliged to design their services in such a way that they are safe for children and prevent sexual abuse against them. Fewer, but still a large majority of respondents (81 per cent), agreed with the statement that providers of online services should be required to search for such depictions in their services as well as to report and remove them.
ECPAT and the NSPCC published the results of the survey at a time when the European Council and the European Parliament are discussing the European Commission's regulatory proposal to combat online sexual violence against children and are preparing their respective positions on the draft. At the centre of the discussion is whether and to what extent the detection of grooming or of depictions of sexual abuse of children should also be made possible in encrypted services. In the survey, 72 per cent of participants expressed their willingness to accept a compromise on their privacy for the benefit of child protection in order to make it possible to search for these unlawful activities.