
FOCUS


Published 20.10.2023

How do platform designs influence young people's media behavior?

Torsten Krause, SDC

Summary of the study "Dark Patterns and Digital Nudging in Social Media - How Do Platforms Hinder Self-Determined Media Action?"

On behalf of the Bavarian Regulatory Authority for Commercial Broadcasting (Bayerische Landeszentrale für neue Medien), Kammerl et al. investigated how manipulative mechanisms (digital nudging and dark patterns) operate through the design features, structures and interfaces of the platforms, whether young people recognize these mechanisms, and what dangers can arise from the excessive media use they encourage. The authors based their research on a literature review, an assessment of exemplary social media services, and interviews with young users of social media. In total, around 70 newly researched scientific articles were considered, five services (WhatsApp, TikTok, Instagram, Snapchat and YouTube) were examined, and eleven young people were interviewed.

In evaluating the literature review, the authors focused on excessive and addiction-like use of social media. Of particular interest was whether increasingly excessive use should be regarded as self-determined action, or whether a dependence on social media could be observed, which would have to be understood as an inability on the part of young users to regulate their social media use in a way that avoids negative personal consequences. Overall, it became clear that media-centered assumptions, which posit a strong negative influence of social media use on psychological well-being, are not supported by the available empirical findings. It was also found that the findings on children and adolescents are currently even more limited, although there are reports of more pronounced negative effects than among adults. The findings suggest that characteristics of social media, of the individual, and of the social environment are each relevant for explaining problematic use of social media (PUSM, see Moretta et al. 2022).

Kammerl et al. define digital nudging as the deliberate design of user interface elements aimed at influencing the behavior of users. Such designs can therefore be considered manipulative and in the (business) interest of the services. They are used to steer users of social media away from their original intentions regarding time spent, money spent, and content consumed. These methods can thus be classified as ethically questionable overall, especially when they affect the freedom of choice of children and young people.

The term dark patterns describes design patterns that exploit weaknesses in users' ability to reflect and process information. Dark patterns are at play in particular when misinformation is used, information is withheld, or decision options are partly or entirely concealed. It is assumed that the interface was intentionally designed in such a way that using the service also leads to consequences that do not correspond to the intentions of its users, even if it cannot necessarily be assumed that the services intend to cause harm.

Overall, research on the influence of digital nudging and dark patterns on excessive use is still sparse. It is relatively clear that many users are not aware that these manipulative techniques are being used (dark pattern blindness, see Di Geronimo et al. 2020). But even when users do recognize them, they can hardly assess their effectiveness and tend to accept them as a given condition of use. The studies identified do not yet provide clear evidence that digital nudging and dark patterns can be used to deliberately elicit excessive use.

Instagram, TikTok, Snapchat, YouTube, and WhatsApp were selected for an analysis of high-reach and popular apps. The analysis revealed that these services cannot be used neutrally; in many cases, dark patterns and digital nudges are built into usage practices and attempt to steer them. Kammerl et al. first show this from a process perspective across the various phases of use. The corresponding mechanisms already take effect when a smartphone with an integrated operating system is purchased, and thus before any apps are actively installed. Different manipulative patterns become relevant from the initial default settings, through first and longer-term use, to logging out or deleting the account.

The analysis showed that many dark patterns are present in all of the apps examined. However, there were also differences: dark patterns were more pronounced in some apps than in others.

The interviews revealed that all of the young people are aware of the problem of excessive use. However, they often attribute its causes to individual responsibility. They primarily see a lack of self-control, unlimited access to interesting content, and the associated loss of their sense of time as the decisive factors behind excessive social media use. Responsibility is also attributed to how their peers use the services and to the content on the platforms. The manipulation mechanisms themselves are viewed ambivalently: the perception of the services and their content as entertaining and popular pastimes, and the high level of convenience the apps offer, stand in tension with the young people's wish not to use these offerings excessively. Their strategies for counteracting the platform operators' nudging mechanisms include self-control and technical measures, but also controls imposed by parents. The young people do not consider themselves adequately protected or supported by the platform operators in appropriately limiting their usage time. This is reflected in the importance they attach to (self-)regulating measures. Restricting the duration of use or uninstalling apps is particularly common, whereas the control features and strategies provided within the apps themselves are rarely used. A special role and legitimacy in regulating social media use is also attributed to parents, who can limit their children's usage times, for example through parental control apps and settings.

In summary, the empirical evaluations clearly show that several different aspects appear to be relevant for the emergence of problematic use of social media, so multicausal explanations are very likely. In particular, certain aspects of media education (e.g., less pronounced restrictive mediation) and more pronounced hyperactivity/inattention seem to contribute to the development of problematic use of social media. From these empirical findings it can be derived, as a recommendation for action, that clear rules and, if necessary, consistent restrictions on the use of social media could help prevent future problematic use. For certain predisposed adolescents (for example, those with attention deficit hyperactivity disorder), social media use should be accompanied somewhat more closely by parents and, if necessary, also regulated, especially if the self-regulation skills of this subgroup are less pronounced than those of other adolescents.

In addition, Kammerl et al. formulated the following recommendations for action:

  • Media providers should (self-)commit to responsibility and ethical design,
  • Media providers should receive orientation through ethics by design,
  • Technical possibilities for containing the potential risks of dark patterns and digital nudging should be used,
  • Voluntary self-regulatory bodies should take such patterns and designs into account in their age ratings and approvals,
  • Media supervision should be strengthened in order to (better) enforce child and youth media protection,
  • Milieu-sensitive promotion of media literacy for children and young people, parental work, and further training for pedagogical specialists should be expanded, and
  • Further research is needed.

The study "Dark Patterns and Digital Nudging in Social Media - How Do Platforms Impede Self-Determined Media Action?" was produced as an expert opinion for the Bavarian Regulatory Authority for Commercial Broadcasting (Bayerische Landeszentrale für neue Medien) by Prof. Dr. Rudolf Kammerl, J.-Prof. Dr. Michaela Kramer, Dr. Jane Müller, Katrin Potzel, Moritz Tischer and Prof. Dr. Lutz Wartberg and published as part of the BLM series of publications, volume 110 at NOMOS and can be downloaded here.

References:

Di Geronimo, L., Braz, L., Fregnan, E., Palomba, F., & Bacchelli, A. (2020). UI dark patterns and where to find them: A study on mobile applications and user perception. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. https://sback.it/publications/chi2020.pdf (last accessed 04.12.2022).

Moretta, T., Buodo, G., Demetrovics, Z., & Potenza, M. (2022). Tracing 20 years of research on problematic use of the internet and social media: Theoretical models, assessment tools, and an agenda for future work. Comprehensive Psychiatry, 112, 152286.