FOCUS
Published 18.07.24
Eeny, meeny, miny, moe, how old are you?
Jutta Croll, SDC

Imagine you have three children aged between 5 and 12. You are worried about which kindergarten and school they will attend when the family moves to the countryside soon. Where and how will Paula, Emma and Laurin be looked after, in which environment will they feel comfortable, and how will they get there safely? Which school career will they pursue, and how can the best conditions be created for it? The mother would like to seek external advice and support in making these difficult decisions: anonymously, without revealing her children's names, gender or age, and under no circumstances by showing identification documents. It's good that there is an awareness of data protection and privacy. This doesn't make the counselling task any easier, but the counselling service is resourceful and knows its way around the internet. How old is the mother? What can be deduced about the age of the children? There are no photos of the children to be found, but didn't the mother recently look for decoration ideas for a 10th birthday party on xyz.com, offer a girl's satchel for second graders in a swap shop, and post that she wants to move out of town? It's easy to put together a family profile from this.
This is exactly how platform operators currently proceed if they want to comply with Art. 28 (2) of the Digital Services Act and not display advertising to minors. They evaluate profile data and posted content and analyse user behaviour to find out how old their customers are and who is under the age threshold set by the terms of use. In doing so, they intrude deeply into the privacy of their users and, albeit with good intentions, potentially violate other personal rights.
At a multi-stakeholder dialogue on age assurance on 11 July 2024 in Brussels, around 50 experts discussed how things can be done differently and where the use of age verification systems can be useful in order to provide users with an age-appropriate environment that guarantees freedom and protection in equal measure. The Centre for Information Policy Leadership (CIPL) and the WeProtect Global Alliance organised this dialogue format, which had already taken place once in London in March. A number of very large online platforms (VLOPs), business associations, regulatory authorities and government representatives at national and European level, as well as representatives from civil society and academia, took part.
There was widespread agreement among the participants that age verification should not be carried out generally but should be prioritised for high-risk services. Not all platforms are equally risky, and the understanding of the potential risk depends on the respective perspective, for example on data protection or the protection of minors. There was also no question that a standardised legal framework is desirable but would be difficult to achieve at both European and international level.
Various approaches to age assessment procedures were discussed. There was consensus that the current situation is inadequate: self-declaration of age can only lead to reasonably reliable results with accompanying measures. Device-based procedures are less intrusive in terms of privacy, but they require setting up a user profile that is not verified and therefore carry the risk of an incorrect age setting when one device is used by different family members or shared with others. If the profile is not updated to the respective user's age, children could gain access to age-inappropriate adult content, and devices with a child profile could also be used by adults for unauthorised contact, e.g. cybergrooming.
Application-based methods must meet strict requirements in terms of data minimisation, privacy protection and user anonymity, which can be achieved through so-called double blindness. This refers to procedures in which the platform requesting the age information and the organisation verifying it learn nothing about each other: the verifier does not know which service is asking for proof of age, and the platform receives only the result, not the user's identity data.
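The idea behind such a double-blind attestation can be sketched in a few lines of code. This is a minimal illustration, not a real protocol: the function names, the token format and the use of a symmetric HMAC key are all assumptions made for the sake of a self-contained example (a deployed scheme would use asymmetric signatures and a standardised token format, so that the platform could verify the attestation without being able to forge it). The point the sketch makes is that the attestation carries only a one-time random nonce and an over/under-threshold flag, and no user identifier at all.

```python
import base64
import hashlib
import hmac
import json
import secrets

# Hypothetical age-verification service ("double-blind" sketch).
# The attestation binds a signature to a one-time nonce plus a single
# boolean claim, so the platform learns only "over the threshold: yes/no"
# and the token itself reveals nothing about the user or the verifier's
# knowledge of which service it is shown to.
SIGNING_KEY = secrets.token_bytes(32)  # held by the verifying organisation

def issue_attestation(user_is_over_16: bool) -> str:
    """Verifier side: attest an age claim without any user identifier."""
    claim = {"nonce": secrets.token_hex(16), "over_16": user_is_over_16}
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return base64.b64encode(payload).decode() + "." + sig

def check_attestation(token: str) -> bool:
    """Platform side: verify the signature and read only the age flag."""
    encoded, sig = token.split(".")
    payload = base64.b64decode(encoded)
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid attestation")
    return json.loads(payload)["over_16"]

token = issue_attestation(True)
print(check_attestation(token))  # the platform learns the flag, nothing else
```

Because each token carries a fresh random nonce rather than an account name, date of birth or document number, the platform cannot link the attestation back to a person, which is the data-minimisation property the discussion in Brussels was aiming at.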
The discussion revealed a high level of willingness among all those involved in the various processes to assume responsibility. This was evident not only in the well-founded, high-level discussion of different technical approaches, but also in the commitment to jointly creating age-appropriate digital spaces by means of valid age verification. In four digital working groups, the participants will address questions of regulation and risk assessment in the coming weeks, ahead of a further on-site meeting planned for the autumn.
To summarise: the train towards age verification has been put on the rails and has started rolling. Anyone who takes part in this open process also has the power to set the course and the signals towards a direction that creates age-appropriate digital freedom. If this is implemented well, privacy, anonymity and data protection will be safeguarded for all users and, at the same time, children's rights to protection in digital environments will be harmonised with their rights to freedom.