

Published 19.11.21

General comment No. 25 - Chapter V: General measures of implementation by States parties (J-K) and Chapter VI: Civil rights and freedoms (A)

  1. Children may face particular difficulties in obtaining remedy when their rights have been abused in the digital environment by business enterprises, in particular in the context of their global operations. States parties should consider measures to respect, protect and fulfil children’s rights in the context of businesses’ extraterritorial activities and operations, provided that there is a reasonable link between the State and the conduct concerned. They should ensure that businesses provide effective complaint mechanisms; such mechanisms should not, however, prevent children from gaining access to State-based remedies. They should also ensure that agencies with oversight powers relevant to children’s rights, such as those relating to health and safety, data protection and consumer rights, education and advertising and marketing, investigate complaints and provide adequate remedies for violations or abuses of children’s rights in the digital environment.

  2. States parties should provide children with child-sensitive and age-appropriate information in child-friendly language on their rights and on the reporting and complaint mechanisms, services and remedies available to them in cases where their rights in relation to the digital environment are violated or abused. Such information should also be provided to parents, caregivers and professionals working with and for children.

  VI. Civil rights and freedoms

    A. Access to information

  3. The digital environment provides a unique opportunity for children to realize the right of access to information. In that regard, information and communications media, including digital and online content, perform an important function. States parties should ensure that children have access to information in the digital environment and that the exercise of that right is restricted only when it is provided by law and is necessary for the purposes stipulated in article 13 of the Convention.

  4. States parties should provide and support the creation of age-appropriate and empowering digital content for children in accordance with children’s evolving capacities and ensure that children have access to a wide diversity of information, including information held by public bodies, about culture, sports, the arts, health, civil and political affairs and children’s rights.

  5. States parties should encourage the production and dissemination of such content using multiple formats and from a plurality of national and international sources, including news media, broadcasters, museums, libraries and educational, scientific and cultural organizations. They should particularly endeavour to enhance the provision of diverse, accessible and beneficial content for children with disabilities and children belonging to ethnic, linguistic, indigenous and other minority groups. The ability to access relevant information, in the languages that children understand, can have a significant positive impact on equality.

  6. States parties should ensure that all children are informed about, and can easily find, diverse and good quality information online, including content independent of commercial or political interests. They should ensure that automated search and information filtering, including recommendation systems, do not prioritize paid content with a commercial or political motivation over children’s choices or at the cost of children’s right to information.

  7. The digital environment can include gender-stereotyped, discriminatory, racist, violent, pornographic and exploitative information, as well as false narratives, misinformation and disinformation and information encouraging children to engage in unlawful or harmful activities. Such information may come from multiple sources, including other users, commercial content creators, sexual offenders or armed groups designated as terrorist or violent extremist. States parties should protect children from harmful and untrustworthy content and ensure that relevant businesses and other providers of digital content develop and implement guidelines to enable children to safely access diverse content, recognizing children’s rights to information and freedom of expression, while protecting them from such harmful material in accordance with their rights and evolving capacities. Any restrictions on the operation of any Internet-based, electronic or other information dissemination systems should be in line with article 13 of the Convention. States parties should not intentionally obstruct or enable other actors to obstruct the supply of electricity, cellular networks or Internet connectivity in any geographical area, whether in part or as a whole, which can have the effect of hindering a child’s access to information and communication.

  8. States parties should encourage providers of digital services used by children to apply concise and intelligible content labelling, for example on the age-appropriateness or trustworthiness of content. They should also encourage the provision of accessible guidance, training, educational materials and reporting mechanisms for children, parents and caregivers, educators and relevant professional groups. Age-based or content-based systems designed to protect children from age-inappropriate content should be consistent with the principle of data minimization.

  9. States parties should ensure that digital service providers comply with relevant guidelines, standards and codes and enforce lawful, necessary and proportionate content moderation rules. Content controls, school filtering systems and other safety-oriented technologies should not be used to restrict children’s access to information in the digital environment; they should be used only to prevent the flow of harmful material to children. Content moderation and content controls should be balanced with the right to protection against violations of children’s other rights, notably their rights to freedom of expression and privacy.

  10. Professional codes of conduct set by news media and other relevant organizations should include guidance on how to report digital risks and opportunities relating to children. Such guidance should result in evidence-based reporting that does not reveal the identity of children who are victims and survivors and that is in accordance with international human rights standards.
