How can we make the Internet safe for children in practice?

Does regulating the Internet to protect children mean checking the age of all users? Sonia Livingstone discusses the importance of taking a child rights-based approach and highlights the policy and political challenge of knowing who is a child online when trying to protect children.

One in three children is an Internet user, and one in three Internet users is a child. Yet tech companies say they can’t determine who a child is online. This is a practical but also a political challenge: do we want to mandate companies to assess the age of all their users? Or would that pose too great a risk to the privacy and freedom of expression of adults and children alike?

Why kids need protection online

Evidence of the problems abounds in a plethora of studies that make media headlines and are cited in government inquiries and third sector advocacy. While some sources are disputed and their implications too often overstated, the case for a robust, evidence-based approach to protecting children online is widely accepted, hence the slew of new laws and policies being introduced in the UK, the US and at the international level.

The EU Kids Online research network conducted a pan-European survey of children’s online access, skills, opportunities, risks and the mediation of their safety. It found, for example, that only a quarter of 9- to 16-year-olds feel safe online at all times, and 10% never feel safe. Depending on the country, up to half of all children said something had upset them in the past year, double the figure in the previous survey. What troubles them? There is evidence that children encounter all four Cs of online risk: content, contact, conduct and contract. The most common risk is hate; the largest increase is in exposure to self-harm content. Crucially, the risks of online engagement also fall unevenly: the European project ySKILLS is finding that the most vulnerable adolescents are most at risk of harm online, including those who are discriminated against or in poorer health.

A child rights-based approach

An authoritative statement of the United Nations Committee on the Rights of the Child is its General Comment 25, which sets out how the United Nations Convention on the Rights of the Child applies in relation to the digital environment. A child rights framework requires a holistic approach that balances the rights to protection, care and participation, attending to children’s age and maturity (their evolving capacities) and to their best interests (a complex judgment that puts children’s rights before profit, requires consulting children and demands that decisions be made transparent).

While General Comment 25 is addressed to states, the technology sector also has clear responsibilities for protecting children online. Therefore, alongside new legislation, design approaches including safety by design are increasingly required of companies whose digital products and services impact children in one way or another. And there’s a lot they could do: research from EU Kids Online shows that children either don’t trust platforms or can’t work out how to get help: after a negative experience, only 14% changed their privacy settings and only 12% reported the problem online. Meanwhile, fewer than a third of parents use parental controls, because they don’t know how these work or whether they are effective, and because they fear adverse effects on their children’s privacy, autonomy and online opportunities.

But society cannot aim to protect children at the cost of limiting their civil rights and freedoms, nor through policies that, however inadvertently, incentivize companies to exclude children from beneficial digital services. Getting the policy framework right means taking a child rights-based approach, to which the UK and Europe have long been committed but which has not been sufficiently implemented. As we wait, children, once intrepid explorers of the digital world, are becoming overly cautious, worrying about risks and, as evidence shows, missing out on many online opportunities.


Sonia Livingstone discusses children’s rights in a digital world (video available on LSE Player).

Identifying who is a child

But if companies don’t know which users are children, how can they be mandated to provide age-appropriate provision and protection? In the European Commission-funded euCONSENT project, my role was to explore the implications of age assurance and age verification for children’s rights. The Information Commissioner’s Office is actively exploring these issues in the UK.

One option is to expect tech companies to design their services as broadly accessible, child-friendly, civil spaces, like a public park, with exceptions where users must prove they are adults (as when buying alcohol in a shop). Another is to expect tech companies to treat all users in age-appropriate ways (i.e. find out everyone’s age and provide personalized services accordingly, even if there is much to argue about what is age-appropriate and how to implement it, given that children vary enormously not only in age but also in many other ways). While there are challenges with both approaches, policy innovation is vital if we are to move beyond the status quo of treating children online as if they were adults.

To realize children’s rights in a digital age, should society require companies to re-engineer their services as needed, or to re-engineer the process of accessing those services? In both the UK’s Online Safety Bill and Europe’s Digital Services Act, success will depend on the effective and responsible conduct of risk assessments.

In the euCONSENT project, we argued that a child rights-based approach to age assurance must safeguard not only children’s right to protection from digital content and services that may harm them, but also their right to privacy and freedom of expression (including to explore their identity or seek confidential help without parental consent), their right to a timely and effective child-friendly remedy, and their right to non-discrimination. This means children need to be able to access digital services along with everyone else, even if they don’t have government ID, live in alternative care or have a disability, and whatever their skin colour. Currently, many age assurance schemes do not respect the full range of children’s rights.

Let’s be practical

Despite the significant ongoing discussion of the potential privacy, expression and inclusion costs of age assurance technologies, in practice users’ ages are already being assessed. Google says it has estimated the age of all users who have accessed its services, based on a variety of data collected over time, including who their friends are, the sites they visit and everything else it knows about them. But Google’s strategy here is not very transparent. Meanwhile, Instagram is one of a growing number of platforms adopting age estimation technology for all of its users, as is Roblox.

In the Digital Futures Commission, with the 5Rights Foundation, we have proposed a model of Child Rights by Design. It provides a toolkit for digital product designers and developers and was developed in collaboration with them and with children. It is based on the UNCRC and General Comment 25. It comprises 11 principles, of which age-appropriate service is one, privacy another and safety, of course, a third. The other eight are equally important for a holistic approach: equity and diversity; best interests; consultation with children; corporate responsibility; children’s participation; wellbeing; maximum development; and agency in a commercial world.

If big tech incorporated children’s rights by design, the job of policymakers, educators and parents would be greatly eased.


This post is based on a speech given by the author at the European Commission stakeholder event on the Digital Services Act and at the European Parliament’s Internal Market and Consumer Protection Committee (IMCO) public hearing on the online safety of minors.

All articles published on this blog provide the views of the author(s) and not the position of LSE British Politics and Policy, nor the London School of Economics and Political Science.

Image Credit: Shutterstock and Michael Jeffery via Unsplash


