Child’s play: one year on, who is listening to the ICO Children’s Code?

The ICO Children’s Code was introduced in September 2020 by the UK data regulator. One year on from the end of the 12-month implementation period, we ask what impact the Code has had, and whether it goes far enough to protect children’s privacy online.

Why a Children’s Code?

One critical concern is that the internet was not designed with children in mind, yet UNICEF estimates that one in three internet users is a child. There is widespread worry about how children’s privacy is protected online, and about the harmful effects on children’s mental health and wellbeing, particularly through the use of social media. The Code principally tackles the first issue, while online harms are beginning to be addressed by rule makers in parallel.

What is the Code?

The UK Information Commissioner’s Office, the ICO, was directed to produce the Code under the Data Protection Act 2018. The Code, more formally known as the Age Appropriate Design Code, is intended to support compliance with data protection laws; organisations that do not conform to it will find it difficult to show that they are acting within those laws.

It is a statutory code of practice intended to better protect children (defined as those under the age of 18) when they use “information society services” (ISSs), which include websites, social media platforms, online messaging, content streaming, online games, and apps.

The Code focuses on the concept of privacy by design, which means giving children a high-privacy, age-appropriate service by default. It outlines 15 standards that online services should follow, which include having children’s best interests at the forefront of design, being transparent about privacy information, not using children’s data in a detrimental way, applying high-privacy default settings, and collecting data about children only where necessary.

The positive impact 

Within months of the Children’s Code being introduced, big online platforms started to implement measures to make their platforms more suitable for children. For instance, Instagram disabled targeted adverts for under-18s and turned off location tracking. There were other pressures at play as well, including action by various European supervisory authorities, but commentators agree that the Code has influenced some large platforms.

Even though the Code applies only to ISSs that are likely to be accessed by children and that process personal data in the UK, Instagram made its changes globally, which has helped to protect children worldwide. Such changes demonstrate the significant effect UK data protection laws, and compliance with them, have had across the globe.

International influence

The Children’s Code has also been praised as an “inspirational” example of how we can better protect children’s privacy online. This has led other countries to use the Code as a basis for their own rules and regulations on children’s privacy.

On 15 September 2022, California signed into law the California Age-Appropriate Design Code Act. Like the ICO’s Children’s Code, the act requires online platforms to put children first by considering their best interests and applying a high threshold for privacy and safety settings. The act is due to come into force by July 2024. Similarly, shortly after news of the California act, a state senator in New York introduced a children’s privacy bill containing similar measures.

The Irish Data Protection Commission published its final guidance, “Children Front and Centre: Fundamentals for a Child-Oriented Approach to Data Processing”, in December 2021, introducing child-specific data protection principles. The guidance bears many similarities to the Children’s Code, including 14 core fundamentals which organisations should follow. The French data protection authority, the CNIL, has also produced recommendations for protecting minors online.

The state of play

Some commentators hope that, by following the example of the ICO’s Children’s Code, nations will effectively create a coherent set of global rules. This would, in turn, make compliance easier for organisations that operate in multiple countries and close compliance loopholes.

That said, you only have to spend a short time on websites aimed at children to see that the Code is far from being implemented in full. Nudge techniques are still prevalent, and cookie banners, privacy notices and information about settings are often not worded in ways which children are likely to understand. Some websites still ask for children’s (real) first names to be entered to personalise content.

Difficulties with age-verification measures

One of the 15 standards introduced by the Children’s Code is age-appropriate application. This means online services should tailor safeguards and protections to the differing age ranges of children, rather than taking a broad-brush approach. The Code sets out various methods of establishing age but warns that organisations should consider the risks of processing children’s data and the extent to which they can rely on ages provided by children themselves. These competing requirements can be difficult to navigate and have left some businesses unsure of how to implement age-verification measures.

It has been argued that such measures may themselves adversely impact privacy, as submitting additional information could increase the risk of online harm to children: a recognised child user may attract child predators. Equally, some children may provide false information, claiming to be older than they are in order to gain access to certain online content.

For online services that rely on consent, the Code reflects the requirement that children under the age of 13 need parental authorisation. Verifying age and obtaining parental consent are costly for organisations to implement and manage. Added to the cost of verifying age is the cost of providing age-appropriate content, not to mention the lost revenue from having to place content behind age-gates and the risk of losing visitors who cannot or will not verify their age. As a result, some organisations will find it expensive to comply fully with the Code.

What about harmful online content?

Although establishing a young user’s age can help to manage the risks posed to children in an online environment, it is only as effective as the rest of the measures implemented as part of a privacy-by-design approach.

The Children’s Code focuses heavily on the practical measures online service providers should implement to better protect children online. However, the Code does not address harmful online content itself, which is the subject of other measures in the pipeline, such as the Online Safety Bill in the UK.

What now?

The ICO ran a consultation, which closed on 18 November 2022, to evaluate the impact of the Children’s Code.

Opinion is divided on the impact of the Code, and it will be interesting to see which view prevails. While some say the Code is “inspirational”, others argue that it does not go far enough to protect children online, and others still suggest it is simply too onerous for organisations.

There is a clear global recognition of the need for stricter protection of children’s privacy, and the effect of incoming legislation is beginning to be felt across the board.

Disclaimer

This information is for educational purposes only and does not constitute legal advice. It is recommended that specific professional advice is sought before acting on any of the information given. © Shoosmiths LLP 2024.