
Age appropriate design: UK delivers new code on designing online spaces

The UK's data regulator wants to make online spaces safer for children.

28 September 2020

Organisations within scope of the UK data regulator's new code will need to consider what the new proportionality-driven, age-appropriate design guidelines mean for their platforms.

Online Evolution

A child's online experience just before the millennium was very different from today's.

Online spaces were not driven by a 'like' economy, and entry came at a premium: you needed to pay to play, and there was no all-you-can-eat data. A 1990s child's biggest concern may have been desperately asking for a modem upgrade so they could download a game more quickly.

In 2017, 99% of 12-15 year olds were online, as were 53% of 3-4 year olds. In the 'Internet 2.0' era, the 'datafication' of identity – the reduction of individuals to data points – weaves the fabric of online spaces. This datafication can create better services for children (online teaching in the Covid era is just one relevant example), but it also creates new forms of potential harm.

The code came into force on 2 September 2020.  Organisations within scope have until 2 September 2021 to implement the necessary changes to their services.

Flexible standards

The code provides 15 flexible standards designed to protect the best interests of children using online services in the UK.

The code is not a new law, but its provisions have a statutory footing under the Data Protection Act 2018, the UK law implementing the EU General Data Protection Regulation (GDPR).

The new standards aim to ensure that online services use children's data in ways that support the rights of the child, including the rights to privacy and freedom of expression, and protection from economic, sexual or other forms of exploitation.

The code breathes life into the provisions of the EU and UK data protection rules designed to protect children, providing practical guidance on how the rules are to apply in real-world contexts.  It does not ban activities or prescribe how organisations should comply. Rather, there is a clear mandate to implement each principle in a risk-based and proportionate manner, mindful of the risks to children and the nature of the service. 

The UK regulator does not expect, nor does it encourage, brute-force application of the rules. It wants organisations to think hard about where their services may not be designed in the best interests of a child, and to be creative and pragmatic in remediating any deficiencies. The devil is in the detail: as with most regulatory implementation, effective compliance requires organisations to be honest in their appraisals of the risks and to find workable solutions.

Who does it apply to?

The Information Commissioner's Office (ICO) has produced an applicability flowchart to help organisations decide whether they are in scope.

The code applies to providers of 'information society services' – essentially, organisations that provide online products or services that process personal data and are "likely" to be accessed by children in the UK (such as apps, programs, websites, games, community environments, and connected toys or devices).

For relevant purposes, a 'child' is someone under the age of 18: "A child is defined in the UNCRC and for the purposes of this code as a person under 18."

Why?  The code is designed to fortify pre-existing legal protections, including the UN Convention on the Rights of the Child and EU personal data protection rules. Bringing these protections to life is a principal aim, balancing the complexities of 'one size fits all' technology regulation against the need to provide companies with meaningful implementation guidance.

The code can apply to companies outside the UK if they provide services to UK citizens or residents. Like other extraterritorial rules, this makes the code relevant to a potentially very large population of organisations, which will – at the very least – need to consider whether it applies to them (and, if so, how).
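To make the gating questions concrete, here is a minimal sketch in TypeScript of the applicability test as described above. The field names and the boolean shape are our own invention for illustration – the ICO's flowchart and guidance remain the authoritative analysis.

```typescript
// Hypothetical sketch of the code's applicability test. The field names and
// the boolean shape are invented for illustration; the ICO's flowchart and
// guidance are the authoritative analysis.
interface ServiceProfile {
  providesOnlineProductOrService: boolean; // app, website, game, connected toy, etc.
  processesPersonalData: boolean;
  likelyAccessedByUkUnder18s: boolean; // "likely to be accessed", not "aimed at"
  offeredToUkUsers: boolean; // captures non-UK providers serving UK users
}

function codeMayApply(service: ServiceProfile): boolean {
  return (
    service.providesOnlineProductOrService &&
    service.processesPersonalData &&
    service.likelyAccessedByUkUnder18s &&
    service.offeredToUkUsers
  );
}

// Example: a game studio outside the UK with UK players and child-friendly content.
console.log(
  codeMayApply({
    providesOnlineProductOrService: true,
    processesPersonalData: true,
    likelyAccessedByUkUnder18s: true,
    offeredToUkUsers: true,
  })
); // true - the studio should work through the code in detail
```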

This approach to data regulation is increasingly common, as countries assert 'data sovereignty' and apply their rules in territories outside their borders. The UK's chief data regulator, Elizabeth Denham, said that "this is the way of the future", noting that the European Commission, Mexico, the U.S. Federal Trade Commission and others are all taking notes.

How it works

The code moves beyond the 'age-gating' approach of existing rules like the U.S. Children's Online Privacy Protection Rule (COPPA), which imposes certain requirements on operators of websites or online services directed to children under 13 years of age.  If others follow suit, online services in the coming years will look and feel very different from today's.

It takes a user experience (UX) driven approach. This means the code – developed in consultation with psychologists, parents, teachers, and businesses – thinks much more about design-related issues and how thoughtful design may help prevent online harms.

The code's principles expand on many core privacy principles.  Children's data shouldn't be shared unless necessary.  Intrusive settings like geolocation tracking should be off by default. 

The design of platforms shouldn't exploit children or encourage them to volunteer more data than is needed – nudge techniques that encourage this (think push notifications) should be designed out of platforms. Compelling reasons would be needed to justify privacy settings not being set to the highest level by default. The code also addresses parental controls and profiling.
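To illustrate, here is a minimal sketch of what 'high privacy by default' might look like as a settings object. The setting names are hypothetical and not drawn from the code itself; the point is that every default starts at the most protective level, and relaxing any of them should be a deliberate, justified choice.

```typescript
// Hypothetical default settings for a child's account. The setting names are
// invented for illustration and are not drawn from the code's standards.
interface ChildPrivacySettings {
  geolocationTracking: boolean; // intrusive, so off unless there is a compelling reason
  profileVisibility: 'private' | 'friends' | 'public';
  personalisedAdvertising: boolean;
  nudgeNotifications: boolean; // push prompts that encourage sharing more data
  shareDataWithThirdParties: boolean; // children's data shouldn't be shared unless necessary
}

// Defaults start at the most protective level; relaxing any of them should be
// a deliberate, documented decision rather than a design default.
const defaultChildSettings: ChildPrivacySettings = {
  geolocationTracking: false,
  profileVisibility: 'private',
  personalisedAdvertising: false,
  nudgeNotifications: false,
  shareDataWithThirdParties: false,
};
```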

For some companies, very few changes will be needed, because their products and services were designed with children in mind or because children are unlikely to access them.  Others will need to take the code much more seriously, reflecting on – and implementing – the technical and policy changes needed to meet the standards.

Unwinding the code

The ICO (and, in fact, the law) encourages – and sometimes requires – organisations to conduct formal assessments to identify risks that require mitigation, and to identify appropriate solutions. This is called a data protection impact assessment.

The GDPR requires these assessments to take place where processing is likely to result in a high risk to individuals' rights and freedoms.  The starting position for those touched by the code is that such a high risk may well apply, so an assessment should be carried out.

The ICO has made a starting point for the assessment publicly available, with guidance notes for organisations.
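As a rough illustration, a single entry in such an assessment might be modelled as below. The likelihood and severity scales are loosely based on those in the ICO's DPIA guidance; the remaining field names are hypothetical.

```typescript
// Hypothetical shape of a single risk entry in an assessment. The likelihood
// and severity scales are loosely modelled on the ICO's DPIA guidance; the
// other field names are invented for illustration.
interface DpiaRiskEntry {
  description: string;
  likelihood: 'remote' | 'possible' | 'probable';
  severity: 'minimal' | 'significant' | 'severe';
  mitigation: string; // the measure that reduces or eliminates the risk
  residualRiskAccepted: boolean; // sign-off once mitigation is in place
}

const exampleRisk: DpiaRiskEntry = {
  description: 'Geolocation tracking enabled by default for child accounts',
  likelihood: 'probable',
  severity: 'significant',
  mitigation: 'Turn geolocation off by default and require an age-appropriate prompt',
  residualRiskAccepted: true,
};
```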

Speaking on SuperAwesome's #kidtech podcast, Elizabeth Denham also encouraged the use of consultation and ethics boards within companies to help with implementation.

What's next?

The code came into force on 2 September 2020, with a 12-month transition period. Organisations should conform by 2 September 2021.

The ICO is spending the transition year providing tools and support, and working with stakeholders in various sectors to assist them with implementation.

The support is available, and the imperative is to assess and apply the code appropriately and proportionately – all driven by the objective of better data governance for children.