Last week the UK data privacy regulator, the ICO, launched the most radical effort by any regulator to date to make the internet safer for kids: the Age Appropriate Design Code (AADC). The Code is a set of design standards for digital services likely to be used by children. Any website, app or digital service operator with UK users will have to follow it in order to remain compliant with the EU data privacy law, the GDPR.

Exactly what is it? A law? A regulation? A guideline?  

  • It’s a ‘statutory Code of Practice’ under section 123 of the DPA 2018.  If you are not compliant with the Code, you are likely to be considered in breach of the GDPR and the Data Protection Act 2018, and be exposed to fines of up to €20 million (£17.5 million when the UK GDPR comes into effect) or 4% of your annual worldwide turnover, whichever is higher.

Does the Code apply to companies based outside the UK?

  • Yes, it applies to:
    • Any service provided by a company that has “a branch, office or other ‘establishment’ in the UK.” 
    • Any service from outside the EEA that has users in the UK, and (after Brexit) any service from the EEA with users in the UK.

Are children defined as under 18 now?

  • There is no change to the GDPR Article 8 requirement to obtain parental consent before processing the personal data of under-16s (or the applicable age of consent, e.g. 13 in the UK). The Code does, however, require operators to apply its privacy-by-default and best-interests-of-the-child obligations to all users under 18. So for all intents and purposes, you need to think about under-18s and design your service to be safe for them.

So what happens next? When does it kick in?

  • As required by the GDPR, the government notified the EU Commission of the Code on 22 January and will observe a three-month ‘standstill period’, i.e. until 23 April;
  • The Code then automatically becomes law within 40 days, unless Parliament votes to reject it, i.e. in early June 2020;
  • A 12-month transition period follows before enforcement begins, in June 2021.

If you operate a website, app, connected device or other digital service and are likely to have users under 18, then you should be preparing a Data Protection Impact Assessment or DPIA to assess what personal data you collect and use. Your service may require design changes to ensure you are abiding by the principles of privacy by default and taking the best interests of the child into account.
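
A simple way to start that DPIA is to inventory, per feature, what you collect and why. Below is a minimal sketch in TypeScript; the schema is our own illustration, not an ICO template:

```typescript
// Hypothetical data-inventory entry for a DPIA; the field names are
// illustrative, not an ICO-mandated schema.
type LawfulBasis = "consent" | "contract" | "legitimate-interests";

interface FeatureDataRecord {
  feature: string;           // e.g. "newsletter sign-up"
  personalData: string[];    // what is collected
  purpose: string;           // why it is collected
  lawfulBasis: LawfulBasis;
  likelyChildUsers: boolean; // triggers the Code's obligations
  retentionDays: number;     // how long the data is kept
}

const inventory: FeatureDataRecord[] = [
  {
    feature: "newsletter sign-up",
    personalData: ["email address"],
    purpose: "send a monthly newsletter",
    lawfulBasis: "consent",
    likelyChildUsers: true,
    retentionDays: 365,
  },
];

// Surface the entries the DPIA must examine most closely.
const needsReview = inventory.filter((r) => r.likelyChildUsers);
console.log(needsReview.map((r) => r.feature)); // ["newsletter sign-up"]
```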

What it means

The ICO’s new rules have triggered extensive press coverage on both sides of the Atlantic, and with good reason. While the Code is not actually a new law, but simply a clarification of how GDPR-K is to be enforced, it does extend the regulator’s reach beyond the basic protection of kids’ personal data into how kids’ digital experiences are designed.

In fact, the Code represents a natural evolution in the reach of kids’ data privacy laws: 

  • 2013: COPPA → collection of personal data
  • 2018: GDPR-K → processing of personal data
  • 2020: AADC → design of features based on personal data

The Code makes clear that companies can’t be in compliance with GDPR if they use personal data in ways that don’t have the best interests of their potential child users in mind. It goes further by effectively extending privacy protections to: 

  1. teenagers up to 18 years old; 
  2. not only services aimed at kids, but those likely to be used by them; 
  3. the passive collection of data by connected devices (such as voice assistants or toys); 
  4. ‘inferred data’, such as that created by ad targeting platforms; and, 
  5. any company with operations in the UK, any non-EEA company with users in the UK, and – post-Brexit – any EEA company with users in the UK.

For good summaries check out 5Rights Foundation and Bird & Bird.

But what does the Code mean for you in practice? Below we break down the Code’s key concepts and how they impact brands and content owners engaging kids through websites, apps, ad campaigns or other digital services:

Best Interests

You must take into account the age of your users and—if kids are likely to access your service—design it with their needs in mind. This means considering the child’s psychological and emotional development, health and well-being. If you’re likely to have users under 18, you must do a Data Protection Impact Assessment (DPIA). 

A good way to demonstrate your commitment to the child’s best interests is to endorse the Kidtech Standard, which encompasses the principles of the AADC as well as global best practices for kids’ digital safety. 

Age Assurance

You must take measures to assess the age of your users in a way that is proportionate to the risk of your data processing. These could range from age gates to ID checks. If you don’t check age, then you should apply the Code to all users. 

The key compromise the ICO made in this regard is to introduce the notion of proportionality. The ICO accepts that verifying age without collecting more personal data (which would cut against the GDPR’s data minimisation principle) is a challenge. Operators may therefore adapt their age verification method to match the risk of the proposed data processing. So, for example, if you wish to collect an email address to send a child a newsletter, a simple age gate is likely to be sufficient. If, however, you are allowing kids to read and post user-generated content, you will have to take further measures to ensure your users are the age they say they are, and that your service is safe for them. 
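
To make the proportionality idea concrete, here is a minimal sketch, assuming a simple three-tier risk model of our own invention (the Code itself does not prescribe tiers or methods):

```typescript
// Hypothetical mapping from processing risk to an age-assurance method.
// The tiers and method names are illustrative; the Code does not
// prescribe specific techniques, only proportionality.
type RiskTier = "low" | "medium" | "high";
type AssuranceMethod =
  | "self-declared-age-gate"
  | "verified-parental-consent"
  | "id-verification";

function assuranceFor(risk: RiskTier): AssuranceMethod {
  switch (risk) {
    case "low":
      return "self-declared-age-gate"; // e.g. newsletter sign-up
    case "medium":
      return "verified-parental-consent";
    case "high":
      return "id-verification"; // e.g. user-generated content
  }
}

console.log(assuranceFor("low")); // "self-declared-age-gate"
```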

We expect this requirement to put significant pressure on the social media and streaming platforms to invest more in verifying their users’ age.

This proportionality concept does not eliminate the requirement for parental consent under Article 8 of the GDPR, in the event that consent is the basis for collecting personal data from a user under the age of digital consent (16 by default, 13 in the UK).
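
A minimal sketch of that Article 8 check, assuming consent is your lawful basis; the thresholds reflect GDPR Article 8, which sets 16 as the default and lets member states lower it to 13:

```typescript
// GDPR Article 8 sets the age of digital consent at 16 by default and
// lets member states lower it to 13; the UK uses 13. The other values
// here are illustrative.
const AGE_OF_CONSENT: Record<string, number> = { UK: 13, DE: 16, FR: 15 };

function needsParentalConsent(age: number, country: string): boolean {
  const threshold = AGE_OF_CONSENT[country] ?? 16; // GDPR default
  return age < threshold;
}

console.log(needsParentalConsent(12, "UK")); // true
console.log(needsParentalConsent(14, "UK")); // false
```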

Detrimental use of data

While this requirement has been softened since the first draft of the AADC was published last year, the Code sharply restricts how operators can use personal data to extend engagement. In the final draft, the ICO clarified that even if a game’s level progressions or reward mechanics, for example, are designed to extend play-time, you must give a child the option of pausing without losing progress.
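
As an illustration of what ‘pausing without losing progress’ could look like in code, here is a minimal sketch with names of our own invention:

```typescript
// Minimal illustration: pausing persists a snapshot of progress so the
// child loses nothing by stepping away. Names are our own invention.
interface Progress {
  level: number;
  score: number;
  streakDays: number; // reward mechanics must not punish a pause
}

class GameSession {
  constructor(
    private progress: Progress,
    private save: (p: Progress) => void, // stand-in for real storage
  ) {}

  pause(): void {
    // Save everything before suspending; the streak is frozen, not
    // reset, so pausing carries no penalty.
    this.save({ ...this.progress });
  }
}

const session = new GameSession(
  { level: 4, score: 1200, streakDays: 3 },
  (p) => console.log("saved", p),
);
session.pause(); // saved { level: 4, score: 1200, streakDays: 3 }
```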

More impactful is the prohibition on using personal data to recommend content that is harmful, whether through advertising or content personalisation. This takes direct aim at the powerful recommendation engines that drive engagement on social media platforms, and which sometimes lead to highly inappropriate content recommendations, as highlighted in the Molly Russell case and in the many investigations into YouTube’s rabbit-hole effect.

User profiling must be off by default for both advertising and content recommendations.
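
In practice, that means profiling flags start out off and change only after an explicit opt-in. A minimal sketch, assuming hypothetical flag names:

```typescript
// Privacy-by-default settings: profiling is off unless the user makes
// a deliberate, informed opt-in. Flag names are illustrative.
interface PrivacySettings {
  adProfiling: boolean;
  contentProfiling: boolean;
  geolocation: boolean;
}

const DEFAULTS: PrivacySettings = {
  adProfiling: false,      // off by default under the Code
  contentProfiling: false, // off by default under the Code
  geolocation: false,      // the Code also expects location off by default
};

function newUserSettings(
  optIns: Partial<PrivacySettings> = {},
): PrivacySettings {
  // optIns should only ever reflect an explicit user choice.
  return { ...DEFAULTS, ...optIns };
}

console.log(newUserSettings()); // everything off until the user opts in
```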

Community Standards 

If the risks kids face on your platform are high, then you can’t take a ‘light touch’ approach to child safety. If you say in your policy that you will not tolerate bullying, then you must have mechanisms in place to enforce that. The ICO justifies its jurisdiction here on the basis of ‘fairness’, i.e. you can’t uphold privacy standards if your service is not ‘fair’, and you can’t be ‘fair’ if you don’t do as you say.

Data minimisation

The ICO reminds us that blanket data processing permissions are not acceptable: “Children should be given as much choice as possible over which elements of an online product or service they wish to use and, therefore, how much personal data to provide.” 

This means you must consider each feature of your site or service separately when establishing the basis on which you may or may not collect personal data from kids.  

We recommend taking a ‘progressive permissions’ approach to enabling features based on the risk of data processing.  
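
As a sketch of what progressive permissions might look like, the hypothetical snippet below unlocks each feature only once the specific consent it needs has been granted:

```typescript
// Progressive permissions: each feature declares the consents it needs
// and unlocks only once those are granted. Feature and consent names
// are our own illustration.
type Consent = "email" | "ugc" | "voice";

const FEATURE_REQUIREMENTS: Record<string, Consent[]> = {
  "play-game": [],             // no personal data needed
  "newsletter": ["email"],
  "chat": ["ugc"],
  "voice-commands": ["voice"],
};

function enabledFeatures(granted: Set<Consent>): string[] {
  return Object.entries(FEATURE_REQUIREMENTS)
    .filter(([, needs]) => needs.every((c) => granted.has(c)))
    .map(([feature]) => feature);
}

console.log(enabledFeatures(new Set<Consent>()));          // ["play-game"]
console.log(enabledFeatures(new Set<Consent>(["email"]))); // ["play-game", "newsletter"]
```

The point is that a child who grants nothing still gets the core experience, and each additional permission unlocks only the feature that needs it.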

Connected toys & devices are called out explicitly. They must implement processes for disclosing personal data collection and obtaining consent as appropriate upfront, e.g. at point of sale and prior to device set-up.  

Our Kids Web Services team can help you design progressive permissions into your consent management infrastructure, whether for digital games or connected toys.

If you plan to adapt your service for compliance, we can help with tools for verifying age, managing consent requirements, safe-social engagement, zero-personal data monetisation and kid-safe delivery of ad campaigns.

If you’re interested in staying on top of technology and kidtech news, we publish several kids’ industry newsletters, which now have over 10k subscribers reading monthly. Sign up now!


Max Bleyleben is Managing Director and Chief Privacy Officer at SuperAwesome.