This week the UK government published its long-awaited final position on Online Harms, previewing new legislation in 2021 that will go further than any other country to protect consumers, and particularly children, from harmful content.

The publication follows growing criticism over the summer about delays in advancing the legislation and industry concerns about increasing overlap (and potential confusion) among UK government initiatives to mitigate online risks to kids. Keep reading to learn more about the Online Harms white paper and the proposed new legislation to follow.


The government’s comprehensive outline sets out its intention to give Ofcom, the UK’s communications regulator, sweeping regulatory authority over the biggest digital platforms operating in the UK. The new regime will sit alongside the Information Commissioner’s Age Appropriate Design Code and subsume the recently implemented Audiovisual Media Services Regulation.

The paper confirms nearly all the principles described in the interim paper published earlier this year, with some changes—most notably the decision to defer the threat of criminal prosecution for individual executives to a review by the Law Commission.

What is the new legislation?

The government commits to introducing the legislation, the Online Safety Bill, in 2021. It would impose a new “duty of care” on digital operators to make their services safe for users, with two broad objectives: 

  1. To eliminate illegal online activities (such as distribution of content related to terrorism, suicide videos and child abuse).
  2. To prevent children from being exposed to inappropriate content. 

Ofcom will issue codes of practice to help companies exercise their duty of care, and will have the power to issue fines of up to £18m or 10% of global turnover, whichever is greater.

(The announcement this week came on the heels of the EU Commission proposing two new pieces of legislation, the Digital Services Act and the Digital Markets Act, which seek to force tech platforms to tackle illegal content and disinformation and to make advertising more transparent.) 

Who will be impacted?

The UK government’s intention is to make the duty of care proportionate to company size, impact, and risk, which it claims will mean it applies to fewer than 3% of operators. The primary target of the legislation is clearly the large digital platforms, including social media, video-sharing, and streaming services, but it also covers online dating sites, marketplaces, messaging services, consumer cloud storage, and potentially games that enable social interaction. Explicitly excluded from the scope are B2B services, including internet service providers, virtual private networks, web-hosting companies, content delivery service providers, device manufacturers, and app stores. 

However, the scope includes not only “anyone hosting user-generated content accessible by UK users” but also—controversially—search engines and “anyone enabling public or private online interactions between users” (e.g., online instant messaging services and closed social media groups). This means that Ofcom will be empowered to require encrypted messaging services to take action against illegal content that may be shared in private messages, although the paper acknowledges the need to balance this against the benefits of user privacy. It remains to be seen how this will impact the recent trend toward end-to-end encryption of messaging services, which some activist groups have criticised for putting children at greater risk.

What does risk-based and proportionate mean?

The government proposes a tiered approach to operators, with most falling into Category 2 (required to take a proportionate approach to addressing illegal content and protecting kids) and a small number into Category 1 (additionally required to take action on content that is legal but harmful, and to publish regular transparency reports). The government intends to pass secondary legislation to define online harms, which in turn will contribute to the categorisation. 

For example, a service is likely to be considered higher risk if “it has features such as: 

  • Allowing children to be contacted by unknown adult users;
  • Allowing all users—including children—to live-stream themselves;
  • Including private messaging channels where the content on those private channels is not or cannot be moderated.”

Key terms that will impact companies

The Online Harms white paper reiterates the government’s commitment to promoting key technologies as outlined in the recent ‘Safer Technology, Safer Users: The UK as a World-Leader in Safety Tech’ report. Companies will be required to conduct child safety risk assessments, to identify and implement proportionate measures to protect kids, and to monitor those measures for effectiveness. The law will require the use of technologies to prevent kids from accessing age-inappropriate content, including “age assurance and age verification technologies, which are expected to play a key role for companies in order to fulfil their duty of care.” The paper cites the LEGO Life app as an example of a “service that requires parental consent to unlock features and functions, to provide an age-appropriate service.”

In reference to advertising, the paper confirms that the definition of user-generated content will include organic and influencer promotion on services that are in scope. Whilst the Advertising Standards Authority (ASA) will continue to be the primary regulator for advertising, the Department for Digital, Culture, Media and Sport (DCMS) will launch a public consultation in the first half of 2021 to review the advertising rules.  

The paper suggests that the government has taken into account concerns voiced by industry about legislative overlap and confusion, and promises alignment with the Age Appropriate Design Code (both applicable to services likely to be accessed by children under 18), with the Audiovisual Media Services Regulation (“regulation of UK-established video sharing platforms to be part of the online harms regime”), and with the ASA (whose “rules will continue to apply to all online advertisers”). 


The kids’ digital media ecosystem is complex and quickly evolving, but it doesn’t need to be daunting.

We’re here to help arm you and your teams with the knowledge and resources you need to navigate the latest advertising standards and data privacy restrictions. Enroll in our KidAware training program today to learn more.