At SuperAwesome, we’re committed to helping our partners stay ahead of the curve when it comes to youth privacy and online safety. One of the biggest regulatory changes you need to know about is the UK’s Online Safety Act – which officially became law in late 2023. Now, in 2025, the rollout is picking up speed as regulators begin enforcing new requirements, particularly when it comes to protecting children online.
Here’s a simple breakdown.
What Is the Online Safety Act?
In a nutshell, the Online Safety Act is a new law that makes online platforms providing user-to-user and search services — think apps, games, websites, social media — responsible for keeping users safe. It’s focused on protecting children and teens from both illegal content (like child sexual abuse material) and harmful-but-legal material (like bullying or content that promotes self-harm).
If your service is available to UK users — even if you’re not based there — the law probably applies to you.
Who’s in Charge Here?
The UK regulator, Ofcom (short for the Office of Communications), is leading the charge. They’re the ones setting the rules, enforcing compliance, and making sure platforms step up. Think of them as the new online safety watchdog. Not to be confused with the ICO (the Information Commissioner’s Office), which is the UK’s independent regulator for data protection and privacy (they oversee the UK GDPR and the Age Appropriate Design Code).
The Online Safety Act overlaps significantly with the online wellbeing goals of the ICO’s Children’s Code, even though they are legally distinct. Here’s how:
- Shared Focus on Youth Protection:
  - Both the Children’s Code (ICO) and the Online Safety Act (Ofcom) aim to make digital spaces safer for young people, just from different angles
  - The ICO focuses on how children’s personal data is collected, used, and protected
  - Ofcom focuses on how online services assess and mitigate harms that young users might experience from content and interactions
- Mutually Reinforcing Duties:
  - If a service is designed responsibly for children under the Children’s Code (e.g., using high privacy settings by default, avoiding manipulative design), it is also better positioned to meet Ofcom’s expectations under the Online Safety Act (e.g., providing safer content environments, managing risks to wellbeing)
- Age Assurance Pressures:
  - Both regulators now expect services to know their users’ age ranges. This means age assurance (age estimation, verification, or self-declaration) becomes a necessary design feature, and Ofcom will look closely at how services manage this as part of their child-protection risk assessments (see the sketch after this list)
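To make that a little more concrete, here’s a minimal sketch of what an age-assurance gate might look like inside a service. Everything in it — the `AgeSignal` shape, the confidence threshold, the method names — is a hypothetical illustration, not a reference implementation; real deployments would lean on a vetted age-assurance provider and on Ofcom’s guidance on highly effective age assurance.

```typescript
// Hypothetical age-assurance gate. All types, names, and thresholds here
// are illustrative assumptions, not part of any regulation or standard API.

type AgeAssuranceMethod = "verification" | "estimation" | "self-declaration";

interface AgeSignal {
  method: AgeAssuranceMethod;
  estimatedAge: number; // age in years reported by the chosen method
  confidence: number;   // 0..1, how reliable the signal is
}

interface ExperienceConfig {
  showAdultContent: boolean;
  highPrivacyDefaults: boolean; // Children's Code: high privacy by default
  allowDirectMessages: boolean;
}

// Decide which experience to serve based on the strongest available signal.
// Assumption: self-declaration alone is never enough to unlock an adult
// experience, and no signal at all means the child-safe defaults apply.
function configureExperience(signals: AgeSignal[]): ExperienceConfig {
  const childSafe: ExperienceConfig = {
    showAdultContent: false,
    highPrivacyDefaults: true,
    allowDirectMessages: false,
  };

  if (signals.length === 0) return childSafe;

  const strongest = signals.reduce((a, b) =>
    a.confidence >= b.confidence ? a : b
  );

  const isAdult =
    strongest.estimatedAge >= 18 &&
    strongest.method !== "self-declaration" &&
    strongest.confidence >= 0.9; // illustrative threshold

  return isAdult
    ? { showAdultContent: true, highPrivacyDefaults: false, allowDirectMessages: true }
    : childSafe;
}

// Example: an estimation signal that isn't confident enough keeps the
// child-safe defaults in place.
console.log(configureExperience([
  { method: "estimation", estimatedAge: 19, confidence: 0.7 },
]));
```

The design choice worth noting is the default: when the service doesn’t know a user’s age range, it serves the protective experience rather than the permissive one.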

What’s Happening Right Now?
The law is live, but enforcement is rolling out in phases. Here’s where we are:
- Ofcom just published the first wave of official rules and guidance (Codes of Practice at a glance here). This follows an extensive consultation process with over 27,000 children and 13,000 parents, as well as feedback from industry, civil society, and child safety experts – including SuperAwesome
- These rules explain what companies need to do to protect kids online – and they’re serious about it
- Companies must start assessing the risks on their platforms and putting real protections in place
Key Dates You Should Have On Your Radar
Online platforms likely to be accessed by children must:
- By July 24, 2025: complete risk assessments around children’s safety.
- From July 25, 2025: have safety measures up and running. If you’re not following Ofcom’s Codes exactly, you’ll need to prove that your approach is just as effective.
What Kind of “Harmful Content” Are We Talking About?
The law covers a lot more than just illegal material. Harmful content can include suicide or self-harm-related content, the promotion of eating disorders, and bullying, abuse, or harassment.
Key protections include filtering harmful content out of children’s feeds and strong age checks. Services must also make it easier for children to control their online experience and to report harmful content.

So, What Do Companies Actually Have to Do?
Some of the must-dos include:
- Running thorough risk assessments
- Strengthening content moderation and user reporting
- Enforcing age checks where needed
- Making community guidelines clear and easy to find
- Giving users (and parents) simple ways to flag problems (see the sketch after this list)
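As one illustration of that last point, here’s a hedged sketch of a minimal in-app reporting flow. The report shape, category names, and triage rule are all assumptions made up for this example; the actual requirements come from Ofcom’s Codes of Practice, not from this code.

```typescript
// Hypothetical user/parent reporting flow. Categories and triage rules
// are illustrative assumptions, not Ofcom requirements.

type ReportCategory =
  | "bullying-or-harassment"
  | "self-harm-or-suicide"
  | "eating-disorder-promotion"
  | "other";

interface Report {
  contentId: string;
  category: ReportCategory;
  reporterIsChild: boolean; // e.g. inferred from the age-assurance signal
  submittedAt: Date;
}

// Harm types that should jump the moderation queue (illustrative choice).
const HIGH_PRIORITY: ReportCategory[] = [
  "self-harm-or-suicide",
  "eating-disorder-promotion",
];

function triage(report: Report): "urgent" | "standard" {
  // Reports from children, or about the most serious harm types,
  // are escalated for faster human review.
  if (report.reporterIsChild || HIGH_PRIORITY.includes(report.category)) {
    return "urgent";
  }
  return "standard";
}

// Example: a child reporting bullying is escalated immediately.
const example: Report = {
  contentId: "post-123",
  category: "bullying-or-harassment",
  reporterIsChild: true,
  submittedAt: new Date(),
};
console.log(triage(example)); // -> "urgent"
```

The point of the sketch is the shape of the flow: a small, fixed set of categories a child can actually understand, and an escalation path so the most serious reports don’t sit in a general queue.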
Why This Matters for You
If your platform reaches young audiences in the UK, this matters – a lot. But it’s not just about ticking legal boxes. This is about earning trust, building safer communities, and standing out as a brand that genuinely cares about the next generation.
At SuperAwesome, we’re here to help. We’ve been building tools and solutions designed to protect young audiences long before this law came into effect. Now, we’re working closely with partners to make sure they’re ready for what’s next.
Final Takeaway
The Online Safety Act is a major shift toward a safer digital world for kids. While there’s a lot to unpack, the mission is simple: protect young users and create better online experiences.