The Children’s Code in practice – key learnings from the ICO’s Reddit fine

The UK’s Information Commissioner’s Office (ICO) issued a significant decision on 24 February, fining Reddit £14.47 million for failures related to children’s privacy.

This is a landmark moment in the enforcement of the Children’s Code (Age Appropriate Design Code). As the largest fine issued by the ICO specifically regarding children’s data and age assurance, it offers important clarity for any business operating an online service likely to be accessed by younger users.

Moving Beyond Terms of Use

For many online services, relying on a “Terms of Use” provision (merely stating that a service is for those aged 13 and over) has been a long-standing approach. However, the ICO’s decision confirms that a policy-only approach is not sufficient where there is a real risk of children interacting with harmful content.

The investigation found that until early 2025, Reddit lacked technical age-gating measures to actively prevent under-13s from joining. Even after a “self-declaration” gate was introduced that year, the ICO noted it could be easily bypassed. The takeaway for the industry is that, for platforms with high-risk content, the regulator expects effective technical age assurance rather than reliance on user honesty.
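To make the distinction concrete, a minimal sketch (hypothetical names and thresholds, not Reddit’s actual implementation) of a self-declaration gate shows why the ICO considers it weak: the check is only as reliable as the date of birth the user chooses to type in.

```python
from datetime import date

MIN_AGE = 13  # hypothetical threshold, mirroring a typical Terms of Use provision


def years_between(dob: date, today: date) -> int:
    """Whole years between a date of birth and a reference date."""
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))


def self_declaration_gate(declared_dob: date, today: date) -> bool:
    """A 'self-declaration' age gate: it blocks a truthful under-13
    sign-up, but a user can simply re-enter an older date of birth,
    which is why such gates are easily bypassed."""
    return years_between(declared_dob, today) >= MIN_AGE
```

An honestly declared 2015 birth date is rejected on a 2026 sign-up, but nothing in the flow stops the same user retrying with an earlier year, which is the gap that more robust technical age assurance is meant to close.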

The potential harms to children stemming from the use of their data on a social media platform are perhaps more pronounced than in the context of other online services. ICO guidance on age assurance does not prohibit the use of self-declaration outright: it states that self-declaration may be considered for activities which do not pose a high risk to children, in conjunction with other methods. The key, then, is assessing the level of risk to which child users will be exposed.

The Vital Role of the DPIA

A key element of this case was that a Data Protection Impact Assessment (DPIA) specifically addressing the risks to children was not drafted until early 2025. ICO guidance has for many years made clear that DPIAs are mandatory for businesses offering online services to children (anyone under 18 years old). A DPIA is essentially a risk assessment used to identify and mitigate risks arising from the use of personal data. For online services offered to children, there is significant overlap between a DPIA for data protection purposes and the risk assessments required under the Online Safety Act.

A DPIA is the primary way a business can demonstrate it has thoughtfully identified the unique risks children face regarding the use of their personal data on an online service, and taken steps to mitigate them. As this decision shows, the absence of a DPIA makes it very difficult for a platform to prove it is meeting its data protection obligations. Every online service likely to be accessed by children (not just social media platforms) should conduct one; the level of detail required, and the risks and mitigating steps identified, will depend on the nature of the service and the age of its users.

Balancing Safety and Privacy-by-Design

Looking beyond the issues specifically raised by the ICO’s investigation into Reddit, implementing effective age gates and safety mechanisms more generally presents a unique challenge: protecting children often requires thoughtful data processing. To verify a user’s age or implement measures to protect children (e.g. monitoring for abusive behaviour), companies must navigate the delicate balance between robust safety and data minimisation. This is especially complex when leveraging AI. Using automated tools to identify underage users or flag harmful patterns requires a high degree of oversight to ensure the technology is accurate, fair, and compliant with regulation.

Data protection should be considered particularly carefully when:

  • Deploying technology to identify and exclude underage users without excessive data collection.
  • Monitoring for abusive behaviour to protect children in a way that respects wider privacy rights.
  • Structuring age-verification flows to meet regulatory expectations while maintaining a smooth user experience.
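One way to square age assurance with data minimisation is to retain only the derived outcome of a check rather than the underlying identity data. The sketch below is illustrative only (the record fields and method labels are assumptions, not a prescribed ICO design): the service keeps an over/under-13 flag, the method used, and a timestamp, and discards the raw date of birth.

```python
from dataclasses import dataclass
from datetime import date, datetime, timezone


@dataclass(frozen=True)
class AgeAssuranceRecord:
    """Only the derived outcome is retained, not the date of birth
    itself, in line with the data-minimisation principle."""
    over_13: bool
    method: str            # e.g. "self-declaration", "document check"
    checked_at: datetime


def record_age_check(dob: date, method: str, today: date) -> AgeAssuranceRecord:
    """Derive and keep only what the service needs: an over/under-13
    flag, how it was established, and when. The raw DOB is not stored."""
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return AgeAssuranceRecord(
        over_13=age >= 13,
        method=method,
        checked_at=datetime.now(timezone.utc),
    )
```

Keeping only the flag reduces both the volume of children’s data held and the impact of any breach, while still giving the service an auditable basis for the access decision it made.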

The Regulatory Landscape: The ICO and Online Safety

While much of the recent conversation on online safety has focused on Ofcom and the Online Safety Act, this enforcement action highlights the central role of the ICO and data protection law.

It demonstrates that data protection law remains a primary framework for managing and mitigating user risk for children. For businesses, this is a positive sign that existing data protection principles can be both a familiar and effective tool for navigating the broader online safety landscape.

This decision provides a helpful prompt for all online services (beyond social media platforms) to review their current approach:

  • Evaluate Age Assurance: Consider whether your current measures are sufficiently robust in light of potential risks to child users. Self-declaration is not sufficient where there are significant risks to children from personal data processed on the online service.
  • Update DPIAs: Ensure your assessments specifically account for children’s rights and any risks to them, such as the impact of platform algorithms. The DPIA should document your compliance with the 15 standards of the Age Appropriate Design Code.
  • Align Privacy and Safety: View your data protection strategy as the foundation of your overall safety mission.

If you are reviewing your age assurance or online safety processes in light of this decision, we are happy to help you navigate these complexities.