March 23, 2026
Ofcom has urged platforms to do more to protect children online, stating that the industry has not done enough and that parents have lost trust in tech firms to keep their children safe.
As a result, the regulator has written to a number of leading sites and apps which it has identified as “failing to put children’s safety at the heart of their products”, insisting that they implement the following four measures:
- Effective minimum-age policies. Alongside recent calls from the ICO (which we discussed here), Ofcom calls for platforms to use highly effective age assurance measures, given evidence that minimum age policies are not being adequately enforced.
- Stronger protections against grooming. This includes highly effective age assurance, but also strict controls on adult strangers being able to contact children.
- Safer feeds for children. Platforms have been issued information requests so that Ofcom can understand how content is promoted to children.
- No product testing on children. Platforms will be expected to notify Ofcom that they have assessed the risk of significant updates to their products before they are deployed.
For each measure, Ofcom will examine the steps taken by platforms when they report in May. At that time, consideration will be given as to whether enforcement action, or even a revision of Ofcom’s Codes, is required. This is in addition to the Government’s work exploring possible changes to the legislative regime in response to developing technologies (as discussed here).