March 23, 2026
The Information Commissioner’s Office (ICO) has published an open letter to social media and video sharing platforms in the UK, urging them to strengthen their age assurance measures.
As the ICO explains, where services set a minimum age (such as 13), they are unlikely to have a lawful basis for processing the personal data of children under that age. As a result, services are reminded to “prevent access to children under [their] minimum age by implementing an effective age gate”.
Currently, many services still rely on users to self-declare that they are over the age of 13, or use a form of profiling, to enforce minimum age requirements. However, the ICO is clear that these measures are insufficient in the light of recent advances in age assurance technologies, and it expresses concern that services have not yet implemented readily available technological solutions that can protect young children.
Services are therefore expected to use “current viable technologies”, such as facial age estimation, digital ID, or one-time photo matching to enforce minimum age requirements. Any such mechanism must also comply with data protection law, as set out in the ICO’s guidance on age assurance for children.
The ICO expects industry to take “urgent steps to meet this call to action” and will engage with high-risk services in the coming months to strengthen their age assurance measures. It also warns that it will monitor practices to determine whether further regulatory action is required in this area.
Commenting on the ICO’s action, its Chief Executive Officer, Paul Arnold, said: “Our message to platforms is simple: act today to keep children safe online. There’s now modern technology at your fingertips, so there is no excuse not to have effective age assurance measures in place. Platforms need to be ready to demonstrate what they’re doing to keep underage children out and safeguard those children that are old enough to access their services.”
To read the letter in full, click here.