March 9, 2026
The much-anticipated consultation on potential measures to protect children online has been published.
Titled ‘Growing up in the online world: a national conversation’, the consultation seeks to respond not only to growing calls for greater protections for children online (including consideration of an Australian-style social media ban for under-16s), but also to broader concerns that children are spending too much time online and on particular apps or platforms that do little to enrich their lives.
The consultation is divided into five sections, the first of which seeks to establish a foundational understanding of how people assess the potential benefits and risks associated with children spending time on social media and online more generally.
The next section is the most substantive, setting out the potential interventions that could be introduced to address concerns about children’s activity online. These include:
Restricting social media services by age
This proposal has received significant attention in recent weeks, as some have urged the UK to follow Australia’s lead in introducing a ban for under-16s. Others are more cautious, arguing for example that a ban could create a ‘cliff-edge’, with children driven towards riskier platforms once they are finally old enough to use mainstream services.
Raising the age of digital consent
Currently, children can only consent to their data being processed by an ‘Information Society Service’ once they are 13. As the consultation notes, raising this to 16 could provide “another intervention point that gives parents and carers more control over how platforms are using their personal data”. However, it would only affect online services relying on consent as a legal basis for processing, and could have the effect of limiting access to other services such as educational technology.
Restricting access to services based on features and functionalities
The consultation explains that, despite existing requirements under the Online Safety Act 2023 for online services to prevent children from encountering certain content, some features or functionalities sufficiently “increase the risk of children being exposed to harmful or inappropriate content” that services offering them should not be accessible to under-16s. These include, for example, livestreaming, location sharing, and disappearing messages. Similarly, the consultation explores restricting access to certain ‘addictive’ features such as infinite scrolling, autoplay, and so-called ‘affirmation features’ such as liking or commenting on content.
Chatbots and AI
Views are sought on the benefits of children using AI chatbots, as well as the features that may pose the greatest risks to them, such as their tendency to flatter users and encourage further interaction, and how their “persistent and anthropomorphic design can lead children to form parasocial attachments”.
Finally, the consultation explores how any interventions can be effectively enforced, including whether it would be proportionate to require all social media users to verify their age, how minimum age restrictions could be “effective and workable”, and whether VPNs should be age-restricted. It also discusses ways to improve children’s digital skills and media literacy, and how parents and carers can be supported.
The consultation closes on 26 May 2026 and can be found here. In the meantime, the Government will be piloting a number of the proposed interventions, and has already introduced powers (as discussed here) intended to ensure that it can “act fast on [the consultation’s] findings within months”.