Protecting children under the OSA and DSA – how does it differ?


On 24 April 2025, Ofcom published its final guidance (OSA Children’s Harms Guidance) on the protection of children under the UK’s Online Safety Act (OSA). Hot on its heels, on 13 May 2025, the European Commission (EC) published its draft guidelines (DSA Guidelines) on the protection of minors under Article 28(4) of the Digital Services Act (DSA) – the consultation on the DSA Guidelines is open until 15 June 2025.

Throughout this piece, we also refer to Ofcom’s illegal harms guidance under the OSA (OSA Illegal Harms Guidance).

But how do the OSA and DSA guidance differ, and where are they alike, when it comes to the protection of minors? Below we highlight some of the major topics and the key similarities and differences between the two. For more details, please feel free to reach out.

Scope

  • To start from the top, it’s worth remembering that the children’s harms duties under the OSA apply where a ‘user-to-user service’ is also “likely to be accessed by children”.
  • This is broader than Article 28 of the DSA, which only applies to ‘online platforms’ that are “accessible to minors” (as set out in Recital 71). That means that a games business operating only a ‘hosting service’ (and not an ‘online platform’) will not be in scope of the DSA Guidelines.
  • Both the OSA Children’s Harms Guidance and the DSA Guidelines make the point that merely having a self-declaration age gate, or barring children from accessing your service in your terms, is not enough to conclude that the service is not accessible to children.
  • Another important point to remember is that the OSA children’s harms provisions are focused only on specific types of harm, whereas the DSA Guidelines cover not just online harms but also privacy, security, consumer issues and more.

Approach

  • It is worth noting that the approaches of the respective DSA and OSA guidance differ slightly – the EC suggests (non-exhaustive) lists of measures that may be appropriate to meet the broad requirements of Article 28(4).
  • Ofcom is a little more hands-on than this – its recommended measures (provided the service meets the trigger point for them, normally either a certain risk profile or size) are treated as a ‘safe harbour’ for compliance under the OSA, so there is perhaps more incentive to meet each recommended measure. However, this does not stop a provider implementing alternative measures, provided that they still meet the underlying OSA safety duties (as well as users’ rights to freedom of expression and privacy).

(a) Risk assessments: Risk assessments are a core duty under the OSA, and the guidance around them provided by Ofcom is incredibly detailed. Both the DSA and OSA child risk assessments expect in-scope services to consider how different aspects of their service may give rise to risk (e.g. functionalities, number and type of users), and both set out the need to look at mitigation measures already in place and whether additional measures are needed.

On the other hand, the OSA Children’s Harms Guidance and DSA Guidelines go about risk assessments in different ways, with Ofcom recommending its ‘four-step methodology’ for understanding and mitigating harms and the EC opting for the ‘5Cs typology of risks’. The DSA Guidelines also focus more expressly on the rights of the child, and how they may be impacted by the different risks and suggested mitigation measures.

(b) Child account discovery: The DSA Guidelines and OSA Children’s Harms Guidance overlap in certain measures recommended here – for example, both suggest that child users should not appear in contact suggestions shown to other users, and that child users should have to accept invitations before being added to group chats.

However, in both the above scenarios, these recommendations only ‘bite’ under the relevant OSA guidance if certain thresholds are met. For example, the measure around invitations to group chats (PCU J3) is only recommended for services likely to be accessed by children that are at medium or high risk of certain types of ‘legal but harmful’ content (e.g. abusive content, bullying content).

(c) Content moderation: There are a number of similar points between each piece of guidance here, with both recommending having:

  • moderation teams with clear processes and policies in place (including around prioritisation);
  • a dedicated person (or possibly team under the DSA Guidelines) responsible for the protection of children, who has access to senior management;
  • sufficiently resourced and trained teams;
  • tracking of harms and regular reporting of this to senior management or the relevant governance channels.

The DSA Guidelines, however, focus more on the implementation of preventative features, whereas the OSA Illegal Harms and Children’s Harms Guidance expressly call out the provision of training materials to volunteer moderation staff.

(d) User reporting: Both the OSA Children’s Harms Guidance and the DSA Guidelines recommend that reporting tools should be suitable for younger users, and that it should be clear to them if their details are shared with the user they have reported (or another third party). Both also suggest that child users should receive an acknowledgement of their complaint, an indicative timeframe for a decision and the possible outcomes of their report – although in the case of the OSA guidance, this applies only to services of a certain size or risk profile.

The OSA guidance contains specific measures for complaints from users who believe their content has been mistakenly taken down by automated means. A measure unique to the DSA Guidelines is the suggestion that users should be able to report another user who they believe is underage.

(e) Age assurance: One of the more complex and controversial topics is no doubt age assurance.

Under the OSA Children’s Harms Guidance, the recommended measures suggesting that age assurance should be used vary depending on several factors (e.g. whether or not the service permits content that is ‘legal but harmful’ to children, the risk level, and/or whether or not the service can technically take content down). In several cases, Ofcom recommends age assurance in scenarios which go beyond the underlying requirements of the OSA – primarily those relating to ‘priority content’ and those using content recommender systems. Services will also need to consider the UK’s Age Appropriate Design Code (AADC).

Article 28(3) of the DSA sets an interesting scene for age assurance under the DSA Guidelines, as it provides that online platforms will not be obliged to “process additional personal data in order to assess whether the recipient of the service is a minor”.

The DSA Guidelines go on to say that platforms may be able to adopt measures that protect children as an alternative to age assurance; however, they also suggest that in certain scenarios age assurance will be recommended (e.g. for 18+ restricted content). Here, the European Data Protection Board’s statement on Age Assurance should also be reviewed where additional personal data would need to be processed.

The DSA Guidelines therefore rely on privacy-preserving technologies being readily available on the market if the EC is to encourage adoption of age assurance measures that do not fall foul of Article 28(3). The upcoming EU Digital Identity Wallet will no doubt help here given its selective disclosure features, but given the additional requirement that more than one age assurance measure should always be available to users, both businesses and the EC will need options.

Due to the breadth of the DSA Guidelines, they naturally overlap with other UK rules too – a good example being default settings and engagement techniques under the DSA Guidelines, which have more in common with standards 5 and 7 of the AADC than with the OSA Children’s Harms Guidance. Loot boxes are another topic – the proposed ban on their sale to minors under the DSA Guidelines doesn’t quite match up with what the UK’s Principles and Guidance on Paid Loot Boxes sets out (i.e. that they can be purchased by minors with parental / guardian consent).

Parental controls are a topic we will have to wait on a little longer – although the AADC does cover parental controls from a privacy perspective, they go unmentioned in the OSA Children’s Harms Guidance (aside from being earmarked as a topic to be explored).

  • One of the positive aspects of the DSA Guidelines is that the European Commission has put a stronger focus on the rights of children (and their best interests) than the OSA Children’s Harms Guidance, in particular the rights enshrined in the EU Charter of Fundamental Rights and the United Nations Convention on the Rights of the Child (something the AADC has also done well).
  • On the other hand, the spread of topics covered by the DSA Guidelines means that there is little detail (or rationale) on each given topic – for example, on how to determine risk levels for the risk review, which has a knock-on effect for topics such as age assurance.
  • There are some topics in the DSA Guidelines that would be better tackled elsewhere and given further thought – for example, some of the consumer law risks highlighted would likely be better housed in dedicated consumer legislation that applies to all businesses whose services are accessed by children, not just those that are ‘online platforms’.

The above is just a snapshot of the similarities and differences between the OSA and DSA measures on the protection of children – if you’d like to know more, or if you need help on responding to the DSA Guidelines consultation, please get in touch.