January 15, 2026
What is the ‘Year One’ Online Safety Risk Assessment report?
In December 2025, Ofcom published its findings on the 104 illegal content and children's risk assessments submitted by supervised services in 2025 under the Online Safety Act (OSA).
The concise 27-page report highlights good practices, common shortcomings, and Ofcom's expectations for 2026. It details how providers should assign risk levels, use evidence, document controls, and govern online safety risk. Below we set out our key takeaways and highlight where in-scope services (in particular video games companies) can improve their risk assessments for 2026.
What are the key takeaways from the report?
1. ‘Low’ and ‘negligible’ risk levels must be robustly explained and evidenced. Ofcom expects providers to record more robust, evidence-based explanations for ‘low’ risk level assignments, especially where multiple risk factors have been identified for a specific type of illegal or harmful content. Ofcom’s view is that a ‘negligible’ risk level should only be used where “evidence shows it is not possible or extremely unlikely” that this kind of harm “takes place by means of [the] service,” and should not be considered the default.
A ‘good practice’ case study in the report focuses on an instance where a video games company got its risk level assignment right: despite having user controls and moderation in place, the company concluded there was a ‘high’ residual risk for illegal ‘hate speech’ content, attributing this to the service’s scale and the significance of voice chat reports.
2. Absence of evidence is not evidence of absence. Ofcom noted that some providers lacked specific evidence to justify a lower risk level. The regulator expects providers to consider whether they have sufficient information to accurately assess the risk for each type of harm (as set out in the Risk Level Tables). If a provider concludes that a lack of data indicates the absence of harm, Ofcom will expect a clear explanation to support that conclusion. See more on evidence at point 4!
3. Children’s risk assessments must not rely on self-declared age data. Ofcom noted that too many providers relied on self-declared ages in their children’s risk assessments. Unless a provider uses highly effective age assurance to enforce its minimum age, Ofcom expects it to “take a conservative view of the potential number and age of children” and err on the side of caution when assessing risk. Therefore, if strong age assurance is not in place, Ofcom will expect a conservative approach to assessing children’s exposure and risk levels for all regulated user-to-user interactions, such as voice chat, UGC, and recommendation features.
This is likely to be relevant to many games companies in scope of the OSA, as most providers will be looking at their existing self-declaration data to understand their userbase. Although that data can be useful, Ofcom re-emphasises that it expects providers to treat it with a suitable ‘pinch of salt’.
4. Assess ‘impact’ as well as ‘likelihood’, with service size front and center. In 2026, Ofcom expects providers to give greater consideration to the impact of encountering illegal and harmful content when assigning risk levels, particularly how the size of their user base could affect the number of UK users and children impacted. For children’s risk assessments, Ofcom requires a more robust assessment of the potential impact on children, based either on evidence or a cautious approach to risk levels.
Assessing ‘impact’ remains a difficult task for the games industry, as video game companies often lack the data needed to assess the impact on their users compared to providers that naturally capture more user data, such as social media companies or online dating services. Video games companies may therefore need to consider new channels to better understand their users, such as player surveys, focus groups or third-party market research.
5. Internal governance and accountability – Ofcom expects names and structure. Of the 104 risk assessment records Ofcom reviewed, 69 did not name the person responsible for online safety at the company. Ofcom’s guidance makes it clear that naming the responsible person and explaining how the assessment has been reported through internal governance channels is necessary to fulfil the OSA’s record-keeping duties.
In a good practice case study, Ofcom praised a large service provider for outlining its governance structure, noting that the executive team received monthly reports and its board of directors was briefed on trust and safety issues at least annually.
Ongoing governance doesn’t just extend to people and committees; Ofcom expects providers to be monitoring the effectiveness of their own safety measures and controls too. Video games companies should therefore consider whether their safety tools and processes can be sufficiently audited internally, or if support from third-party partners is needed.
6. Don’t just lump all illegal or harmful content into one risk category. Ofcom expects separate assessments and risk levels for each of the 17 kinds of priority illegal content. For user-to-user services, this includes sub‑levels for grooming, image‑based CSAM, and CSAM URLs, plus a separate level for ‘other illegal content’.
Similarly, children’s harm risk assessments should assign separate levels for each primary and priority harm, plus any non‑designated content assessed. If a provider uses internal risk headings, Ofcom expects these to be clearly mapped to the official categories in the OSA. The onus is on providers to ensure each type of illegal content and harm has been assessed and assigned an individual residual risk level, supported by an evidence‑based rationale.
When is the next round of risk assessments due for supervised services, and what else should providers be aware of?
- From what we know of Ofcom’s supervisory activity so far (and from Ofcom’s comments in A3 of the report), the providers whose assessments were captured by this report are those with some of the largest userbases and/or highest risk profiles in their sectors.
- Ofcom will request the next illegal content and children’s risk assessment records from supervised services within a three‑month window between 1 May and 31 July 2026, and will expect to see improvements based on both the feedback in this report and the more granular feedback given to providers directly.
- For providers not currently engaging with Ofcom directly, this report provides a good opportunity to measure their own risk assessments against common problem areas. As highlighted above, the risk assessments that informed this report generally came from the largest or riskiest services. We would therefore hope that expectations on at least some of these topics will be lessened for much smaller and/or lower-risk services – albeit this is not guaranteed.
- To get ahead of this year’s risk assessments, providers may want to review their own evidence base and identify where they have gaps in assessing risk, how long it would take to rectify those gaps, and at what cost. We would recommend re-reviewing the ‘core’ and ‘enhanced’ inputs in the Risk Assessment Guidance and Risk Profiles for illegal harms and children’s harms respectively to see what might be appropriate.
- Finally, providers should be aware that this report was compiled before certain new pieces of guidance were published that may be expected to take a more central role in future assessments, such as the guidance on a safer life online for women and girls.
Our interactive entertainment team has extensive experience helping video games companies navigate the various compliance challenges in the OSA. If you need support or would like to discuss how we can help, please get in touch.