Illegal Harms Consultation (Online Safety Act) – overview for games companies

Coming in at over 1,600 pages, Ofcom’s online illegal harms consultation is the first of four major consultations expected over the next year on the Online Safety Act (OSA). The consultation contains Ofcom’s detailed draft guidance on risk assessments, illegal content, governance (including record keeping and review), content communicated “publicly” and “privately”, and enforcement. This blog post focuses primarily on the risk assessment and illegal content portions of the consultation.

A consultation on the assessments relating to children’s (‘legal but harmful’) harms is expected this spring.

(If you want an introduction to the Online Safety Act, take a look at our previous blog post.)

Before getting into the detail, three key concepts are worth defining:

Risk Profiles – lists of characteristics and features of an online service that may increase the likelihood of a certain type of illegal harm occurring on it (e.g. anonymous user profiles may increase the risk of hate speech). The full profiles are in Appendix A of Annexe 5.

‘Priority’ illegal harms – a list of illegal harms for which every service captured by the OSA must complete a risk assessment. Volume 2 contains the detail on these.

Register of Risks – an overview of the offences underlying each ‘priority’ illegal harm (more on these below). For example, for ‘controlling or coercive behaviour’ it points to the Serious Crime Act 2015. The register is in Appendix B of Annexe 5.

Ofcom is suggesting a four-step process for assessing illegal harms:

(1) Understand the harms. Identify how the priority harms (detailed below) may be manifesting on your service and consult the Risk Profiles as part of this. As well as assessing the priority illegal harms, you will need to consider whether any ‘non-priority’ harms are relevant to your game.

(2) Assess the risk of harm. This involves considering the likelihood, impact and risk level to your users. To make this assessment, you’ll need to consider:

  • The Risk Profiles (which you’ll have consulted in Step 1) and whether any features / characteristics of your service not included in the Risk Profiles may add to risk levels. More detail on the risk factors specific to each illegal harm is set out in Volume 2.
  • Your core evidence (user complaints, user data, post-mortems of previous incidents of harm and any other information you already hold).
  • If you don’t have sufficient core evidence (or if your risk landscape is complex!), you can also consider enhanced evidence (e.g. external experts, consulting with users, product testing).
  • The additional guidance from Ofcom on CSEA and grooming.
  • Any systems / processes in your game that may decrease risk (for example – internal governance, use of proactive technology, how you promote user media literacy).

You will then need to give each harm a ‘low’, ‘medium’ or ‘high’ risk rating – to help with this, Ofcom provides a risk matrix and risk level table, including specific ones for CSEA and grooming (albeit these are not definitive criteria).
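
To make the matrix concrete, here is a minimal Python sketch of the kind of likelihood × impact lookup involved. The mapping below is purely illustrative – it is not Ofcom’s actual draft matrix – and all names are hypothetical.

```python
# Illustrative only: a simple likelihood x impact lookup in the style of
# a risk matrix. The mapping below is hypothetical, not Ofcom's actual
# draft matrix (or its CSEA / grooming variants).

LEVELS = ("low", "medium", "high")

# Hypothetical matrix: keys are (likelihood, impact) pairs.
RISK_MATRIX = {
    ("low", "low"): "low",
    ("low", "medium"): "low",
    ("low", "high"): "medium",
    ("medium", "low"): "low",
    ("medium", "medium"): "medium",
    ("medium", "high"): "high",
    ("high", "low"): "medium",
    ("high", "medium"): "high",
    ("high", "high"): "high",
}

def rate_harm(likelihood: str, impact: str) -> str:
    """Return a 'low' / 'medium' / 'high' risk rating for one harm."""
    if likelihood not in LEVELS or impact not in LEVELS:
        raise ValueError("likelihood and impact must be low/medium/high")
    return RISK_MATRIX[(likelihood, impact)]
```

Under this illustrative mapping, rate_harm("high", "medium") would produce a ‘high’ rating – in practice you would of course work from Ofcom’s own matrix and risk level table.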

If you have two or more illegal harms with a ‘medium’ or ‘high’ risk rating, under the current consultation the game would be considered a ‘multi-risk’ service. This will affect the number of recommended mitigation measures in step (3) below (as will being a ‘large’ service).

The above assessment process is currently the same regardless of your service type and size. A ‘large’ service is provisionally defined as one with more than 7 million monthly active users in the UK – importantly, being a ‘large’ service in relation to illegal harms does not necessarily mean you’re a ‘categorised’ service under the OSA more broadly.
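
Again as a sketch only – the thresholds come from the consultation, but the function names and structure are hypothetical – these two tests reduce to very simple logic:

```python
# Illustrative only: the 'multi-risk' and 'large service' tests as we
# read the draft guidance. Thresholds are from the consultation; the
# function names and structure are hypothetical.

LARGE_SERVICE_THRESHOLD = 7_000_000  # UK monthly active users (draft figure)

def is_multi_risk(ratings: dict[str, str]) -> bool:
    """'Multi-risk': two or more harms rated 'medium' or 'high'."""
    return sum(r in ("medium", "high") for r in ratings.values()) >= 2

def is_large_service(uk_monthly_active_users: int) -> bool:
    """'Large' for illegal harms: more than 7m UK monthly active users.
    Note: 'large' here does not automatically mean 'categorised'."""
    return uk_monthly_active_users > LARGE_SERVICE_THRESHOLD
```

So a game rated {"grooming": "medium", "hate offences": "high", "fraud": "low"} would be multi-risk regardless of its size.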

(3) Decide on measures, implement and record outcomes.

Once you’ve assessed the risks, you will then need to consider how you might mitigate these risks (the Ofcom Codes of Practice on terrorism and CSEA may help here).

This will include having: (i) a named person in your company accountable for safety duties; (ii) content moderation that allows for swift takedown of illegal content; (iii) a complaints system that is easy to find, access and use; and (iv) clear Terms of Service on these topics.

You will also need to consider record keeping and governance measures (e.g. how you will record all of this information and keep it up to date).

(4) Report to relevant governance channels, monitor effectiveness and review. Once the risk assessment is complete and has been reported internally, the basic rule is that it should be reviewed annually. However, it should also be reviewed if the game undergoes a material update or change that may affect the assessment, or if Ofcom updates its Risk Profiles in a way that affects how the risks on your service are assessed.
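
As a final sketch (same caveats as above: hypothetical names, our reading of the draft guidance), the review triggers boil down to one check:

```python
# Illustrative only: review at least annually, and sooner on a material
# change to the game or a relevant update to Ofcom's Risk Profiles.
# All names are hypothetical.

from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=365)

def review_due(last_review: date,
               material_change: bool = False,
               risk_profiles_updated: bool = False,
               today: date | None = None) -> bool:
    """Return True if the risk assessment should be reviewed now."""
    today = today or date.today()
    return (today - last_review >= REVIEW_INTERVAL
            or material_change
            or risk_profiles_updated)
```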

Video games are seen as particularly relevant for terrorism (as a recruitment tool), grooming, harassment / stalking / abuse, threatening communications and hate offences. It is worth noting that although these five illegal harms specifically flag video games as an at-risk service type, every service needs to assess against ALL priority illegal harms in its risk assessment, as well as any “non-priority” offences where there is a particular risk of these on your service.

The full ‘priority’ illegal harms list is as follows.

  • Terrorism offences
  • Child Sexual Exploitation and Abuse (CSEA) – which is broken down into grooming and Child Sexual Abuse Material (CSAM) and should be assessed separately.
  • Encouraging suicide / attempted suicide / serious self-harm
  • Harassment / stalking / threats / abuse offences
  • Hate offences
  • Controlling or Coercive Behaviour (CCB)
  • Drugs and psychoactive substances offences
  • Firearms and other weapons offences
  • Unlawful immigration and human trafficking offences
  • Sexual exploitation of adults
  • Extreme pornography
  • Intimate Image Abuse
  • Proceeds of crime
  • Fraud & financial services offences
  • Foreign interference

In all cases, the question is not just whether illegal content might appear on your service, but whether your service could be used to commit or facilitate such an offence.

A few practical points to note:

  1. It’s important to assess whether you have sufficient ‘core evidence’ readily available – if not, you may not be able to complete a sufficient risk assessment (albeit you can then move to ‘enhanced evidence’).
  2. The final illegal harms guidance is not expected until the end of 2024 (at which point companies will have three months to complete their risk assessments). Despite this, it is not a bad idea to start assessing the risks of your game now (essentially, a ‘WIP’ risk assessment) to see where you might have particular issues or mitigation gaps.
  3. If you are a games company likely to be in scope of the OSA for illegal harms, you will likely be in scope for children’s harms too. Getting started on the illegal harms work now therefore means less of a squeeze at the back end of 2024 / start of 2025.

If this blog post has whetted your appetite, we would suggest reading the consultation in the following order (excluding the parts relating to search services and enforcement):

  1. Overview – summarises the main themes of the consultation.
  2. A summary of each chapter – gives a high-level summary of the key proposals and the input Ofcom wants from stakeholders.
  3. Volume 1 – an overview of how services will be regulated under the OSA and who these services are likely to be.
  4. Annexe 5 – this walks through the process of an illegal harms risk assessment (there is also further rationale on the risk assessment process in Volume 3, below). It also contains the:
    • Risk matrix and risk level table.
    • Risk Profiles in Appendix A.
    • Register of Risks in Appendix B.
  5. Volume 2 – this goes into more detail on the 15 types of illegal harms, including the service risk factors specific to each illegal harm.
  6. Volume 5 – this goes into detail on how to assess whether content is actually illegal. Annexe 10 goes into this in even more detail.
  7. Volume 4 – this covers the illegal content codes of practice. Annexe 7 is also relevant here.
  8. Volume 3 – this covers who should carry out a risk assessment and also discusses record keeping and review (see also: Annexe 6).
  9. Annexes 1–4 – these compile the 55 consultation questions.