UK Online Safety Bill – practical pointers for games companies

The Online Safety Bill (OSB) – the UK’s flagship online safety legislation – is due to become law in the coming weeks. It is a piece of regulation to consider for any games company that develops games or products containing ‘user-to-user content’ (e.g. chat functionality, forums, and user-generated content more broadly). The OSB has teeth – companies that don’t comply with the rules can be hit with fines of up to £18m or 10% of their global annual revenue, whichever is greater.

The OSB seeks to regulate ‘search services’ and ‘user-to-user’ services. Such services will have to address illegal content and, in the case of children (i.e. those under 18 years old), ‘legal but harmful’ content – broadly defined as content that presents “a material risk of significant harm”.

To be in scope, the service has to have either a “significant number” of users in the UK (what constitutes a “significant number” is currently unknown) or have the UK as one of its target markets. Therefore, businesses outside the UK can still be in scope of the OSB.

The OSB has two broad buckets of services – ‘uncategorised’ and ‘categorised’; whether a business will be ‘categorised’ will be determined by secondary legislation. However, we anticipate that most online games companies are likely to be considered ‘uncategorised’ services, which have fewer obligations under the OSB (albeit still not insignificant ones). Some of the largest games companies may be categorised as Category 1 or 2B – those businesses will need to comply with additional requirements, including producing transparency reports.

Ofcom is the UK’s communications regulator and will be taking on an additional mandate as the country’s online safety regulator. As part of its duties, Ofcom is obliged to produce several codes of practice under the OSB.

Ofcom is expecting to publish the first set of draft codes “very shortly”. The idea is that these draft codes and the accompanying consultations will start fleshing out what is required of services caught by the OSB, including what factors may associate services with higher risks of online safety harm. They will also give stakeholders the chance to provide input.

It’s worth taking a look at our blog on the Interactive Services Model (a piece of research published by Ofcom in spring 2023), as it gives some interesting insights into which parts of online games Ofcom is likely to be particularly interested in as part of its OSB work. More broadly, games companies should be prepared for the following:

The OSB is heavily built around risk assessments and mitigating any identified risks accordingly. Regulated services will be expected to complete risk assessments around illegal content and child access. If children are likely to access your service, a further child risk assessment will need to be completed (we imagine most, if not all, online games will fall into this camp). The hope is that the ‘likely to be accessed by children’ test will align with the same test under the Children’s Code (as highlighted in the ICO and Ofcom’s 2022 Joint Statement) – see here for the latest guidance from the ICO (the UK’s data protection authority) on this test under the Children’s Code. Reportedly, the ICO and Ofcom are considering a bumper ‘child risk assessment’ which would cover both the Children’s Code and the child-related aspects of the OSB. However, this is likely some way off.

Shortly after the OSB has passed, Ofcom will release several key drafts, including guidance on illegal content risk assessments and the children’s harms risk assessment, which games companies should review once published.

The protection of children is key to the OSB, so age assurance measures will also be a focal point for Ofcom. However, this is a topic being tackled on several fronts by various regulators. The ICO and Ofcom highlighted in their Joint Statement that “developing an aligned approach on age assurance has been a priority for our joint work”.

As a result, Ofcom is unlikely to require anything more stringent than what we have seen under the Children’s Code, unless the risks on a particular service are particularly grave (and other measures cannot reduce them sufficiently) – indeed, Schedule 4 of the OSB requires the standards of the Children’s Code to be taken into account.

This is also an area on which Ofcom has said it will release draft guidance – albeit focused on providers of online pornographic content – so it looks quite possible that other online services will not be the main point of focus (at least initially).

For more of a games focus in the interim, it’s worth keeping an eye on the ICO’s age assurance research (often done in partnership with Ofcom) as well as the age assurance panel being convened by Ukie as part of the self-regulatory work on loot boxes.

Another core theme of the OSB is the prevention of illegal and harmful content, including the use of “proactive technology” to assist with this. It’s also important to think about the removal of such content if it does appear. A few points games companies may want to start thinking over:

  • What systems do you have in place to limit the distribution of illegal content (and content that is ‘legal but harmful’ to children) on your service? For example, is user-generated content on your service scanned against existing child sexual exploitation and abuse (CSEA) prevention databases?
  • Are your reporting tools effective? Are players able to report at a granular enough level (e.g. can a user report an individual instance of user-generated content, or only a whole server)? See the sketch after this list.
  • What training might your support and community teams require in order to identify and deal with any reports that may trigger OSB obligations? How are more serious reports escalated? This may tie into work you’ve already done under standard 6 of the Children’s Code.
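
As a purely illustrative sketch of the granularity point above – every name, category and routing rule here is our own assumption, not anything prescribed by the OSB or Ofcom – a reporting model might let players flag an individual piece of user-generated content (rather than a whole server) and route the most serious categories straight to a human escalation queue:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class ReportCategory(Enum):
    # Illustrative categories only – map these to your own moderation policy.
    SPAM = "spam"
    HARASSMENT = "harassment"
    CSEA = "csea"            # child sexual exploitation and abuse
    TERRORISM = "terrorism"


# Hypothetical rule: the most serious categories skip the normal queue
# and go straight to a human trust-and-safety team.
ESCALATED = {ReportCategory.CSEA, ReportCategory.TERRORISM}


@dataclass
class PlayerReport:
    reporter_id: str
    content_id: str   # the specific piece of UGC, not just the server it sits on
    server_id: str    # kept for context; the report still targets one item
    category: ReportCategory
    free_text: str = ""
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


def route_report(report: PlayerReport) -> str:
    """Return the name of the queue a report should land in.

    A real system would also preserve evidence, tell the reporter the
    outcome, and feed into the records your risk assessments rely on.
    """
    if report.category in ESCALATED:
        return "trust-and-safety-escalation"
    return "standard-moderation-queue"
```

The point of the sketch is the granularity: each report carries a content_id, so your support team (and the records your risk assessments rely on) can act on individual items rather than whole servers, and the most serious reports never sit in the general queue.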

For those games companies already thinking about their compliance under the EU’s Digital Services Act, it’s worth thinking about whether any work you’ve been doing there will complement any requirements under the OSB.

An area that is going to require some finessing is how all of these information requirements fit into your game EULA, since the OSB requires several pieces of information to be detailed there. This will include specifying how users are protected from illegal content (including any proactive technology you use), how children are protected, how your complaints procedure operates, and more.

Again, with the requirements around fair Terms & Conditions under the Digital Services Act, there are plenty of reasons to revisit your EULA.

“Safety by design” is a term that has already started to appear throughout the Online Harms legislative process, and it is something games companies should consider for new products and services in the pipeline (as well as how it fits alongside any “privacy by design” practices you already have). Essentially, online safety should be an end-to-end experience: you’ll need to assess your user flows carefully and consider at what points issues may arise – again, check out the interactive service model map mentioned above for inspiration here.
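
To make the “assess your user flows” point a little more concrete, here is a minimal, hypothetical sketch – the surfaces and questions below are illustrative assumptions on our part, not Ofcom requirements – of annotating each user-facing surface in a game with the safety questions it raises:

```python
# A lightweight way to walk through a game's user-facing surfaces and record
# the online safety questions each one raises – the sort of input a child or
# illegal-content risk assessment might draw on. Every entry below is an
# illustrative assumption, not an Ofcom requirement.

USER_FLOWS = {
    "text_chat": [
        "Is chat filtered or scanned for illegal content?",
        "Can child accounts receive messages from strangers by default?",
    ],
    "ugc_upload": [
        "Is content checked (e.g. hash-matched) before or after publication?",
        "How quickly can reported content be taken down?",
    ],
    "friend_requests": [
        "Can adult accounts contact child accounts by default?",
    ],
}


def review_flows() -> None:
    """Print the open safety questions for each user-facing surface."""
    for flow, questions in USER_FLOWS.items():
        for question in questions:
            print(f"[{flow}] {question}")


if __name__ == "__main__":
    review_flows()
```

Even something this simple can act as a living checklist: each new feature adds a surface, and each surface should arrive with its safety questions answered before launch.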

Alongside thinking about the above, if you want to get ahead, there are already interim codes of practice on child sexual exploitation and abuse and on terrorist content and activity online. It’s worth benchmarking your current products against these codes and thinking about where you may have gaps.

(Of course, speaking to a friendly video games lawyer will also help guide you through this process.)