Information Commissioner’s Office publishes new guidance on content moderation and data protection

On 16 February 2024, the Information Commissioner’s Office (ICO) published new guidance on content moderation and data protection.

The guidance is geared towards online services that will be deemed ‘user-to-user’ services under the Online Safety Act (OSA). Under the OSA, in-scope services have content moderation duties under the safety duties for illegal content (s.10(4)) and, where their service is “likely to be accessed by children”, under the safety duties protecting children (s.12(8)).

The guidance covers both moderating the content itself and any action you take as a result of that moderation (‘moderation action’). Moderation may be automated, semi-automated or carried out entirely by humans.

Below, we’ve picked out five key takeaways from the guidance:

‘Special category’ data is a specific subset of personal data. It tends to be sensitive information such as a person’s biometric data, racial or ethnic origin, or health data (see Art. 9 of the UK GDPR for the full list).

Content moderation may involve processing special category data – users may disclose it in their content, or you may use it (directly or by inference) to support your content moderation decisions. If this is true of your service, you will need to identify both a condition for processing under Art. 9 of the UK GDPR and a lawful basis under Art. 6. In many cases, you will also need an ‘appropriate policy document’ in place (Sch. 1).

As highlighted above, the safety duties under the OSA mean that ‘legal obligation’ will likely cover a fair amount of content moderation activity for online services. However, for content moderation that falls outside this (e.g. wider enforcement of your Terms of Service), ‘legitimate interests’ may be suitable, but this will involve balancing your interests against the data subject’s rights and freedoms.

There are, of course, overlaps here with how online services may want to think about age assurance solutions – see our previous blog on the updated Commissioner’s Opinion on Age Assurance.

‘Necessary for the performance of a contract’ is also briefly touched upon in the guidance – online services will need to ask themselves whether they are processing personal data to fulfil obligations outlined in their terms of service. If such terms are entered into with a child under 18, you will also have to consider whether they have the necessary capacity. In short, the ICO highlights that ‘legitimate interests’ and ‘legal obligation’ are likely to be more suitable.

Following on from the above, relying on ‘legitimate interests’ as a lawful basis is most likely to be appropriate where you use personal data in ways people would ‘reasonably expect’ and where doing so does not have an ‘unjustified adverse impact’ on them. It follows that online services should be clear with users about what types of content are prohibited on their service and why, and what moderation action they will take if such content is found.

Additionally, online services should ensure that their content moderation systems perform accurately and produce unbiased, consistent outcomes. Training for trust and safety teams and moderators should be considered, as should periodic audits of moderation action outcomes to check that personal data is being used consistently and fairly.

If you’re undertaking content moderation by ‘solely automated means’ (i.e. making decisions without any meaningful human involvement), you will need to consider whether this has a “legal or similarly significant effect” on the user. Historically, examples of what this may include were fairly limited (e.g. refusal of a credit application or e-recruiting practices). This guidance, however, adds to those examples, including decisions that “affect someone’s financial circumstances”.

This will therefore be particularly relevant to online services that offer ways for people to monetise their user-generated content. The example given is a content creator who makes video content as their “primary source of income” having a video removed by solely automated means. How online services can assess whether removing user-generated content would have a “significant impact” on a user’s income is, however, not discussed further.

Online services should be clear in advance about what personal data they may need to make illegal content judgements under the OSA (see Annexe 10 of Ofcom’s draft illegal content judgement guidance) – this should be kept under review, with clear guidance provided to moderators. As highlighted in Ofcom’s Annexe 10, the type of personal data needed will likely vary depending on the offence being considered.