Online Safety: Ofcom launches enforcement programme into child sexual abuse material

Ofcom has launched an enforcement programme into child sexual abuse material (“CSAM”) on file-sharing platforms.

It is the latest in a series of enforcement programmes launched by Ofcom this year, including one announced earlier this month into how well regulated services under the Online Safety Act 2023 (“OSA”) comply with their obligations to carry out suitable and sufficient risk assessments (which we discussed here). Last week saw the deadline pass for organisations to carry out illegal content risk assessments, meaning that, as Ofcom puts it, “platforms now have to start implementing appropriate measures to remove illegal material quickly when they become aware of it, and to reduce the risk of ‘priority’ criminal content from appearing in the first place”.

The enforcement programme into online CSAM reflects Ofcom’s recognition that the spread of this type of material can lead to significant harm; tackling it is therefore an early enforcement priority for the regulator.

Under the OSA, providers of regulated user-to-user services are required, among other things, to use proportionate measures to prevent individuals from encountering priority illegal content including CSAM on the service, and to put in place systems and processes designed to minimise the length of time for which such content is on the service.

In its illegal harms codes of practice (commented upon here), Ofcom recognised that the volume of CSAM on many services means that relying on human moderation alone may be insufficient to identify and remove the content. It also pointed to evidence that file-sharing and file-storage services are particularly susceptible to being used for the sharing of CSAM.

As a result, the codes of practice recommended that, where their illegal content risk assessment concludes they are at high risk of hosting image-based CSAM, all file-sharing and file-storage service providers – irrespective of size – should implement so-called ‘perceptual hash-matching’ (a process which aims to identify images that are similar to known CSAM images, and which is therefore more likely to detect CSAM than other forms of hash-matching).
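To illustrate the concept only: a perceptual hash summarises an image’s visual content, so that copies which have been resized, re-encoded or lightly edited still produce near-identical hashes. The Python sketch below implements a basic ‘average hash’ and matches candidates against a set of known hashes by Hamming distance. It is a simplified illustration, not Ofcom’s recommended implementation: the hash function, threshold and helper names are assumptions for demonstration, and production systems rely on far more robust algorithms (such as Microsoft’s PhotoDNA) and curated hash lists of known CSAM.

```python
# Toy illustration of perceptual hash-matching (average hash).
# Real deployments use robust, proprietary algorithms and hash lists
# supplied by child-protection bodies; the hash function and threshold
# here are illustrative assumptions only.

from PIL import Image  # pip install Pillow

HASH_SIZE = 8  # 8x8 grid -> 64-bit hash


def average_hash(path: str) -> int:
    """Downscale to an 8x8 greyscale grid, then set each bit according
    to whether the pixel is brighter than the mean. Visually similar
    images yield hashes that differ in only a few bits."""
    img = Image.open(path).convert("L").resize(
        (HASH_SIZE, HASH_SIZE), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(h1: int, h2: int) -> int:
    """Number of bit positions in which two hashes differ."""
    return (h1 ^ h2).bit_count()  # Python 3.10+


def matches_known_hash(candidate: int, known_hashes: set[int],
                       threshold: int = 5) -> bool:
    """Flag a match if the candidate is within `threshold` bits of any
    known hash. Unlike an exact (cryptographic) hash comparison, this
    tolerates resizing, re-encoding and minor edits."""
    return any(hamming_distance(candidate, k) <= threshold
               for k in known_hashes)
```

By contrast, a cryptographic hash such as SHA-256 changes completely when a single pixel changes, so exact-match hashing catches only identical copies – which is why the codes of practice single out the perceptual variety for image-based CSAM.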

As part of the enforcement programme, Ofcom has written to a number of these services, putting them on notice that they will shortly receive formal information requests about the measures they have taken to address CSAM, and requiring them to submit their illegal harms risk assessments. Ofcom will also work with the services themselves to understand their approaches to detecting image-based CSAM, and with law enforcement agencies and other organisations to “target compliance by the highest risk services”.

Where potential non-compliance is identified, Ofcom will determine whether formal enforcement action may be necessary, and states that it “expect[s] to make additional announcements on formal enforcement action over the coming weeks”.

Commenting on the announcement of the enforcement programme, Suzanne Cater, Enforcement Director at Ofcom, said: “Child sexual abuse is utterly sickening and file storage and sharing services are too often used to share this horrific material. Ofcom’s first priority is to make sure that sites and apps take the necessary steps to stop it being hosted or shared. Platforms must now act quickly to come into compliance with their legal duties, and our codes are designed to help them do that. But, make no mistake, any provider who fails to introduce the necessary protections can expect to face the full force of our enforcement action”.
