EU and UK regulators address platforms concerning disinformation and harmful content arising out of the conflict in Israel and Gaza

In the UK, video sharing platforms (“VSPs”), a type of online service allowing users to upload and share videos publicly, must register with Ofcom and are under a legal obligation to protect minors from content that may seriously impair their physical, mental or moral development, and to protect the general public from harmful content. These rules will be replaced in due course by the new UK Online Safety Act.

On 11 October 2023, Ofcom wrote to UK VSPs reminding them of their obligations as they might apply to material stemming from the developing situation in Israel and Gaza, such as terrorist videos and videos that incite hatred or violence. This includes the provision of systems and processes to anticipate and respond to the potential spread of harmful material.

In the EU, similar rules are contained in the Digital Services Act (“DSA”), which requires a Very Large Online Platform (“VLOP”, a platform with 45 million or more average monthly active users in the EU) to assess and mitigate systemic risks arising from the design or functioning of its services, including the dissemination of illegal content and negative effects on civic discourse, public security, gender-based violence, the protection of public health and the protection of minors. Under the DSA, the EU Commission can require a VLOP to implement a crisis response mechanism where extraordinary circumstances lead to a serious threat to public security or public health in the EU. The DSA also allows the European Commission to send a VLOP a request for information relating to its investigation of a suspected DSA infringement, and the Commission can impose fines for failure to comply.

In a published letter to X (previously known as Twitter) on 10 October 2023, EU Commissioner Thierry Breton reminded X that its terms must make clear what is permitted on the platform and that those terms must be enforced. The letter alleges that recent changes to X’s public interest policies left many EU users uncertain. It also reminded X that, when it receives notices of illegal content, it must respond in a timely, diligent and objective manner, and noted that the Commission has heard, from qualified sources, that potentially illegal content is circulating on X despite flags from relevant authorities. Finally, the letter reminded X that it must have proportionate and effective mitigation measures in place to protect public security and civic discourse, and pointed to widely reported instances of fake and manipulated images and facts circulating on the platform, including military footage originating from video games.

On 12 October 2023, the EU Commission formally sent X a request for information pursuant to the DSA, seeking information on its policies and notices on illegal content, complaint handling, risk assessments and mitigation measures. According to reports, X must respond by 18 October on its crisis response protocol and by 31 October on the other issues. Once the response has been assessed, the Commission can decide whether to open formal enforcement proceedings.
