Ofcom publishes new guidance for video-sharing platform providers on measures to protect users from harmful material

On 1 November 2020 the Audiovisual Media Services Regulations 2020 came into force in the UK. The new Regulations implemented certain provisions of the revised Audiovisual Media Services Directive (2010/13/EU) by amending the Broadcasting Acts 1990 and 1996 and the Communications Act 2003. In particular, Part 4 of the new Regulations inserted a new Part 4B into the 2003 Act, giving Ofcom the role of regulating video-sharing platforms (VSPs) for the first time.

VSPs established in the UK, such as TikTok, Snapchat, Vimeo and Twitch, are now required by law to take measures to protect under-18s from potentially harmful video content, and to protect all users from videos likely to incite violence or hatred, as well as from certain types of criminal content.

Between 24 March 2021 and 2 June 2021, Ofcom consulted on draft guidance for VSPs on the new regulatory requirements, covering the measures set out in the statutory framework and how they might be implemented. The draft guidance included, among other things, information on:

  • having, and enforcing, terms and conditions for harmful material;
  • having, and effectively implementing, flagging and reporting mechanisms; and
  • applying appropriate age assurance measures to protect under-18s, including age verification for pornography.

Having considered all the responses to the consultation, Ofcom has now finalised and published the guidance. The purpose of the guidance is to help companies understand their new obligations and judge how best to protect their users from harmful material. Ofcom has also published a short explainer guide for industry on the new framework for VSPs.

Ofcom notes that its job is to enforce the rules set out in the legislation and hold VSPs to account. Unlike in its broadcasting work, however, its role is not to assess individual videos; the sheer volume of online content means it would be impossible to prevent every instance of harm. Instead, the legislation focuses on the measures providers must take, as appropriate, to protect their users. To help providers meet these obligations, the guidance sets out Ofcom's expectation that VSPs should:

  • provide clear rules around uploading content: uploading content relating to terrorism, child sexual abuse material or racism is a criminal offence; platforms should have clear, visible terms and conditions prohibiting such material, and should enforce them effectively;
  • have easy reporting and complaint processes: companies should implement tools that allow users to flag harmful videos easily; they should signpost how quickly they will respond, and be open about any action taken; providers should offer a route for users formally to raise concerns with the platform, and to challenge their decisions; Ofcom says that this is vital to protect the rights and interests of users who upload and share content; and
  • restrict access to adult sites: VSPs that host pornographic material should have robust age-verification in place to protect under-18s from accessing such material.

Ofcom says that its five priorities for the year ahead, as set out in its workplan for VSPs, are:

  • working with VSPs to reduce the risk of child sexual abuse material;
  • tackling online hate and terror;
  • ensuring an age-appropriate experience on platforms popular with under-18s;
  • laying the foundations for age verification on adult sites; and
  • ensuring VSPs’ processes for reporting harmful content are effective.

Ofcom’s statement in full, the guidance, its news release and its workplan for VSPs are all available on the Ofcom website.