Online Safety Bill: Joint Committee on Draft Online Safety Bill publishes report


In its report, the Joint Committee recommends major changes to the Online Safety Bill, which is due to be put to Parliament for approval in 2022.

The new law aims to make internet service providers responsible for what happens on their platforms, including serious harms such as child abuse, fraud, racist abuse, the promotion of self-harm, and violence against women.

Main conclusions and recommendations:

  • “big tech” has squandered its chance to self-regulate; these companies must obey the new law and comply with Ofcom as the UK regulator, or face sanctions;
  • Ofcom should set the standards by which big tech will be held accountable; its powers to investigate, audit and fine the companies should be increased;
  • Ofcom should draw up mandatory Codes of Practice for internet service providers, e.g., on risk areas such as child exploitation and terrorism; Ofcom should also be able to introduce additional Codes as new features or problem areas arise, as technology develops;
  • Ofcom should require service providers to conduct internal risk assessments to identify reasonably foreseeable threats to user safety, including the potential harmful impact of algorithms, as well as content;
  • the new regulatory regime must contain robust protections for freedom of expression, including an automatic exemption for recognised news publishers, and acknowledge that journalism and public interest speech are fundamental to democracy;
  • the Bill should cover paid-for advertising in order to tackle scams and fraud; and
  • service providers should be required to have an Online Safety Policy that users agree to, similar to their conditions of service.

The Committee also says that the Bill should be clearer about what is specifically illegal online, and that it should not be left to the tech companies to determine this. The Committee therefore agrees with the Law Commission’s recommendations to add new criminal offences to the Bill, including: cyberflashing; deliberately sending flashing images to people with photosensitive epilepsy with the intention of inducing a seizure (known as Zach’s law); and content or activity promoting self-harm.

The Committee also recommends imposing legal duties on pornography sites to keep children away, regardless of whether they host user-to-user content.

Individual users should also be able to complain to an ombudsman when platforms fail to comply with the new law. Further, each regulated service provider should designate a senior manager at board level, or reporting to the board, as its “Safety Controller”. The Safety Controller would be liable for a new offence of failing to comply with the provider’s obligations where there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users.

The Committee’s news release, with a link to the full report, is available on the UK Parliament website.