March 21, 2022
The Government says that the Online Safety Bill, which it introduced to Parliament on 17 March 2022, “marks a milestone in the fight for a new digital age which is safer for users and holds tech giants to account”. According to the Government, the proposed legislation will protect children from harmful content, such as pornography, and limit people’s exposure to illegal content, while protecting freedom of speech.
The new legislation will impose legal requirements on:
- providers of internet services that allow users to encounter content generated, uploaded or shared by other users, i.e. user-generated content (User-to-User Services);
- providers of search engines which enable users to search multiple websites and databases (Search Services); and
- providers of internet services on which "provider pornographic content" (pornographic content published or displayed by the provider itself, rather than by users) appears.
The legislation will require providers of User-to-User and Search Services to:
- assess their user base and the risks of harm to users of the service;
- take steps to mitigate and manage the risks of harm to individuals arising from illegal content and activity, and (for services likely to be accessed by children) content and activity that is harmful to children;
- put in place systems and processes which allow users and affected persons to report specified types of content and activity to the service provider;
- establish a transparent and easy-to-use complaints procedure which allows for complaints of specified types to be made;
- have regard to the importance of protecting users’ legal rights to freedom of expression and protecting users from a breach of a legal right to privacy when implementing safety policies and procedures; and
- put in place systems and processes designed to ensure that detected but unreported child sexual exploitation and abuse (CSEA) content is reported to the National Crime Agency (NCA).
Those User-to-User Services that meet the Category 1 threshold conditions, specified by the Secretary of State, will be subject to additional legal requirements, including to:
- set clear and accessible provisions in terms of service explaining how content that is legal but harmful to adults will be treated, and apply those provisions consistently;
- carry out an assessment of the impact that safety policies and procedures will have on users’ legal rights to freedom of expression and users’ privacy;
- specify in a public statement the steps taken to protect users’ legal rights to freedom of expression and users’ privacy;
- put in place systems and processes designed to ensure that the importance of the free expression of content of democratic importance is taken into account when making decisions about how to treat such content;
- put in place systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about how to treat such content;
- put in place a dedicated and expedited complaints procedure that ensures that the decisions of the service provider to take action against a user in relation to a particular piece of journalistic content can be challenged;
- offer optional user verification and user empowerment tools on their sites; and
- put in place proportionate systems and processes to prevent the risk of users encountering fraudulent adverts.
Those Search Services which meet the Category 2A threshold conditions will be under a duty to produce annual transparency reports and to put in place proportionate systems and processes to prevent the risk of users encountering fraudulent adverts.
The Bill also confers new powers on Ofcom enabling it to act as the online safety regulator. Ofcom will be responsible for enforcing the legal requirements imposed on service providers. The Bill gives Ofcom:
- the power to compel in-scope providers to provide information and to require an individual from an in-scope provider to attend an interview;
- powers of entry and inspection; and
- the power to require a service provider to undertake, and pay for, a report from a skilled person.
Ofcom will also have the power to give enforcement notifications (which may set out the steps required to remedy a contravention) and the power to impose financial penalties of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater. If a service provider fails to comply with a confirmation decision, Ofcom can, in certain circumstances, apply to the courts for an order imposing business disruption measures on that provider.
In addition, the Bill requires Ofcom to produce codes of practice for service providers, setting out the recommended steps that providers can take in order to comply with the new legal requirements. A provider may take different measures to those recommended in the codes of practice. A provider will be treated as having complied with the relevant legal obligation if it takes the steps recommended in the relevant code of practice for complying with that obligation.
The Bill also requires providers of internet services which make pornographic material available by way of the service (as opposed to enabling users to generate or share such content) to ensure that children are not normally able to encounter that pornographic content.
The Bill also replaces existing communications offences with three new offences:
- a harmful communications offence;
- a false communications offence; and
- a threatening communications offence.
The Government has also introduced a new “cyberflashing” offence into the Bill (see here) and has announced that executives whose companies fail to cooperate with Ofcom’s information requests could face prosecution or jail time two months after the Bill becomes law, rather than two years as previously drafted. To access the Bill, click here. To read the Government’s press release, click here. To access all supporting documents, including the Government’s impact assessment and the opinion of the Regulatory Policy Committee, click here.