November 8, 2021
In July 2021, the House of Lords Communications and Digital Committee published its report, “Freedom for all? Freedom of expression in the digital age”, in which it said that the Government’s plans to address “legal but harmful” online content would threaten freedom of speech and be ineffective.
The Committee welcomed the Government’s Online Safety Bill proposals to oblige digital platforms to remove illegal content and protect children from harm, but warned that the draft legislation is “flawed” in relation to keeping children off porn sites and said that platforms must ensure they do not over-remove content.
The Government’s response addresses each of the Committee’s points in detail. The following sets out some of the key responses to the Committee’s main concerns.
On the removal of harmful but legal content by platforms, the Government agrees with the Committee that platforms should “not seek to be the arbiters of truth”. The Government says that its approach to harmful content accessed by adults that falls below the criminal threshold has been designed to protect freedom of expression and will not require companies to remove legal content.
Where harmful content does not cross the criminal threshold, the biggest platforms (Category 1 services) will be required to set out what is and is not acceptable in their terms of service and enforce those rules. Clause 11(3) of the draft Bill requires them to apply the rules consistently, so that they do not remove content which is not prohibited in their terms of service. In addition to the platforms being able to identify harmful content through risk assessments, the Government will set categories of priority harmful content that companies must address in their terms of service. However, companies will be free to determine their own policies for how they treat such content, so long as these are clearly understandable to users. In the Government’s view, this is the best way to identify legal, but harmful, content and provide certainty to businesses whilst ensuring the legislation remains flexible to emerging harms and risks.
As for illegal content, the Government says that it is currently considering the Law Commission’s recommendations for reform of the communications offences and that the draft Bill includes “strong safeguards for freedom of expression which will minimise the risk of platforms removing legal content to comply with their new responsibilities”.
Under the Law Commission’s recommendations, the Government says that content amounting to a communications offence would fall into the “illegal content” category of the online harms framework, meaning that all companies in scope would need to comply with the law in the same way they would for any other illegal material, as defined in clause 41 of the draft Bill. Platforms will need to act where they have reasonable grounds to believe that content amounts to a relevant offence. They will need to ensure their content moderation systems are able to decide whether something meets that test. However, platforms will not be penalised if they make a wrong call on whether certain content is illegal, so long as they have put appropriate systems and processes in place. When identifying illegal content, companies will be able to draw on Ofcom’s codes of practice and any supplementary guidance.
Further, it says, clause 12(2) of the draft Bill requires that companies consider the importance of protecting users’ right to freedom of expression within the law when fulfilling their duties, including when designing systems to identify and remove illegal content. Providers of Category 1 services will have a duty to assess the impact that their safety policies would have on users’ right to freedom of expression and must set out any positive steps taken to secure that right. In the Government’s view, these duties will reduce the risk that platforms over-remove content.
The Government also says that the online safety regime will capture video-sharing sites, forums and content shared via image or video search engines, and pornography on social media. All pornography sites which host user-generated content or facilitate online user interactions (including video and image sharing, commenting and live streaming) will be in scope of the legislation.
Further, companies will be expected to use age verification technologies to prevent children from accessing services which pose the highest risk of harm to children, such as online pornography. Companies will need to put these technologies in place, or demonstrate that their alternative approach delivers the same level of protection for children, or face enforcement action by the regulator.
However, the Government recognises the concerns raised about protecting children from online pornography on services that do not currently fall within the scope of the Bill. It says that the DCMS Secretary of State will “use the pre-legislative scrutiny process to explore whether further measures to protect children are required”.