Government publishes initial response to the consultation on its Online Harms White Paper

The Online Harms White Paper set out the Government’s intention to improve protections for users online through the introduction of a new duty of care on companies and the appointment of an independent regulator responsible for overseeing this framework. The consultation ran from 8 April 2019 to 1 July 2019 and received over 2,400 responses, ranging from companies in the technology industry, including large tech giants and small and medium-sized enterprises, to academics, think tanks, children’s charities, rights groups, publishers, governmental organisations and individuals. In parallel with the consultation process, the Government says that it has undertaken extensive engagement over the last 12 months with representatives from industry, civil society and others.

The initial government response provides an overview of the consultation responses and wider engagement on the proposals in the White Paper. It includes an in-depth breakdown of the responses to each of the 18 consultation questions and an overview of the feedback received through the Government’s engagement with stakeholders. While it does not provide a detailed update on all policy proposals, the Government says that it gives “an indication of our direction of travel in a number of key areas raised as overarching concern across some responses”.

Freedom of expression:

  • to safeguard freedom of expression, the Government says that regulation will not require the removal of specific pieces of legal content, but will focus on the wider systems and processes that platforms have in place to deal with online harms, while maintaining a proportionate and risk-based approach;
  • regulation will establish differentiated expectations on companies for illegal content and activity, versus conduct that is not illegal, but has the potential to cause harm;
  • regulation will require companies, where relevant, to explicitly state what content and behaviour they deem to be acceptable on their sites and enforce this consistently and transparently;
  • all companies in scope will need to ensure a higher level of protection for children, and take reasonable steps to protect them from inappropriate or harmful content;
  • illegal content will have to be removed expeditiously and the risk of it appearing will have to be minimised;
  • companies will be required to take particularly robust action to tackle terrorist content and online child sexual exploitation and abuse;
  • the regulator will not investigate or adjudicate on individual complaints; and
  • companies will be required to have effective and proportionate user redress mechanisms which will enable users to report harmful content and to challenge content takedown where necessary. These processes will need to be transparent, in line with terms and conditions, and consistently applied.

Ensuring clarity for businesses:

  • to ensure certainty for businesses, the Government will provide guidance to help them understand the potential risks arising from different types of service, and the actions they would need to take to comply with the duty of care as a result; and
  • the Government will ensure that the regulator consults with relevant stakeholders to ensure the guidance is clear and practicable.

Businesses in scope:

  • the legislation will only apply to companies that provide services or use functionality on their websites which facilitate the sharing of user-generated content or user interactions, for example by way of comments, forums or video sharing;
  • to ensure clarity, the regulator will provide guidance to help businesses understand whether the services they provide, or the functionality contained on their websites, would fall within the scope of the regulation;
  • the fact that a business has a social media page does not of itself bring it within the scope of the regulation; it would be the social media platform hosting the content that is in scope; and
  • the Government will introduce the legislation proportionately, minimising the regulatory burden on small businesses.

Identity of the regulator:

  • the Government says it is “minded” to make Ofcom the new regulator, in preference to giving this function to a new body or to another existing organisation. This preference is based on Ofcom’s organisational experience and robustness, and its experience of delivering challenging, high-profile remits across a range of sectors. Ofcom’s focus on the communications sector means it already has relationships with many of the major players in the online arena, and its spectrum licensing duties mean that it is practised at dealing with large numbers of small businesses.

Transparency:

  • the Government says that effective transparency reporting will “help ensure that content removal is well-founded and freedom of expression is protected”. In particular, increasing transparency around the reasons behind, and prevalence of, content removal may address concerns about some companies’ existing processes for removing content;
  • to ensure that these conversations are ongoing, the Government established a multi-stakeholder Transparency Working Group, chaired by the Minister for Digital and Broadband, which includes representation from all sides of the debate, including from industry and civil society. This group will feed into the Government’s transparency report, which was announced in the Online Harms White Paper and which the Government intends to publish in the coming months; and
  • in line with the overarching principles of the regulatory framework, the reporting requirements that a company may have to comply with will also vary in proportion with the type of service that is being provided, and the risk factors involved. To maintain a proportionate and risk-based approach, the regulator will apply minimum thresholds in determining the level of detail that an in-scope business would need to provide in its transparency reporting, or whether it would need to produce reports at all.

Ensuring that the regulator acts proportionately:

  • companies will be expected to take reasonable and proportionate steps to protect users. What is reasonable and proportionate will vary according to, first and foremost, the organisation’s size and the resources available to it, as well as the risk associated with the service it provides; and
  • to ensure clarity about how the duty of care could be fulfilled, the Government will ensure there is sufficient clarity in the regulation and codes of practice about the applicable expectations on business, including where businesses are exempt from certain requirements due to their size or risk.

Enforcement:

  • the Government says that it recognises the importance of the regulator having a range of enforcement powers that it uses in a fair, proportionate and transparent way;
  • it is equally essential that company executives are sufficiently incentivised to take online safety seriously and that the regulator can take action when they fail to do so; and
  • the Government is considering the responses to the consultation on senior management liability and business disruption measures and will set out its final policy position in the spring.

Protection of children:

  • under the proposals, the Government expects companies to use a proportionate range of tools, including age assurance and age verification technologies, to prevent children from accessing age-inappropriate content and to protect them from other harms. This would achieve the objective of protecting children from online pornography, and would also fulfil the aims of the Digital Economy Act.

As for the next steps, the Government says that, while preparation of the legislation continues, and in addition to the full response to be published in the spring, it is developing other wider measures to ensure progress on online safety now. These will include interim codes of practice, a government transparency report, non-legislative measures and wider regulation and governance of the digital landscape. To read the Government’s initial response in full, click here.
