Ofcom’s Chief Executive, Melanie Dawes, says that “the time has come for strong, external oversight of social media companies”

In an article published on 24 November 2021 on Ofcom’s website, Ms Dawes explains why we cannot rely on tech companies to respond to users’ outrage when serious harm takes place. Instead, Ms Dawes says, we need a clear set of rules.

Ms Dawes says that when people spend so much time connected, online safety matters. However, at present, search and social media sites can choose whether to heed the concerns of the public. “If these companies are regulated at all, it is only by outrage. The time has come for strong, external oversight”, Ms Dawes says.

Ms Dawes sees the draft Online Safety Bill, currently being scrutinised by Parliament, as an important piece of legislation, as tech firms will have new duties of care towards their users, which Ofcom will enforce. “We plan to build on our track record of upholding broadcast standards, supporting a range of views and promoting innovation”, Ms Dawes says.

However, the new law does not mean that Ofcom will be regulating or moderating individual pieces of online content. The sheer volume of online content makes that impractical. Further, Ofcom will not act as a censor, prevent robust debate or trample over users’ rights. “Free speech is the lifeblood of the internet. It is a foundation of democratic society, at the heart of public life, and a value that Ofcom holds dear”, Ms Dawes says. Instead, Ofcom’s role will be “to lift the ‘vague and cloudy uncertainty’ that hovers over search and social media”.

When regulating online safety, Ofcom will demand answers to questions such as: how do platforms really work? Why do users see the content they see? What are platforms doing to protect children? And when platforms design their services, is safety the first priority or just an afterthought?

Ms Dawes says that Ofcom will require companies to assess risk with the user’s perspective in mind. Ofcom will require them to explain what they are doing to minimise, and quickly remove, illegal content, and to protect children from harm. Ofcom will hold companies to account on how they use algorithms, address complaints and ensure a safe experience for children. The biggest services will also have to explain how they protect journalistic and democratic content. Currently, “these decisions are being made behind companies’ doors, with no visibility or accountability”, Ms Dawes says.

Ms Dawes explains that Ofcom will also set codes of practice and report publicly on platforms’ performance. If it finds that companies have failed in their duties of care, it will be able to levy fines or audit their work. Ofcom will also work closely with its international counterparts to find global solutions to these challenges.

Ms Dawes says that “Ofcom is gearing up for the job, acquiring new skills in areas like data and technology”. It has also opened a new technology hub in Manchester to help it attract skills and expertise from across the country and beyond. Ms Dawes concludes: “We will be ready; and I believe the new laws will make a genuine difference. By shining that very bright light on immensely powerful companies, we can ensure they take proper care of their users and create a safer life online for everyone.” Ms Dawes’ article is available in full on Ofcom’s website.