Government publishes report on understanding how platforms with video-sharing capabilities protect users from harmful content online


The Department for Digital, Culture, Media & Sport (DCMS) commissioned consultants EY to review the current landscape of online video-sharing platforms, including an analysis of the recent growth and innovation in the sector. This review was undertaken in the context of new regulation requiring platforms to take appropriate measures to protect their users from certain types of harmful content online.

To form their view, EY undertook two key strands of market research. First, they designed and administered three separate consumer surveys, aimed at children, teenagers and adults respectively, to understand how each group uses platforms with video-sharing capabilities and their awareness of online harms. Second, they interviewed seven, and surveyed 12, online platforms with video-sharing capabilities to better understand how sophisticated their protective measures are and the costs incurred in enforcing them.

The report finds that the measures platforms currently employ to protect their users from harmful content online vary, but can include acceptable use policies, community guidelines, age assurance, age verification, parental controls, user prompts, content moderation, mechanisms for users to flag violative content, and transparency reports published to disclose a range of information relating to content that has been reported to a platform’s moderators.

EY’s research suggests that rather than focusing on individual measures, platforms need to carry out risk assessments to ensure that the suite of measures in place is in line with the specific risks present on the platform. Further, the report finds that platforms that consider themselves likely to be accessed by children tend to report having more effective measures in place to protect their users from harmful content online.

All platforms EY spoke with explicitly stated that illegal content is banned on their services. Most platforms use industry-wide resources, such as content and image scanning software, to prevent the spread of child sexual abuse material.

To access the report, click here.