The EU Digital Services Act: Upgrading Liability, Responsibility and Safety Online

The European Commission is preparing a proposal for a Digital Services Act (DSA). The DSA will revise the regulation of online platforms, such as social media platforms, search engines and online marketplaces.

The Commission has committed to publishing a consultation on the DSA in the first quarter of 2020, with a legislative proposal (or proposals) to follow before the end of the year. In the light of the COVID-19 pandemic, however, that timetable is now subject to some delay.

The President of the European Commission, Ursula von der Leyen, describes the DSA as follows: “A new Digital Services Act will upgrade our liability and safety rules for digital platforms, services and products, and complete our Digital Single Market.”

In short, the DSA is intended to address the problematic impacts of online platforms: on the media sector, on competition, on consumers, on platform workers and, last but not least, on democracy.

A key question will be the extent to which online platforms are liable for the content their users make available, the products their users sell and the services their users offer – whether that be disinformation, unsafe child car seats, content inciting terrorist offences, defamation, content infringing copyright or any other of the myriad categories of illegal, unlawful and/or harmful things available online. This question is particularly pressing, given that the current rules, set out in the E-Commerce Directive, were adopted in 2000, in a very different platform landscape pre-dating the ubiquity and enormous impact of online platforms today.

Another key part of the proposals is expected to be ex ante regulation of platforms to address their systemic impact.

This short note introduces the DSA, outlines what we can expect from the legislative process, and highlights some of the key issues that the EU legislature will consider, together with their impact on the media sector.

State of Play

At the time of writing, the Digital Services Act is all things to all people; nothing has yet been committed to paper and published. Once the European Commission publishes its legislative proposal, it will be considered by the EU legislature, comprising the European Parliament and the Council, which represents the 27 Member States.

Not content to wait for the draft proposal before commencing work, two committees of the European Parliament have appointed rapporteurs to follow and influence the preparatory phase of the proposal. The Internal Market and Legal Affairs Committees are both expected to publish their reports on the DSA in the summer of 2020, with plenary votes taking place in early autumn. These reports should be useful indicators of the European Parliament’s priorities and concerns. In addition, the Civil Liberties, Justice and Home Affairs Committee will prepare a report considering the fundamental rights aspects of the Digital Services Act.

At the same time, a number of Member States, including Germany, France and Belgium, have adopted or are preparing to adopt legislation in this field, illustrating Member States’ clear appetite to address the question of illegal, unlawful and harmful material made available through platforms, though also raising the spectre of fragmentation of the internal market.

Key Questions

What can we expect from the DSA? At the moment, it appears that regulation will be brought forward under two broad headings: liability (or responsibility) for content and addressing the systemic impact of platforms.

Under the liability heading, these are some of the key issues that are likely to be addressed in the course of the legislative process:

  • Good Samaritan protection for platforms;
  • Addressing the question of anonymity online; and
  • Enforcement.

Good Samaritan

The notion of Good Samaritan protection in the platform context originates from Section 230 of the US Communications Decency Act. Recently, in the EU context, it has been described as a “stewardship” provision, which to many sounds better than importing the controversial US Good Samaritan concept into EU law. (A rose by any other name springs to mind.)

This provision protects platforms when they take voluntary measures to restrict access to, or the availability of, certain content but also, crucially, when they miss such content and take no action at all. Its critics point out that it amounts to a charter of immunity for those who wilfully turn a blind eye.

The rationale for Good Samaritan protection is to encourage platforms to take voluntary proactive measures against illegal, unlawful and harmful content available on their services. In the EU context, this seems misplaced, for two reasons. First, there is already a liability limitation in place (Article 14 of the E-Commerce Directive), which protects platforms to the extent that they (i) are neutral in respect of the illegal content and (ii) remove it when notified of it. It is difficult to see why it would be desirable to grant platforms protection going beyond that. Second, as the European Commission’s Communication on Tackling Illegal Content notes, platforms can and do take voluntary proactive measures against such content, and doing so does not prevent them from benefiting from the liability limitation.

Anonymity

A key issue relating to illegal, unlawful and harmful content online is that the third parties offering that content are often impossible to identify and, where they can be identified, often reside outside the jurisdiction of EU Member State courts.

In short, it is difficult – if not impossible – to prevent the availability of illegal content by pursuing the third parties that make it available. While the protection of anonymity online is clearly crucial in many spheres, a targeted, pragmatic requirement for platforms to verify the identity of certain business partners might form part of the solution to the problem of illegal, unlawful and harmful content online.

Enforcement

It has been noted that there is no real EU-wide enforcement mechanism for the current rules. With a view to strengthening the internal market, the Commission is likely to be looking at measures to ensure more coherent enforcement across the EU. This could include a co-operation mechanism for Member States.