DSA update: platform liability

The European Commission is expected to present its proposals for the Digital Services Act (DSA) and the Digital Markets Act (DMA) in early December. According to some reports, the initial date of 2 December has slipped to 9 December, or possibly even 16 December. Here is an update on recent DSA-related developments and what we can expect on the key question of platform liability.

When can we expect it?

Following the public consultation period, which ran from 2 June 2020 to 8 September 2020, the Commission has been evaluating the more than 3,000 consultation responses and working on its proposal and related Impact Assessments (early drafts of which were leaked at the end of September, raising the question of the extent to which the consultation responses have in fact been evaluated).

Recent developments

The EU Parliament’s Internal Market and Consumer Protection Committee (IMCO) and Legal Affairs Committee (JURI) have approved two “legislative initiative” reports calling on the Commission to adopt, within the DSA package, stronger rules tackling illegal content online and rules giving users increased control over what they see online (in the context of content curation and targeted advertising).

In relation to platform liability the IMCO report called for:

  • all digital service providers established in third countries to adhere to the DSA rules when their services are aimed at consumers or users in the EU;
  • a binding ‘notice-and-action’ mechanism to enable users to notify online platforms about illegal online content or activities (though IMCO did not appear to embrace the concept of ‘staydown’, without which notice and takedown is relatively meaningless for most victims of online illegality; a further concern relates to the extent to which a notice imputes knowledge, which is also crucial to incentivise meaningful action);
  • a strict distinction to be made between ‘illegal’ content and ‘harmful’ content;
  • enhanced transparency obligations; and
  • a ‘Know Your Business Customer’ (KYBC) principle to be introduced, requiring platforms to run checks to prevent fake companies from selling illegal or unsafe products through the platform.

A third, non-legislative resolution by the Civil Liberties Committee (LIBE), which focused on fundamental rights issues, was also approved. These three resolutions appear to be a bit all over the map, but one can discern some trends. See here for further discussion of the reports.

Following these own-initiative reports and the leaked draft Impact Assessments, a leaked presentation from the European Commission to the European Council in early November has given an indication of what we can expect from the new legislative package in relation to platform liability.

Platform liability

One of the key questions surrounding the new legislative package is how it will affect platform liability and the current legal framework, which has gradually led to greater co-operation from intermediaries.

The current rules on intermediary liability are primarily set out in the E-Commerce Directive, adopted in 2000 in the context of a very different landscape and before online platforms had the enormous impact they have today, and in related case law of the Court of Justice of the European Union. One of the primary aims of the DSA from the outset has apparently been to expand and harmonise digital service providers’ responsibilities, including the extent to which online platforms are liable for the content their users make available, the products their users sell and the services their users offer. The key question, of course, is whether the proposal will include a new safe harbour for platforms which may have been considered ‘active’ under the current jurisprudence of the Court of Justice. The Commission is apparently concerned that the distinction between ‘active’ and ‘passive’ intermediaries is outdated, meaning there is a possibility that the active/passive dichotomy will be jettisoned and replaced with a new regime.

The leaked presentation shows that the Commission is also considering the implementation of a range of due diligence obligations, including many of those highlighted by the IMCO report mentioned above. These include:

  • the introduction of ‘notice-and-action’ mechanisms;
  • a KYBC principle; and
  • rules on transparency of content moderation and cooperation with authorities.

The due diligence obligations will apply to all service providers, not just the largest online platforms. However, asymmetric measures in the form of enhanced responsibilities for very large platforms are also being considered. This means that major online platforms, such as Google and Facebook, will likely be subject to additional responsibilities in view of their dominant market positions. The Digital Markets Act proposal, which is expected to be presented at the same time as the DSA, will also set out additional rules for these large ‘gatekeeper’ platforms. The DMA will be focused on preventing anti-competitive practices and may include prohibitions on preferential display/ranking, exclusive pre-installation, anti-steering measures, restrictions on side-loading, and the prevention of complaints.

The Commission also indicated in the leaked presentation that there will be a focus on strong co-operation between Member State authorities on systemic issues, with individual complaints being addressed through notice and action procedures, court injunctions, administrative orders and alternative dispute resolution procedures.

Whilst the new legislation may provide some welcome clarity on the responsibilities of online platforms, there are potential risks for the victims of online illegality who seek not only redress but also meaningful proactive measures from online platforms. It seems likely that the introduction of the DSA may actually create new liability safe harbours. The Commission is also considering the adoption of a European ‘Good Samaritan’ principle, despite having previously stated that “taking such voluntary proactive measures does not automatically lead to the hosting service provider concerned losing the benefit of the liability exemption provided for in Article 14 of Directive 2000/31/EC” (see the Commission’s Recommendation on measures to effectively tackle illegal content online (2018)). It is not at all clear that there is any need for a Good Samaritan principle in EU law, particularly in light of the powers which most online platforms already reserve for themselves pursuant to their terms of service.

Indeed, new safe harbours have the potential to erode incentives for intermediary co-operation. The current jurisprudence of the Court of Justice (see e.g. L’Oreal v. eBay) may have encouraged platforms to undertake voluntary measures, which they may be less inclined to do if they are certain of falling within a safe harbour.

The harmonisation of notice and action procedures at EU level could prove problematic to the extent that it is cumbersome, makes enforcement more difficult or undermines the existing liability regime. Indeed, without staydown, notice and takedown is relatively meaningless. And while the use of content recognition technologies by online platforms is widespread, the Commission now appears to view such technology with trepidation.